Efficient and hardware-friendly methods to implement competitive learning for spiking neural networks

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Spiking neural networks (SNNs) trained by spike-timing-dependent plasticity (STDP) are a promising computing paradigm for energy-efficient artificial intelligence systems. During STDP-based learning, two additional bio-inspired mechanisms, lateral inhibition and homeostasis, are usually implemented to achieve competitive learning. However, previous implementations of lateral inhibition and homeostasis were not designed with hardware in mind, and the resulting solutions are inefficient to deploy on neuromorphic hardware. For example, existing lateral inhibition methods require a number of inhibitory connections proportional to the square of the number of learning neurons, and classical homeostasis methods depend on fine-tuning the membrane threshold, for which no hardware solution is provided. In this paper, we propose two hardware-friendly and scalable methods to achieve lateral inhibition and homeostasis. Using only one inhibitory neuron per learning layer, our lateral inhibition method reduces the number of inhibitory connections from \(N^2\) to \(2N\) and lowers hardware overhead by sharing refractory control circuits. Utilizing the adaptive resistance of memristors, we propose a novel homeostasis method that adapts the leak current of spiking neurons. In addition, the learning efficiency of different homeostasis methods is studied for the first time by simulation on the cognitive task of digit recognition on the MNIST dataset. Simulation results show that our proposed homeostasis method improves learning efficiency by 30–50% while maintaining state-of-the-art performance.
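To make the scaling argument concrete, the sketch below is a minimal, purely illustrative Python model (not the authors' circuit or code; all names, constants, and the single-winner simplification are assumptions). It shows a learning layer in which one shared inhibitory neuron enforces competition through a common refractory signal, so only \(2N\) inhibitory connections are needed instead of roughly \(N^2\) all-to-all connections, and homeostasis is realized by adapting each neuron's leak current rather than its firing threshold.

# Illustrative sketch only (NOT the authors' implementation) of the two mechanisms
# described in the abstract:
#   1) lateral inhibition via a single shared inhibitory neuron: N excitatory-to-
#      inhibitory plus N inhibitory-to-excitatory connections (2N in total) instead
#      of ~N^2 all-to-all inhibitory connections;
#   2) homeostasis by adapting each neuron's leak current instead of its threshold.
# All constants and names below are assumptions chosen for readability.

import random

N_INPUT = 64            # input spike lines per time step (e.g. pixels)
N_NEURONS = 10          # learning (excitatory) neurons
V_THRESH = 1.0          # fixed firing threshold (not adapted)
LEAK_STEP = 0.02        # leak increase after each output spike (homeostasis)
REFRACTORY = 3          # refractory period (time steps) imposed by the inhibitory neuron

weights = [[random.random() * 0.05 for _ in range(N_INPUT)] for _ in range(N_NEURONS)]
v = [0.0] * N_NEURONS               # membrane potentials
leak = [0.01] * N_NEURONS           # per-neuron adaptive leak current
refractory_left = [0] * N_NEURONS   # shared refractory control driven by the inhibitory neuron


def step(input_spikes):
    """Advance the layer by one time step; return the index of the firing neuron, or None."""
    winner = None
    for i in range(N_NEURONS):
        if refractory_left[i] > 0:          # silenced by the shared inhibitory neuron
            refractory_left[i] -= 1
            continue
        # integrate weighted input spikes, then apply the (adaptive) leak
        v[i] += sum(w for w, s in zip(weights[i], input_spikes) if s)
        v[i] = max(0.0, v[i] - leak[i])
        # for simplicity the first neuron to cross threshold is treated as the winner
        if v[i] >= V_THRESH and winner is None:
            winner = i
    if winner is not None:
        # The single inhibitory neuron fires: every learning neuron is reset and
        # enters the shared refractory state, implementing winner-take-all
        # competition with only 2N connections.
        for i in range(N_NEURONS):
            v[i] = 0.0
            refractory_left[i] = REFRACTORY
        # Homeostasis: a frequently winning neuron accumulates a larger leak,
        # so its effective excitability drops and other neurons can win later.
        leak[winner] += LEAK_STEP
    return winner

In this sketch, the growing per-neuron leak plays the role the paper assigns to the memristor's adaptive resistance: it regulates how often each neuron wins without touching the firing threshold, which is what makes the scheme attractive for a hardware realization.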



Acknowledgements

This research was funded by the National Key R&D Program of China [Grant No. 2018YFB2202603] and in part by the National Natural Science Foundation of China [Grant Nos. 61802427 and 61832018].

Author information


Corresponding author

Correspondence to Lianhua Qu.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Qu, L., Zhao, Z., Wang, L. et al. Efficient and hardware-friendly methods to implement competitive learning for spiking neural networks. Neural Comput & Applic 32, 13479–13490 (2020). https://doi.org/10.1007/s00521-020-04755-4

