Exploring Emergent Properties of Recurrent Neural Networks Using a Novel Energy Function Formalism

  • Conference paper
Machine Learning, Optimization, and Data Science (LOD 2023)

Abstract

The stability analysis of dynamical neural network systems typically involves finding a suitable Lyapunov function, as demonstrated in Hopfield’s famous paper on content-addressable memory networks. Another approach is to identify conditions that prevent divergent solutions. In this study, we focus on biological recurrent neural networks (bRNNs), specifically the Cohen-Grossberg networks that require transient external inputs. We propose a general method for constructing Lyapunov functions for recurrent neural networks using physically meaningful energy functions. This approach allows us to investigate the emergent properties of the recurrent network, such as the parameter configuration required for winner-take-all competition in a leaky accumulator design, which extends beyond the scope of standard stability analysis. Furthermore, our method aligns well with standard stability analysis (ordinary differential equation approach), as it encompasses the general stability constraints derived from the energy function formulation. We demonstrate that the Cohen-Grossberg Lyapunov function can be naturally derived from the energy function formalism. Importantly, this construction proves to be a valuable tool for predicting the behavior of actual biological networks in certain cases.
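The winner-take-all competition in a leaky accumulator mentioned above can be illustrated with a minimal simulation of leaky competing accumulator dynamics of the Cohen-Grossberg type. This is a sketch, not the authors' model: the function name, parameter values, and threshold-linear activation are illustrative assumptions, chosen so that lateral inhibition dominates the leak and a single winner emerges.

```python
import numpy as np

def simulate_lca(inputs, leak=0.2, inhibition=1.2, dt=0.01, steps=5000):
    """Euler integration of a leaky competing accumulator:
        dx_i/dt = -leak * x_i + I_i - inhibition * sum_{j != i} f(x_j)
    with threshold-linear activation f(x) = max(x, 0).
    (Illustrative sketch; parameters are assumptions, not from the paper.)"""
    I = np.asarray(inputs, dtype=float)
    x = np.zeros_like(I)
    for _ in range(steps):
        f = np.maximum(x, 0.0)                # firing rates
        lateral = inhibition * (f.sum() - f)  # off-surround inhibition from all other units
        x += dt * (-leak * x + I - lateral)
        x = np.maximum(x, 0.0)                # activations stay non-negative
    return x

# Inputs are nearly equal, yet because inhibition exceeds the leak the
# competition is unstable around the symmetric state: the unit with the
# largest input suppresses the others to zero (winner-take-all).
rates = simulate_lca([1.0, 0.9, 0.8])
```

At the surviving fixed point the losers are silent and the winner settles at I_max / leak; this is the kind of parameter-dependent emergent behavior that the energy-function analysis characterizes.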



Author information

Correspondence to Rakesh Sengupta.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Sengupta, R., Bapiraju, S., Pattanayak, A. (2024). Exploring Emergent Properties of Recurrent Neural Networks Using a Novel Energy Function Formalism. In: Nicosia, G., Ojha, V., La Malfa, E., La Malfa, G., Pardalos, P.M., Umeton, R. (eds) Machine Learning, Optimization, and Data Science. LOD 2023. Lecture Notes in Computer Science, vol 14505. Springer, Cham. https://doi.org/10.1007/978-3-031-53969-5_23

  • DOI: https://doi.org/10.1007/978-3-031-53969-5_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-53968-8

  • Online ISBN: 978-3-031-53969-5

  • eBook Packages: Computer Science (R0)
