Abstract
The stability analysis of dynamical neural network systems typically proceeds by finding a suitable Lyapunov function, as in Hopfield's classic work on content-addressable memory networks, or by identifying conditions that rule out divergent solutions. In this study, we focus on biological recurrent neural networks (bRNNs), specifically the Cohen-Grossberg networks that require transient external inputs, and propose a general method for constructing Lyapunov functions for recurrent neural networks from physically meaningful energy functions. This construction lets us investigate emergent properties of the recurrent network that lie beyond the reach of standard stability analysis, such as the parameter regime required for winner-take-all competition in a leaky accumulator design. It is also consistent with the standard ordinary differential equation approach to stability analysis, in that the general stability constraints derived there are subsumed by the energy function formulation, and we show that the Cohen-Grossberg Lyapunov function arises naturally from the energy function formalism. In certain cases, the construction further serves as a practical tool for predicting the behavior of actual biological networks.
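To make the winner-take-all behavior mentioned above concrete, the following is a minimal simulation sketch of a shunting on-center off-surround network of the Cohen-Grossberg type, using the faster-than-linear signal function f(x) = x², which is known to produce winner-take-all dynamics (Grossberg 1973). This is an illustration, not the paper's model: the parameter values A, B, the signal function, and the treatment of the transient external input (modeled simply as setting the initial activity pattern) are all assumptions chosen for the demo.

```python
import numpy as np

def cohen_grossberg_wta(x0, A=1.0, B=3.0, dt=0.001, t_total=20.0):
    """Shunting on-center off-surround network:

        dx_i/dt = -A*x_i + (B - x_i)*f(x_i) - x_i * sum_{j != i} f(x_j)

    with faster-than-linear f(x) = x**2, which yields winner-take-all:
    the unit with the largest initial activity converges to a positive
    equilibrium and all the others are quenched to zero.  The transient
    external input is modeled as setting the initial pattern x0; the
    values of A, B, dt are illustrative, not taken from the paper.
    """
    x = np.array(x0, dtype=float)
    for _ in range(int(t_total / dt)):
        f = x ** 2
        total = f.sum()
        # on-center self-excitation (shunted by B - x_i) minus
        # off-surround inhibition from all other units
        dx = -A * x + (B - x) * f - x * (total - f)
        x = np.maximum(x + dt * dx, 0.0)  # activities stay non-negative
    return x

# Initial pattern left by a transient input; unit 0 has the largest activity.
final = cohen_grossberg_wta([0.8, 0.5, 0.3])
# The winner settles at the root of (B - x) * x = A, here (3 + sqrt(5)) / 2;
# the losers are driven to zero.
```

The winner's equilibrium follows from setting dx_i/dt = 0 with all other units at zero, giving (B - x)x = A; with A = 1, B = 3 this is x = (3 + √5)/2 ≈ 2.618. With a linear or slower-than-linear signal function the same network instead stores or flattens the input pattern, which is why the parameter configuration matters for the competition.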
References
Usher, M., Cohen, J.D.: Short term memory and selection processes in a frontal-lobe model. In: Heinke, D., Humphreys, G.W., Olson, A. (eds.) Connectionist Models in Cognitive Neuroscience, pp. 78–91 (1999)
Bogacz, R., Usher, M., Zhang, J., McClelland, J.L.: Extending a biologically inspired model of choice: multi-alternatives, nonlinearity and value-based multidimensional choice. Philos. Trans. Roy. Soc. B: Biol. Sci. 362, 1655–1670 (2007)
Sengupta, R., Bapiraju, S., Melcher, D.: A visual sense of number emerges from the dynamics of a recurrent on-center off-surround neural network. Brain Res. 1582, 114–124 (2014)
Andreopoulos, A., Tsotsos, J.K.: 50 Years of object recognition: directions forward. Comput. Vis. Image Underst. 117(8), 827–891 (2013)
Grossberg, S.: Nonlinear neural networks: principles, mechanisms, and architectures. Neural Netw. 1(1), 17–61 (1988)
Cohen, M.A., Grossberg, S.: Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. IEEE Trans. Syst. Man Cybern. SMC-13, 815–826 (1983)
Zhang, H., Wang, Z., Liu, D.: A comprehensive review of stability analysis of continuous-time recurrent neural networks. IEEE Trans. Neural Netw. Learn. Syst. 25, 1229–1262 (2014)
Hopfield, J.J.: Neurons with graded response have collective computational properties like those of two-state neurons. Proc. Natl. Acad. Sci. 81, 3088–3092 (1984)
Durmaz, S., Altay Demirbag, S., Kaya, M.O.: Energy function approach to multiple coupled nonlinear oscillators. Acta Phys. Polonica-Ser. A Gener. Phys. 121, 47–49 (2012)
Sengupta, R., Bapiraju, S., Basu, P., Melcher, D.: Accounting for subjective time expansion based on a decision, rather than perceptual, mechanism. J. Vis. 14, 1150 (2014)
Knops, A., Piazza, M., Sengupta, R., Eger, E., Melcher, D.: A shared, flexible neural map architecture reflects capacity limits in both visual short term memory and enumeration. J. Neurosci. 34, 9857–9866 (2014)
Stanley, G.B.: Reading and writing the neural code. Nat. Neurosci. 16(3), 259–263 (2013)
Van Rullen, R., Thorpe, S.J.: Rate coding versus temporal order coding: what the retinal ganglion cells tell the visual cortex. Neural Comput. 13(6), 1255–1283 (2001)
Gautrais, J., Thorpe, S.: Rate coding versus temporal order coding: a theoretical approach. Biosystems 48(1), 57–65 (1998)
Masquelier, T.: Relative spike time coding and STDP-based orientation selectivity in the early visual system in natural continuous and saccadic vision: a computational model. J. Comput. Neurosci. 32(3), 425–441 (2012)
Mathewson, K.E., Gratton, G., Fabiani, M., Beck, D.M., Ro, T.: To see or not to see: prestimulus α phase predicts visual awareness. J. Neurosci. 29(9), 2725–2732 (2009)
Keil, J., Müller, N., Ihssen, N., Weisz, N.: On the variability of the McGurk effect: audiovisual integration depends on prestimulus brain states. Cereb. Cortex 22(1), 221–231 (2012)
May, E.S., Butz, M., Kahlbrock, N., Hoogenboom, N., Brenner, M., Schnitzler, A.: Pre- and post-stimulus alpha activity shows differential modulation with spatial attention during the processing of pain. Neuroimage 62(3), 1965–1974 (2012)
Weisz, N., et al.: Prestimulus oscillatory power and connectivity patterns predispose conscious somatosensory perception. Proc. Natl. Acad. Sci. 111(4), E417–E425 (2014)
Sengupta, R., Raja Shekar, P.V.: Oscillatory dynamics in complex recurrent neural networks. Biophys. Rev. Lett. 17(1), 75–85 (2022)
Hopfield, J.J.: Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. 79(8), 2554–2558 (1982)
Hopfield, J.J., Brody, C.D.: Pattern recognition computation using action potential timing for stimulus representation. Nature 376, 33–36 (1995)
Amari, S.-I.: Dynamics of pattern formation in lateral-inhibition type neural fields. Biol. Cybern. 27(2), 77–87 (1977)
Izhikevich, E.M.: Simple model of spiking neurons. IEEE Trans. Neural Netw. 14(6), 1569–1572 (2003)
Sengupta, R., Bapiraju, S., Melcher, D.: Big and small numbers: empirical support for a single, flexible mechanism for numerosity perception. Attent. Percept. Psychophys. 79, 253–266 (2017)
Faydasicok, O.: An improved Lyapunov functional with application to stability of Cohen-Grossberg neural networks of neutral-type with multiple delays. Neural Netw. 132, 532–539 (2020)
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Sengupta, R., Bapiraju, S., Pattanayak, A. (2024). Exploring Emergent Properties of Recurrent Neural Networks Using a Novel Energy Function Formalism. In: Nicosia, G., Ojha, V., La Malfa, E., La Malfa, G., Pardalos, P.M., Umeton, R. (eds) Machine Learning, Optimization, and Data Science. LOD 2023. Lecture Notes in Computer Science, vol 14505. Springer, Cham. https://doi.org/10.1007/978-3-031-53969-5_23
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-53968-8
Online ISBN: 978-3-031-53969-5