Transient Chaotic Dimensionality Expansion by Recurrent Networks
Main Authors:
Format: article
Language: EN
Published: American Physical Society, 2021
Online Access: https://doaj.org/article/d9b148f6487e4d21b90cabda7eb6ce92
Summary: Neurons in the brain communicate with spikes, which are discrete events in time and value. Functional network models often employ rate units that are continuously coupled by analog signals. Is there a qualitative difference implied by these two forms of signaling? We develop a unified mean-field theory for large random networks to show that first- and second-order statistics in rate and binary networks are in fact identical if rate neurons receive the right amount of noise. Their response to presented stimuli, however, can be radically different. We quantify these differences by studying how nearby state trajectories evolve over time, asking to what extent the dynamics is chaotic. Chaos in the two models is found to be qualitatively different. In binary networks, we find a network-size-dependent transition to chaos and a chaotic submanifold whose dimensionality expands stereotypically with time, while rate networks with matched statistics are nonchaotic. Dimensionality expansion in chaotic binary networks aids classification in reservoir computing, and optimal performance is reached within about a single activation per neuron, a fast mechanism for computation that we also demonstrate in spiking networks. A generalization of this mechanism extends to rate networks in their respective chaotic regimes.
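The chaos measure described in the summary, the divergence of nearby state trajectories, can be illustrated with a short simulation: run two copies of a random binary network that differ in a single flipped unit and track their normalized Hamming distance over time. This is a minimal sketch, not the authors' code; the network size, coupling strength, number of sweeps, and the asynchronous sign-update rule are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's implementation): probe
# sensitivity to a one-neuron perturbation in a random binary network
# by tracking the normalized Hamming distance between two trajectories.
import numpy as np

rng = np.random.default_rng(0)

N = 1000                     # network size (assumed)
g = 1.5                      # coupling strength (assumed)
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # random couplings
np.fill_diagonal(J, 0.0)     # no self-coupling

def sweep(state, order):
    """One asynchronous sweep: update units one by one in the given order."""
    for i in order:
        state[i] = 1 if J[i] @ state > 0 else -1

# Reference trajectory and a copy perturbed in a single unit.
x = rng.choice([-1, 1], size=N).astype(float)
y = x.copy()
y[0] *= -1                   # flip one neuron: the minimal perturbation

T = 20                       # sweeps, i.e. roughly activations per neuron
distance = []
for t in range(T):
    order = rng.permutation(N)       # identical update order for both copies
    sweep(x, order)
    sweep(y, order)
    distance.append(np.mean(x != y))  # normalized Hamming distance

print(distance)  # growth over sweeps signals chaotic divergence
```

Because both copies use the same update order in every sweep, any growth of the distance reflects intrinsic sensitivity to the initial flip rather than differing stochasticity between the runs.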