How complex patterns generated by neural systems are represented in the activity of individual neurons is a central problem in computational neuroscience and in the machine learning community. Here, using recurrent neural networks in the form of feedback reservoir computers, we identify the microscopic features that give rise to spatiotemporal patterns, including multicluster and chimera states. We show how both individual neural trajectories and whole-network activity distributions shape the emergence of particular regimes. In addition, we address the question of how the trained output weights contribute to the autonomous multidimensional dynamics.
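
To make the setup concrete, the following is a minimal sketch of a feedback reservoir computer of the kind described above: a fixed random recurrent network driven through an output-feedback loop, in which only the readout (output) weights are trained, here by ridge regression on a simple sine target. The reservoir size, spectral radius, discrete-time tanh update, and target signal are illustrative assumptions, not the exact model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_out = 300, 1                                 # reservoir size, output dimension (assumed)
W = rng.normal(0, 1 / np.sqrt(N), (N, N))         # fixed random recurrent weights
W *= 1.2 / np.max(np.abs(np.linalg.eigvals(W)))   # rescale to spectral radius ~1.2
W_fb = rng.uniform(-1, 1, (N, n_out))             # fixed output-feedback weights

dt, T_train = 0.1, 3000
t = np.arange(T_train) * dt
target = np.sin(2 * np.pi * 0.05 * t)[:, None]    # illustrative target pattern

# Teacher forcing: during training the target is fed back through the feedback loop.
x = np.zeros(N)
states = np.empty((T_train, N))
for k in range(T_train):
    x = np.tanh(W @ x + W_fb @ target[k])
    states[k] = x

# Train only the readout weights by ridge regression on the collected states.
ridge = 1e-4
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N),
                        states.T @ target).T

# Autonomous generation: close the loop with the trained readout in place of the target.
x, z = states[-1], target[-1]
autonomous = []
for _ in range(1000):
    x = np.tanh(W @ x + W_fb @ z)
    z = W_out @ x
    autonomous.append(z.copy())
autonomous = np.array(autonomous)
```

After training, the feedback loop is closed with the learned output weights, so the network generates the pattern autonomously; it is in this closed-loop regime that the roles of individual trajectories, activity distributions, and the trained readout can be examined.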
