We present an approach for data-driven prediction of high-dimensional chaotic time series generated by spatially extended systems. The algorithm employs a convolutional autoencoder for dimension reduction and feature extraction, combined with a probabilistic prediction scheme, consisting of a conditional random field, that operates in the feature space. The future evolution of the spatially extended system is predicted by iterating the one-step predictions in a feedback loop. The excellent performance of this method is illustrated and evaluated using Lorenz-96 systems and Kuramoto-Sivashinsky equations of different sizes, generating time series of different dimensionality and complexity.
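The sketch below illustrates the structure of such a pipeline: an encoder maps high-dimensional snapshots to a low-dimensional feature space, a one-step predictor advances the features, and a decoder maps the iterated features back to physical space. This is a minimal Python/Keras sketch, not the authors' implementation; the layer sizes, snapshot_dim, and latent_dim are illustrative assumptions, and the conditional random field of the paper is represented here only by a generic one-step map step_fn.

import numpy as np
from tensorflow.keras import layers, Model

snapshot_dim = 64   # spatial grid points per snapshot (assumed)
latent_dim = 8      # size of the reduced feature space (assumed)

# Encoder: convolutional dimension reduction to the feature space.
enc_inp = layers.Input(shape=(snapshot_dim, 1))
x = layers.Conv1D(16, 5, padding="same", activation="relu")(enc_inp)
x = layers.MaxPooling1D(2)(x)
x = layers.Conv1D(8, 5, padding="same", activation="relu")(x)
x = layers.MaxPooling1D(2)(x)
x = layers.Flatten()(x)
code = layers.Dense(latent_dim)(x)
encoder = Model(enc_inp, code, name="encoder")

# Decoder: reconstruct a snapshot from a feature vector.
dec_inp = layers.Input(shape=(latent_dim,))
x = layers.Dense((snapshot_dim // 4) * 8, activation="relu")(dec_inp)
x = layers.Reshape((snapshot_dim // 4, 8))(x)
x = layers.UpSampling1D(2)(x)
x = layers.Conv1D(16, 5, padding="same", activation="relu")(x)
x = layers.UpSampling1D(2)(x)
dec_out = layers.Conv1D(1, 5, padding="same")(x)
decoder = Model(dec_inp, dec_out, name="decoder")

# Autoencoder trained to reproduce snapshots through the feature space.
autoencoder = Model(enc_inp, decoder(encoder(enc_inp)), name="autoencoder")
autoencoder.compile(optimizer="adam", loss="mse")

def iterate_prediction(u0, encoder, decoder, step_fn, n_steps):
    """Feedback-loop prediction: encode the last observed snapshot,
    advance it repeatedly in the feature space with step_fn (a stand-in
    for the paper's conditional random field), and decode each predicted
    feature vector back to physical space."""
    z = encoder.predict(u0[None, :, None], verbose=0)
    preds = []
    for _ in range(n_steps):
        z = step_fn(z)                                   # one step in feature space
        preds.append(decoder.predict(z, verbose=0)[0, :, 0])
    return np.stack(preds)

# Example usage (untrained models, trivial feature-space map):
# U = np.random.randn(1000, snapshot_dim)                # training snapshots
# autoencoder.fit(U[..., None], U[..., None], epochs=10, verbose=0)
# forecast = iterate_prediction(U[-1], encoder, decoder, lambda z: z, 50)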
