The detection of anomalies or transitions in complex dynamical systems is of critical importance to various applications. In this study, we propose the use of machine learning to detect changepoints for high-dimensional dynamical systems. Here, changepoints indicate instances in time when the underlying dynamical system has a fundamentally different characteristic, which may be due to a change in the model parameters or due to intermittent phenomena arising from the same model. We propose two complementary approaches to achieve this: the first built on arguments from probabilistic unsupervised learning and the second on supervised deep learning. To accelerate the deployment of transition detection algorithms in high-dimensional dynamical systems, we introduce dimensionality reduction techniques. Our experiments demonstrate that transitions can be detected efficiently, in real time, for the two-dimensional forced Kolmogorov flow and the Rössler dynamical system, both of which are characterized by anomalous regimes in phase space where dynamics are perturbed off the attractor at potentially uneven intervals. Finally, using the Lorenz-63 dynamical system, we demonstrate how variations in the frequency of detected changepoints may be used to detect a significant modification to the underlying model parameters.
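To make the Lorenz-63 scenario concrete, the sketch below is a minimal, hypothetical illustration (not the authors' method): it integrates Lorenz-63 with a mid-stream change in the parameter ρ, then flags the changepoint with a simple rolling mean-shift score on the z-coordinate. The integrator, window size, and detector are all assumptions chosen for brevity; the paper's probabilistic and deep-learning detectors are far more general.

```python
import numpy as np

def lorenz63_trajectory(rho_schedule, dt=0.01, sigma=10.0, beta=8.0 / 3.0):
    """Integrate Lorenz-63 with RK4; rho may vary over time (illustrative only)."""
    def rhs(state, rho):
        x, y, z = state
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    n = len(rho_schedule)
    traj = np.empty((n, 3))
    traj[0] = (1.0, 1.0, 1.0)
    for t in range(n - 1):
        rho = rho_schedule[t]
        k1 = rhs(traj[t], rho)
        k2 = rhs(traj[t] + 0.5 * dt * k1, rho)
        k3 = rhs(traj[t] + 0.5 * dt * k2, rho)
        k4 = rhs(traj[t] + dt * k3, rho)
        traj[t + 1] = traj[t] + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return traj

def detect_changepoint(signal, window=200):
    """Return the index with the largest normalized shift in rolling mean.

    A deliberately simple stand-in for a proper online detector such as
    Bayesian online changepoint detection."""
    scores = np.zeros(len(signal))
    for t in range(window, len(signal) - window):
        left = signal[t - window:t]
        right = signal[t:t + window]
        pooled = np.std(np.concatenate([left, right])) + 1e-12
        scores[t] = abs(right.mean() - left.mean()) / pooled
    return int(np.argmax(scores))

n = 4000
# Parameter regime switches from rho = 28 to rho = 60 at step 2000.
rho = np.where(np.arange(n) < n // 2, 28.0, 60.0)
traj = lorenz63_trajectory(rho)

# Skip an initial transient before scoring; z carries the regime shift
# because its mean scales with rho on the attractor.
burn_in = 500
cp = burn_in + detect_changepoint(traj[burn_in:, 2])
```

Because the mean of z grows with ρ, even this crude mean-shift score localizes the parameter change to within a few hundred steps of the true switch; the detectors studied in the paper are designed to do this online and for far higher-dimensional states.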
