The use of deep learning has become increasingly popular in reduced-order models (ROMs) to obtain low-dimensional representations of full-order models. Convolutional autoencoders (CAEs) are often used to this end as they are adept at handling spatially distributed data, including solutions to partial differential equations. When applied to unsteady physics problems, ROMs also require a model for time-series prediction of the low-dimensional latent variables. Long short-term memory (LSTM) networks, a type of recurrent neural network useful for modeling sequential data, are frequently employed in data-driven ROMs for autoregressive time-series prediction. When predictions are made at unseen design points over long time horizons, error propagation is a frequently encountered issue: errors made early on can compound over time and lead to large inaccuracies. In this work, we propose using bagging, a commonly used ensemble learning technique, to develop a fully data-driven ROM framework, referred to as the CAE-eLSTM ROM, that uses CAEs for spatial reconstruction of the full-order model and LSTM ensembles for time-series prediction. Results for two unsteady fluid dynamics problems show that the presented framework effectively reduces error propagation and leads to more accurate time-series prediction of latent variables at unseen points.
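As a rough illustration of the kind of pipeline the abstract describes, the sketch below shows a minimal CAE that compresses 2D flow snapshots to latent vectors, an LSTM that forecasts the next latent state from a window of past states, bootstrap resampling (bagging) to train an ensemble of such LSTMs, and an autoregressive rollout that averages the members' one-step forecasts. This is not the authors' implementation; the snapshot size, latent dimension, hidden size, and all module and function names are illustrative assumptions.

```python
# Minimal sketch of a CAE + bagged-LSTM latent-dynamics ROM (illustrative only).
import torch
import torch.nn as nn

class CAE(nn.Module):
    """Convolutional autoencoder for spatial compression of 64x64 snapshots (assumed size)."""
    def __init__(self, latent_dim=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

class LatentLSTM(nn.Module):
    """One ensemble member: maps a window of latent states to the next latent state."""
    def __init__(self, latent_dim=8, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, latent_dim)

    def forward(self, window):                  # window: (batch, seq_len, latent_dim)
        out, _ = self.lstm(window)
        return self.head(out[:, -1])            # one-step-ahead latent prediction

def bootstrap_loaders(windows, targets, n_members, batch_size=32):
    """Bagging: each member trains on a bootstrap resample (drawn with replacement)."""
    loaders = []
    for _ in range(n_members):
        idx = torch.randint(0, len(windows), (len(windows),))
        ds = torch.utils.data.TensorDataset(windows[idx], targets[idx])
        loaders.append(torch.utils.data.DataLoader(ds, batch_size=batch_size, shuffle=True))
    return loaders

@torch.no_grad()
def rollout(ensemble, init_window, n_steps):
    """Autoregressive prediction: average the members' forecasts at every step."""
    window = init_window.clone()                # (1, seq_len, latent_dim)
    preds = []
    for _ in range(n_steps):
        step = torch.stack([m(window) for m in ensemble]).mean(dim=0)
        preds.append(step)
        window = torch.cat([window[:, 1:], step.unsqueeze(1)], dim=1)
    return torch.cat(preds)                     # (n_steps, latent_dim)
```

In this sketch, averaging the bootstrap-trained members at each autoregressive step is what damps the compounding of early prediction errors; the predicted latent trajectory is then passed through the CAE decoder to reconstruct the full-order fields.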
