In oil and gas exploration, accurately predicting subsurface fluid types is crucial. Traditional techniques such as core sampling, X-ray diffraction, and X-ray fluorescence, despite providing essential data, are hampered by high costs, long turnaround times, or limited applicability. This paper introduces an interpretable spatiotemporal deep learning network, ISTNet, that predicts fluid types from well log data. The framework improves prediction accuracy and model robustness through a dual-branch design that integrates spatial and temporal branches. The spatial branch employs graph neural networks to capture spatial features of well log data, while the temporal branch analyzes time series features using bidirectional long short-term memory networks (BiLSTM). Additionally, ISTNet incorporates the SHapley Additive exPlanations (SHAP) model to augment the interpretability of predictions. Empirical studies in the Tarim Basin demonstrated that ISTNet outperforms seven other advanced models, achieving an average accuracy exceeding 97% on datasets from two distinct wells. ISTNet not only improves the accuracy and robustness of fluid predictions in oil and gas exploration but also enhances transparency and interpretability through the SHAP model, providing geologists and engineers with tools to deeply understand subsurface geological processes and refine exploration and development strategies.
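The dual-branch idea can be sketched numerically. The toy below is an illustrative NumPy sketch, not the authors' implementation: the single GCN-style layer, the tanh recurrence standing in for a BiLSTM, and all shapes and random weights are assumptions introduced only to show how spatial and temporal features might be fused per depth sample.

```python
import numpy as np

def graph_conv(X, A, W):
    """One GCN-style layer: symmetric-normalized adjacency aggregation,
    linear map, then ReLU. X: (n_samples, n_logs), A: (n_samples, n_samples)."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)

def bidirectional_rnn(X, Wf, Wb):
    """Simplified bidirectional recurrence (tanh RNN as a BiLSTM stand-in),
    run over the depth axis in both directions and concatenated."""
    T, H = X.shape[0], Wf.shape[1]
    h_f, h_b = np.zeros(H), np.zeros(H)
    out_f, out_b = np.zeros((T, H)), np.zeros((T, H))
    for t in range(T):                             # forward sweep (shallow -> deep)
        h_f = np.tanh(X[t] @ Wf + h_f)
        out_f[t] = h_f
    for t in reversed(range(T)):                   # backward sweep (deep -> shallow)
        h_b = np.tanh(X[t] @ Wb + h_b)
        out_b[t] = h_b
    return np.concatenate([out_f, out_b], axis=1)

rng = np.random.default_rng(0)
T, F, H = 8, 5, 4                                  # depth samples, log curves, hidden size
X = rng.normal(size=(T, F))                        # well-log feature matrix
A = (rng.random((T, T)) > 0.7).astype(float)
A = np.maximum(A, A.T)                             # symmetric adjacency between samples

spatial = graph_conv(X, A, rng.normal(size=(F, H)))           # (T, H)
temporal = bidirectional_rnn(X, rng.normal(size=(F, H)),
                             rng.normal(size=(F, H)))          # (T, 2H)
fused = np.concatenate([spatial, temporal], axis=1)            # dual-branch fusion
print(fused.shape)                                 # (8, 12)
```

In the full model each branch would be a trained network and the fused features would feed a classifier over fluid types; the sketch only demonstrates the per-sample fusion of graph-derived and sequence-derived features.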
