Many approaches developed to express the potential energy of large systems exploit the locality of atomic interactions. A prominent example is the family of fragmentation methods, in which quantum chemical calculations are carried out for small, overlapping fragments of a given molecule and the results are then combined in a second step to yield the system’s total energy. Here we compare the accuracy of the systematic molecular fragmentation approach with the performance of high-dimensional neural network (HDNN) potentials introduced by Behler and Parrinello. HDNN potentials are similar in spirit to the fragmentation approach in that the total energy is constructed as a sum of environment-dependent atomic energies, which are derived indirectly from electronic structure calculations. As a benchmark set, we use all-trans alkanes containing up to eleven carbon atoms at the coupled cluster level of theory. These molecules were chosen because they allow reliable reference energies to be extrapolated for very long chains, enabling an assessment of the energies obtained by both methods for alkanes containing up to 10 000 carbon atoms. We find that both methods predict high-quality energies, with the HDNN potentials yielding smaller errors with respect to the coupled cluster reference.
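
In schematic form, the two energy decompositions compared in this work can be summarized as follows; this is a minimal sketch in generic notation, where the symbols $\mathbf{G}_i$, $F_f$, and $c_f$ are illustrative placeholders rather than the notation of the original papers:

$$ E_{\mathrm{HDNN}} = \sum_{i=1}^{N_{\mathrm{atoms}}} E_i\!\left(\mathbf{G}_i\right), \qquad E_{\mathrm{frag}} \approx \sum_{f} c_f \, E^{\mathrm{QC}}\!\left(F_f\right), $$

where $E_i$ is the output of the atomic neural network for atom $i$, $\mathbf{G}_i$ is the vector of symmetry-function values describing its local chemical environment, $F_f$ denotes the small overlapping fragments, and the coefficients $c_f$ are fixed by the fragmentation scheme so that atoms shared between fragments are not double counted.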

