As the most important solvent, water has been at the center of interest since the advent of computer simulations. While early molecular dynamics and Monte Carlo simulations had to make use of simple model potentials to describe the atomic interactions, accurate ab initio molecular dynamics simulations relying on the first-principles calculation of the energies and forces have opened the way to predictive simulations of aqueous systems. Still, these simulations are very demanding, which prevents the study of complex systems and their properties. Modern machine learning potentials (MLPs) have now reached a mature state, allowing us to overcome these limitations by combining the high accuracy of electronic structure calculations with the efficiency of empirical force fields. In this Perspective, we give a concise overview of the progress made in the simulation of water and aqueous systems employing MLPs, starting from early work on free molecules and clusters via bulk liquid water to electrolyte solutions and solid–liquid interfaces.

A large fraction of the surface of the Earth is covered by water and, still, some ice, giving our planet its distinctive blue color when viewed from space. Water is carried down deep into the Earth’s crust at subduction zones, influencing volcanism and plate tectonics, and in the atmosphere, in the form of vapor, liquid, or ice, water is a key climate factor from the troposphere up to the stratosphere and mesosphere. Down at the Earth’s surface, water shapes landscapes, provides the basis for life, and is central to many technologies that sustain humanity. Given its significance and abundance, it is no surprise that over the centuries, much research has been undertaken to understand the properties of water and their physical origin.

One of the central scientific questions addressed in water research is how the complex behavior of water, exhibiting many anomalies and a rich phase diagram, arises from the interactions of the chemically rather simple H2O molecules. Due to the limited temporal and spatial resolution of many experimental probes, much of what we know about water has been learned from computer simulations. Specifically, atomistic simulations have provided detailed insights into the directed network of hydrogen bonds between molecules that governs the structure and dynamics of water and its interaction with solutes and surfaces.1–4 Moreover, computer simulations have made it possible to investigate water under extreme conditions that are not accessible in experiments. For instance, simulations have been used to study water and ice under pressure and temperature conditions prevailing in the deep Earth5 and in the interiors of the giant planets Uranus and Neptune,6 as well as in the deeply supercooled state, the so-called no-man’s land, where crystallization occurs extremely quickly.7,8

Following the pioneering Monte Carlo (MC) simulations of Barker and Watts9 and molecular dynamics (MD) studies of Rahman and Stillinger1 in the late 1960s and early 1970s, respectively, many computer simulations of water and aqueous systems were carried out. Initially, these simulations were based on empirical potentials,10 but, later, they relied increasingly on forces and energies obtained from electronic structure calculations.11 In the empirical potential—or force field—approach, the functional form of the interaction potential is constructed to capture the main physical interactions between molecules, with parameters adapted to reproduce some experimentally known quantities and/or quantum mechanical reference data.

Since the first water models for MD simulations were proposed more than half a century ago,12–14 a vast number of empirical potentials have been developed for water and ice, ranging from simple forms based on pair interactions to sophisticated many-body potentials, including polarization and charge transfer.10,15–21 Despite their often simple functional form, these models have been remarkably successful in capturing the key properties of water across the phase diagram.22,23 Modeling chemical reactivity, however, has proven difficult using empirical potentials, and with a few exceptions,24–27 empirical water models are usually non-reactive. That is, they lack the capability to represent the dissociation and recombination of water molecules, which is, for example, essential for describing the famous Grotthuss mechanism of proton transport.28 The inclusion of proton transfer events is not only central for simulations of acids and bases but is also of utmost importance in countless chemical reactions with water involved as a reactant, from the hydrolysis of biomolecules to electrochemical water splitting. Nevertheless, to this day, empirical force fields have been used for the vast majority of simulations involving water, particularly in studies of biological macromolecules, where aqueous solvation effects are of crucial importance.18

A more fundamental route to the computer simulation of water relies on determining interactions from first principles, i.e., ab initio by solving the electronic Schrödinger equation. The first ab initio molecular dynamics (AIMD) simulations of liquid water were carried out 30 years ago based on density functional theory (DFT). Since then, ab initio methods have been used extensively to study water11,29 and, more generally, aqueous systems.30 While, in principle, ab initio approaches have the potential for truly predictive simulations and also provide access to electronic properties, currently available approximate methods are still somewhat limited.31 For instance, predicting the equilibrium density of liquid water has proven difficult within DFT based on standard exchange-correlation functionals, and only the inclusion of dispersion forces produces satisfactory results.30,32 Moreover, some attempts to study water using methods beyond DFT have been made, e.g., using MP2-based simulations33 or employing the random phase approximation (RPA).34

Compared to empirical force fields, ab initio approaches are computationally more expensive by many orders of magnitude, severely limiting the accessible system size and simulation times. Hence, many processes of interest occurring in aqueous systems, for instance, freezing, glassy dynamics, the solvation of complex interfaces and biopolymers, as well as complex chemical reactions, are far beyond the capabilities of current ab initio simulations. Therefore, in order to transfer the reliability of ab initio methods to more complex systems, in the past two decades, considerable effort has been made to develop efficient but accurate potential energy surfaces (PESs) based on systematic and flexible functional forms.35–38 In particular, the growing use of machine learning techniques, such as artificial neural networks (ANNs) and Gaussian processes, for the construction of highly accurate and efficient machine learning potentials (MLPs) has revolutionized—and continues to revolutionize—the field of atomistic computer simulations.39–53 These methodological advances in modern MLPs make it now possible to predict even complex properties of condensed systems from first principles, opening up exciting new possibilities in chemistry, materials science, and related disciplines.

Consequently, in recent years, atomistic modeling based on MLPs has also been increasingly applied to study water and, more generally, aqueous systems. The field has been evolving rapidly, in terms of both the underlying methodology and the complexity of systems that can be addressed, from small water clusters in vacuum via bulk water and its phase diagram to solution chemistry and processes at solid–liquid interfaces (cf. Fig. 1).

FIG. 1.

Schematic overview of some important applications of MLPs to aqueous systems with increasing complexity (from bottom to top): neutral and protonated water clusters, liquid water and its interface with ice, liquid water–vapor interface, electrolyte solutions, and solid–water interfaces.


In this Perspective, we first provide a brief overview of the methodological basis and the current status of MLPs. In doing so, we focus on those types of MLPs that have been most frequently employed in simulations of systems involving water. In the subsequent sections, we discuss different applications of MLPs to aqueous systems. While it is not the goal of this Perspective to provide a comprehensive review covering all MLP-based studies of water and related systems exhaustively, we discuss a broad range of representative applications and point out future research directions to demonstrate the power and versatility of MLPs for the study of water and aqueous systems.

In recent years, machine learning potentials have become an increasingly important tool for atomistic simulations of complex systems in chemistry, physics, and materials science. As a consequence, the development of MLPs is a very active topic of research, and here, we will restrict our discussion to a concise overview of the current status of the field. Readers interested in further details are referred to the numerous reviews covering all aspects of the methodology of MLPs.39–54

MLPs offer many advantages, such as excellent numerical agreement with the underlying electronic structure reference method, with typical energy errors of only about 1 meV/atom and force errors on the order of 100 meV/Å. These errors are significantly smaller than the uncertainty arising, e.g., from the choice of the exchange-correlation functional in DFT, and thus, replacing electronic structure calculations with MLPs only marginally affects the accuracy of the simulations. Moreover, MLPs can describe the making and breaking of chemical bonds and offer high computational efficiency, enabling simulations of systems containing many thousands of atoms. Still, they are usually about one to two orders of magnitude slower to evaluate than simple classical force fields. A main advantage of MLPs is their ability to transfer the predictive capabilities of ab initio approaches to large systems, which are required for aqueous systems to (1) achieve proper sampling of the liquid, (2) ensure bulk-like water properties at greater distances from solid surfaces, and (3) prevent artificially high electrolyte concentrations and/or artificial periodicity that would be present in the relatively small simulation cells directly accessible to electronic structure calculations.

MLPs were first introduced more than a quarter of a century ago by Blank et al.,55 who suggested using a feed-forward neural network (NN) to represent the interactions between diatomic molecules and solid surfaces. This first generation of MLPs, further explored by numerous groups over about a decade for different types of systems, was restricted to a few atomic degrees of freedom only, limiting its applicability to small molecules in vacuum or small molecules interacting with frozen surfaces, as summarized in some early reviews.39,40 The major obstacle to extending this method to high-dimensional condensed systems, such as liquid water, was the lack of suitable structural input descriptors for the machine learning algorithms that ensure the imperative translational, rotational, and permutational invariances of the potential energy surface (PES). Only for some applications could system-specific approximate solutions be derived by neglecting less important degrees of freedom of the system.56–58 In parallel to these efforts in the development of early neural network-based MLPs, in pioneering work, Braams and Bowman35,59 introduced permutation invariant polynomials (PIPs), which are closely related to these methods and enable the construction of very accurate PESs by linear regression based on symmetrized polynomials as basis functions. While PIPs do not employ traditional machine learning algorithms, such as neural networks, they show a similar flexibility and include all invariances exactly, which had been a frustrating challenge for early MLPs employing non-linear models. Still, PIPs share with early MLPs the restriction to small systems with a very limited number of degrees of freedom. Over the years, PIPs have enabled the construction of very accurate potentials and have been applied successfully, for example, to vibrations and reaction dynamics of small molecules in vacuum as well as a variety of water clusters.35

MLPs became generally applicable to high-dimensional systems containing thousands of atoms in 2007, when Behler and Parrinello introduced high-dimensional neural network potentials (HDNNPs).60–63 The key step, which paved the way for studying condensed systems such as liquid water with MLPs, is the construction of the potential energy E as a sum of environment-dependent local atomic energies Ei,
E = Σ_{i=1}^{Natom} Ei,    (1)
where Natom is the total number of atoms in the system. This general form of the total energy expression is shared by HDNNPs and many other second-generation MLPs proposed in the following years.

The energetically relevant local environment determining the Ei is defined by a cutoff radius Rc such that all interactions beyond this radius, which is typically chosen between 5 and 10 Å, are not explicitly included. While the ansatz of Eq. (1) has been used in many empirical potentials for a long time, second-generation MLPs became possible only through the development of many-body descriptors with full translational, rotational, and permutational invariance. In the case of HDNNPs, atom-centered symmetry functions (ACSFs)64 are most frequently used for this purpose, but at present, a wide range of alternative descriptors is available and employed in different types of MLPs.65–68 All these descriptors represent structural fingerprints of the local atomic environments and serve as input for the machine learning algorithms, which then construct the functional relation between the atomic environments and the atomic energies.
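To make the descriptor idea concrete, the following minimal sketch implements a single radial ACSF with the commonly used cosine cutoff function. The parameter values (η, Rs, Rc) are arbitrary illustrative choices, not those of any published potential:

```python
import math

def f_cutoff(r, rc):
    """Cosine cutoff: decays smoothly to zero at rc and vanishes beyond it."""
    if r >= rc:
        return 0.0
    return 0.5 * (math.cos(math.pi * r / rc) + 1.0)

def radial_acsf(i, coords, eta, rs, rc):
    """One radial symmetry function of atom i:
    G_i = sum_j exp(-eta * (R_ij - rs)^2) * f_c(R_ij),
    summed over all neighbors j inside the cutoff sphere."""
    xi = coords[i]
    g = 0.0
    for j, xj in enumerate(coords):
        if j == i:
            continue
        rij = math.dist(xi, xj)
        g += math.exp(-eta * (rij - rs) ** 2) * f_cutoff(rij, rc)
    return g
```

Because the descriptor is a sum over interatomic distances within the cutoff, it is invariant under translation, rotation, and the exchange of like atoms, which is exactly the set of invariances required of the PES.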

In the case of HDNNPs, which are often used for simulations of water and aqueous systems, there is one feed-forward NN to be parameterized per element, which is then evaluated as many times as atoms of the respective element are present in the system. Closely related is ANI (short for ANAKIN-ME, Accurate NeurAl networK engINe for Molecular Energies), a HDNNP with modified angular ACSFs69 aiming for transferability across a wide range of organic molecules. Another NN-based MLP that is very frequently used for simulations of water is Deep Potential Molecular Dynamics (DeePMD),70–72 which employs a local atomic coordinate system and descriptors in this reference frame as input for the atomic NNs. Many other second-generation MLPs containing NNs have been proposed,73,74 and in recent years, NN potentials that learn descriptors as part of the training process using message passing75 have also been put forward.76–82 Beyond neural networks, Gaussian approximation potentials (GAPs)83 combined with the SOAP (Smooth Overlap of Atomic Positions) descriptor65 are among the most frequently used MLPs, with a few applications to aqueous systems reported to date. Many other second-generation MLPs are available in the literature and can be expected to be used for systems containing water in the future.68,84,85
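The element-wise architecture of a second-generation HDNNP can be sketched in a few lines. Here, the network weights are random placeholders rather than trained parameters, so the resulting energies are meaningless, but the evaluation scheme, one small feed-forward NN per element applied once per atom and summed according to Eq. (1), is the one described above:

```python
import math
import random

random.seed(0)

def make_ffnn(n_in, n_hidden):
    """Random weights for a tiny one-hidden-layer feed-forward NN
    (placeholders standing in for trained parameters)."""
    w1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
    b1 = [random.uniform(-1, 1) for _ in range(n_hidden)]
    w2 = [random.uniform(-1, 1) for _ in range(n_hidden)]
    b2 = random.uniform(-1, 1)
    return w1, b1, w2, b2

def atomic_energy(desc, net):
    """Evaluate one atomic NN on the descriptor vector of a single atom."""
    w1, b1, w2, b2 = net
    hidden = [math.tanh(sum(w * d for w, d in zip(row, desc)) + b)
              for row, b in zip(w1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2

def total_energy(elements, descriptors, nets):
    """E = sum_i E_i, with one shared network per element [Eq. (1)]."""
    return sum(atomic_energy(d, nets[el]) for el, d in zip(elements, descriptors))
```

Since the total energy is a plain sum of atomic contributions, exchanging two atoms of the same element leaves it unchanged, and the same per-element networks can be reused for systems of any size.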

An obvious limitation of second-generation MLPs is the truncation of atomic interactions at the cutoff radius. However, in many aqueous systems, long-range electrostatic interactions play an important role.86 These are explicitly considered in third-generation MLPs, which include electrostatic interactions employing environment-dependent charges represented by machine learning models. Already in 2007, Popelier and co-workers showed that it is possible to construct electrostatic multipoles using neural networks87 and Gaussian processes88 to improve the description of electrostatics in classical force fields, and, in addition, applications to water clusters have been reported.89 

In 2011, HDNNPs of the third generation were proposed by introducing a second set of atomic neural networks providing atomic partial charges trained to DFT reference data.90,91 From these charges, the electrostatic energy can be computed and combined with the short-range expression of Eq. (1) to yield the total energy of the system. By training the short-range part to represent only the energy component not covered by electrostatics, double counting of energy contributions can be avoided. Further MLPs, including long-range electrostatics, are, for example, the HDNNP TensorMol,92 the message passing network PhysNet,93 and many others.94–98 At present, the machine learning representation of atomic partial charges and electrostatic multipoles is a very active field of research, opening many routes to the construction of third-generation MLPs.
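The separation of energy contributions can be illustrated with a short sketch: given ML-predicted point charges, the electrostatic energy is computed explicitly, and the short-range model is trained on the remainder of the reference energy. This is a gas-phase toy version (no Ewald summation and no screening of the Coulomb interaction at short distances), and all numerical values are placeholders:

```python
import math

COULOMB = 14.3996  # e^2 / (4*pi*eps0) in eV*Angstrom

def electrostatic_energy(charges, coords):
    """Pairwise point-charge Coulomb energy of a finite (non-periodic) system."""
    e = 0.0
    n = len(charges)
    for i in range(n):
        for j in range(i + 1, n):
            rij = math.dist(coords[i], coords[j])
            e += COULOMB * charges[i] * charges[j] / rij
    return e

def short_range_target(e_ref, charges, coords):
    """Residual energy the short-range networks are trained on, so that
    E_total = E_short + E_elec reproduces E_ref without double counting."""
    return e_ref - electrostatic_energy(charges, coords)
```

By construction, adding the electrostatic energy back to the short-range target recovers the reference energy exactly, which is the sense in which double counting is avoided.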

A remaining limitation of third-generation MLPs is the locality of the atomic charges, which does not allow us to describe systems exhibiting long-range charge transfer and other non-local dependencies between the geometric and electronic structures.52 These phenomena can be considered in fourth-generation MLPs. The first MLP of this generation was the charge equilibration neural network (CENT) technique proposed by Ghasemi et al. in 2015.99 Since the introduction of CENT, which employs a global charge equilibration step100 and is intended for applications to ionic materials, several other fourth-generation MLPs have emerged, such as Becke population neural networks (BpopNNs),101 fourth-generation HDNNPs (4G-HDNNPs),102 and charge recursive NNs (QRNNs).103 To date, fourth-generation MLPs have not been extensively applied to water but offer new interesting possibilities for studies of complex systems.
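The global charge equilibration step underlying CENT-type fourth-generation MLPs amounts to minimizing a quadratic energy in the atomic charges under a total-charge constraint, which leads to a linear system. The sketch below solves this system for fixed electronegativities and hardnesses; in an actual fourth-generation potential, the electronegativities would be environment-dependent outputs of atomic NNs, and the Coulomb interaction would be screened at short range:

```python
import math

COULOMB = 14.3996  # e^2 / (4*pi*eps0) in eV*Angstrom

def solve_linear(a, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def equilibrate_charges(chi, hardness, coords, q_total=0.0):
    """Minimize E(q) = sum_i chi_i*q_i + 1/2 sum_i hardness_i*q_i^2
    + sum_{i<j} J_ij*q_i*q_j subject to sum_i q_i = q_total.
    The constraint enters via a Lagrange multiplier in the last row/column."""
    n = len(chi)
    a = [[0.0] * (n + 1) for _ in range(n + 1)]
    b = [0.0] * (n + 1)
    for i in range(n):
        a[i][i] = hardness[i]
        for j in range(n):
            if j != i:
                a[i][j] = COULOMB / math.dist(coords[i], coords[j])
        a[i][n] = 1.0
        a[n][i] = 1.0
        b[i] = -chi[i]
    b[n] = q_total
    return solve_linear(a, b)[:n]
```

For two atoms, the solution reduces to the closed form q = (χ2 − χ1)/(η1 + η2 − 2J12), with charge flowing toward the more electronegative atom, which the sketch reproduces.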

A key aspect in the training of MLPs54 is the construction of suitable datasets covering the structures that are visited in the intended simulations, as MLPs often show a strongly reduced accuracy when extrapolating beyond the known part of configuration space. The size and composition of these datasets depend on the systems of interest, but due to the high flexibility of MLPs, often energies and forces of 10 000 or more electronic structure calculations are required for training reliable potentials. For a systematic and unbiased determination of these structures, often various forms of active learning are employed.104–107 
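A widely used active-learning criterion is query by committee: several MLPs are trained on the same dataset, and structures on which their predictions disagree are selected for new reference calculations. A minimal sketch, with hypothetical prediction values and an arbitrary threshold:

```python
import statistics

def select_by_committee(candidate_ids, predictions, threshold):
    """Query by committee: flag candidate structures whose ensemble of
    model predictions (e.g., energies or force components) spreads by
    more than `threshold`; these are sent to new reference calculations."""
    selected = []
    for cid in candidate_ids:
        spread = statistics.stdev(predictions[cid])
        if spread > threshold:
            selected.append(cid)
    return selected
```

Structures with a small committee spread lie in the well-sampled part of configuration space, whereas a large spread signals extrapolation and, hence, a structure worth adding to the training set.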

Already during the advent of MLPs, water clusters received considerable attention as important benchmark systems. Even for small water clusters, there is a large structural variety with many energetically close local minima, posing a significant challenge for potential development. At the same time, their moderate size allows us to perform accurate high-level electronic structure reference calculations.

Early first-generation MLPs for water clusters include an MP2-based six-dimensional PES for the water dimer with frozen monomer geometries reported in 1997 that made use of a single feed-forward NN.108 In 2006, a very accurate three-dimensional NN potential for the water monomer with a root mean squared error (RMSE) of only 1 cm−1 (about 0.1 meV for the molecular potential energy) was published,109 and, in addition, a NN potential for the same system focusing on the permutation invariance of the PES was reported in 2012.110 From 2005 onward, water clusters have also been investigated in great detail using permutation invariant polynomials reaching MP2 and coupled cluster accuracy.111–122

To address larger water clusters, second-generation HDNNPs have been developed based on DFT data for a series of neutral clusters up to the decamer123,124 as well as for several protonated clusters.125 HDNNPs have also been applied to very large clusters containing hundreds of molecules.126 Moreover, nuclear quantum effects (NQEs) in neutral and protonated water clusters have been studied in recent years using very accurate HDNNPs trained to coupled cluster data.104,127–131

Starting in 2009, NNs were employed using water clusters as a test bed to find ways of improving the description of Coulomb interactions in classical force fields by learning environment-dependent electrostatic multipole moments.132 A comparison of NNs and Gaussian process regression (GPR) for the representation of multipoles found these methods to be similar in terms of accuracy and costs,89 resulting in an extension of the work to a water molecule embedded in a water decamer.133 Finally, this approach has been developed further to the FFLUX water model and applied to larger water clusters.134 In related work, GPR has been employed in a similar way to express charges in water molecules.95 

The first third-generation MLP expressing the full energy of the system, i.e., electrostatics and short-range bonding, by machine learning was a HDNNP for the water dimer reported in 2012,91 which includes electrostatic interactions based on environment-dependent atomic partial charges expressed by a second set of atomic NNs. Another example in which electrostatics, as well as van der Waals interactions, were implemented on top of a short-range HDNNP for water clusters is TensorMol, published in 2018.92 Moreover, small clusters have been used as a benchmark for the inclusion of long-range interactions in DeePMD.97

Finally, machine learning was used to learn tensorial properties using water clusters as a benchmark,135,136 and, in addition, the effect of noise on training HDNNPs for possible future applications on quantum computers has been tested for this system.137 Beyond NNs and GPR, Support Vector Regression (SVR) has also been explored for water clusters.138 Although not directly used in the development of high-dimensional PESs, the SVR method, as well as random forest and Gaussian regression, have also been used in specific applications, such as the prediction of the electron correlation energy in the water monomer and dimer.139 Beyond atomistic potentials, machine learning methods have also been used in various forms in combination with electronic structure concepts and methods, such as coupled cluster theory with single, double, and perturbative triple excitations [CCSD(T)] and variational quantum Monte Carlo calculations, with benchmarks for monomers and small clusters.140–146

Of all aqueous systems, the construction of MLPs for bulk liquid water has received the most attention, not only due to its crucial importance as a solvent for a wide range of (bio)chemical reactions but also because of its remarkable properties, which make water one of the most challenging benchmarks for the construction of interatomic potentials for molecular systems.

Due to the limited dimensionality that could be dealt with at that time, the first application of machine learning for the simulation of pure liquid water in 2002147 aimed for the inclusion of polarization in the TIP4P water model.148 For this purpose, a feed-forward neural network was used to represent the many-body interactions in dimers of rigid water molecules. This water model, named T4NN, was trained with MP2 reference data and was then used in Monte Carlo simulations to determine a range of properties of water, such as its density, heat capacity, and radial distribution functions. Overall, the agreement of many properties with experiment under standard conditions could be significantly improved with respect to the underlying TIP4P model, while the transferability to different temperatures posed a challenge.

In 2013, GAPs were used to enhance the accuracy of DFT-based ab initio MD simulations by introducing one- and two-body corrections trained on data of water monomers and dimers at the CCSD(T) level.149 Their application to water clusters and MD simulations of liquid water showed an improved overall potential energy surface, but additional corrections beyond two-body terms were necessary for further improvements, as also demonstrated by comparison with quantum Monte Carlo data.150 

The first full-dimensional MLP for bulk liquid water and ice, which did not rely on a force field or a DFT baseline potential, was reported in 2016.151 In this work, a series of HDNNPs were trained using DFT reference data obtained for different generalized gradient approximation (GGA) functionals, with and without dispersion corrections. This allowed us, for the first time, to benchmark the quality of common DFT functionals in the description of computationally demanding properties of water, such as the density anomaly, the melting temperature, the viscosity, and the dielectric constant. The calculation of such quantities requires extensive simulations of large systems, which are prohibitively expensive using ab initio MD directly but become affordable with HDNNPs. In this work, particular attention was devoted to studying the effect of van der Waals interactions, which were shown to govern the flexibility of the hydrogen bond network and, hence, play a crucial role in determining the properties of water and ice. In fact, if van der Waals forces are neglected, the density maximum of water disappears and ice becomes denser than liquid water. In follow-up work, HDNNPs based on BLYP-D3 and RPBE-D3 data were used to study the density anomaly of water at negative pressures (for both functionals)152 and the kinetics of the ice–water interface (only for RPBE-D3).153

Shortly after the first full-dimensional MLP for water, a HDNNP trained with B3LYP + D3 data was used in conjunction with path integral MD simulations to study nuclear quantum effects (NQEs) of liquid water close to the triple point.154 Since then, several other studies of nuclear quantum fluctuations in bulk liquid water and ice have followed,155–158 including a thermodynamic stability analysis of liquid water as well as hexagonal and cubic ice employing hybrid DFT data,159 and, more recently, a study of NQEs of liquid water based on the random phase approximation.160 

The past few years have witnessed a significant expansion in the use of MLPs for bulk water and ice. This growth encompasses many applications but also the development of methods, tools, and extensive benchmarking. For instance, the vibrational spectroscopy features of liquid water were extensively studied over the full frequency spectrum taking into account the effect of temperature and overcoordinated hydrogen-bond environments employing a HDNNP based on revPBE-D3.161,162 Embedded atom neural networks (EANNs) have been used to represent tensorial properties in water describing vibrational features with the revPBE0-D3 functional.163 Moreover, using water as a test example, DeePMD was introduced in 2018,71 which, alongside HDNNPs, emerged as one of the principal methods for modeling water using MLPs. DeePMD has also been coupled with empirical force fields164 and employed to develop coarse-grained water models.165 Some applications of DeePMD include the analysis of hydrogen bond dynamics in supercritical water,166 the comparison of light and heavy water to assess isotope effects,167,168 and the calculation of vibrational densities of states.169 

Crucial for the computation of vibrational features is the ability to determine the electronic polarizability tensor. In recent work, DeePMD was combined with an additional deep neural network to learn the environmental dependence of the polarizability tensor.170 As demonstrated using the SCAN (Strongly-Constrained and Appropriately-Normed) functional as a reference, this approach yields accurate Raman spectra of liquid water. Furthermore, DeePMD has been integrated with a deep neural network trained to predict Wannier centers based on local environments. This approach allowed us to compute infrared spectra171,172 and to determine the temperature dependence of the dielectric constant.173 The methods can also be extended to account for quadrupole moments.174 Recently, the description of tensorial properties, such as the polarizability tensor, has been fitted to MD simulations a posteriori using equivariant neural networks to describe infrared spectra.175 Moreover, a combination of HDNNP and GPR has allowed us to model the hyper-Raman spectra of water, which helped us understand the differences in the OH stretch mode between infrared and Raman spectra.176 

The accuracy and computational efficiency of MLPs have made it possible to accurately determine the thermodynamic properties of water, including its phase diagram. For instance, the thermodynamic properties of water have been investigated with DeePMD based on SCAN177,178 and with a HDNNP based on revPBE0-D3.159,179 The phase behavior of water under extreme conditions expected in a planetary environment was also studied employing a HDNNP.180 Other aspects addressed with MLPs include the study of heat transport181,182 and the viscosity.183 More recently, the thermodynamics of water has been investigated with a neuroevolution potential.184 

Studying the mechanism and kinetics of phase transitions is computationally very demanding and thus completely out of reach for ab initio simulations. MLPs, however, can be used to simulate systems of millions of water molecules with ab initio accuracy185 such that the simulation of phase transitions is now possible. One example is a recent investigation of the homogeneous nucleation of ice in supercooled water studied using DeePMD, in combination with the seeding methods,186,187 in a system of hundreds of thousands of water molecules.188 These calculations yielded nucleation rates consistent with experimental measurements. Very recently, advanced sampling techniques covering 36 µs of total simulation time have been used to probe the atomic structure of the critical nucleus.189 In addition, the liquid–liquid transition in supercooled water has been investigated. An initial attempt using DeePMD based on the SCAN functional found indications of this transition through anomalies in thermodynamic response functions.190 Two years later, the existence of this transition was conclusively demonstrated191 and its relation with the melting curves of ice polymorphs has been investigated.192 Finally, building on previous studies, the transition between ice Ih and its proton-ordered counterpart ice XI, mediated by ionic defects, has been studied based on the DeePMD model.193 

Since modern MLPs can capture reactions, important processes such as proton transfer and autoionization are accessible. In this regard, HDNNPs have allowed us to describe the transport of hydronium and hydroxide ions, including nuclear quantum effects,194 as well as the free energetics and mechanism of water dissociation,195 making it possible to compute the equilibrium pKw of water.196
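Once the dissociation free energy is available from such simulations, obtaining pKw is simple arithmetic via pKw = ΔG/(RT ln 10). As an illustrative consistency check (not a result from the cited work), the experimental free energy of roughly 19.1 kcal/mol at 298 K corresponds to the familiar value of about 14:

```python
import math

R_KCAL = 1.98720425864083e-3  # gas constant in kcal/(mol*K)

def pkw_from_free_energy(delta_g_kcal, temperature=298.15):
    """pKw = DeltaG / (RT ln 10) for the autoionization
    2 H2O -> H3O+ + OH- at the standard 1 mol/l reference state."""
    return delta_g_kcal / (R_KCAL * temperature * math.log(10.0))
```

With these inputs, pkw_from_free_energy(19.1) evaluates to approximately 14.0.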

The application of MLPs has been facilitated by careful benchmarking and transferability studies and the development of new ML-based methods and workflows. Some of these ML methods have been used beyond the fitting of potential energy surfaces, as is the case for ML classifiers of phases197,198 and dynamical processes.199 One particularly interesting finding regarding the transferability of MLPs is that liquid structures already contain the relevant information required to reproduce ice phases,200 including even ice–water interfaces. In particular, this was observed in studies of homogeneous nucleation, in which an empirical potential was benchmarked against its MLP representation containing only liquid structures.201 The performance of different density functionals [Perdew–Burke–Ernzerhof (PBE), SCAN, vdW-cx, and optB88-vdW] for modeling water and ice has been compared.202 Moreover, HDNNP and GPR trained on the same dataset have been shown to be equivalent when compared over different thermodynamic properties of liquid water, although HDNNPs seem to be more demanding in terms of the required training data203 but are computationally more efficient. In fact, the role of the training data has been the focus in other studies.204 Furthermore, graph neural networks (GNNs), which do not require predefined structural descriptors, have been applied to accelerate molecular dynamics simulations.205 The selection of descriptors has also been automatized for HDNNPs,206,207 whereas GPR-based potentials have been employed in on-the-fly learning workflows.204,208

Empirical potentials have also benefited from the development of MLPs. Coarse-grained MLPs have emerged,165,209–213 including an approach based on equivariant neural networks,214 and empirical force fields have been parameterized using ML algorithms.215–218 Moreover, the addition of polarization to empirical force fields has been revisited, including charge transfer.219 DeePMD has been used to fit an accurate but costly many-body potential,220 reducing its computational cost by one order of magnitude.221 Moreover, a GNN has been applied to estimate Bayesian uncertainty in molecular dynamics simulations based on an empirical potential.222 

Further progress has come again from water clusters. HDNNPs, PIP-based potentials, and GAPs have been shown to be equivalent in representing many-body interactions in water clusters,223 which can be employed in the construction of improved water potentials for bulk water. In fact, recent advances suggest that reference data obtained exclusively for water clusters could be sufficient to train accurate MLPs even for the bulk liquid phase,224–228 including results from Gaussian-moment neural networks (GMNNs).228 Recently, even gold-standard CCSD(T)-level accuracy for bulk water potentials has been reached by training to large clusters or periodic structures.121,229,230

An important current topic of research is the inclusion of long-range interactions,86 which are not explicitly considered in many MLPs. This problem has been addressed by introducing non-local representations of the system remapped as local and equivariant feature vectors, capturing non-local and non-additive effects.231 Another approach to treat this issue is to learn the long-range response with a self-consistent field neural network, which has been shown to produce correct long-range polarization correlations in liquid water, as well as the correct response of liquid water to external electrostatic fields.232,234

Architectures like equivariant neural networks have been combined with empirical electrostatics and dispersion.235 Such models are highly accurate in learning reference datasets,80,81,235–238 and their adoption is growing rapidly.

Neural network potentials have also been used to investigate the structure, thermodynamics, and spectroscopic properties of the liquid/vapor interface. As the local environments close to the interface are highly anisotropic and thus very different from the bulk, it is important that the training set explicitly includes data for interface configurations.239 The structure of such configurations has been analyzed in detail using SOAP descriptors and local order parameters.240 Investigating the structure of the interface reveals the prevalence of orientations with the dipole moment roughly parallel to the surface with one OH bond pointing out of it,239 corroborating insights gained from sum frequency generation (SFG) measurements.241 By using the surface-sensitive velocity autocorrelation function,242 such SFG spectra of the liquid/vapor interface were calculated from path integral molecular dynamics based on a HDNNP trained at the revPBE0+D3 level.243 Recently, SFG spectra have been computed fully from first principles using a HDNNP combined with GPR.244 
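
The underlying idea of obtaining spectra from velocity autocorrelation functions can be sketched generically. The snippet below computes a plain vibrational density of states rather than the surface-sensitive SFG response of the cited work; array shapes and names are illustrative.

```python
import numpy as np

def vibrational_spectrum(velocities, dt):
    """Power spectrum of the velocity autocorrelation function (VACF).

    velocities : array of shape (n_frames, n_atoms, 3) from an MD trajectory
    dt         : time interval between stored frames

    Returns (frequencies, intensities); the intensities are proportional
    to the vibrational density of states.
    """
    n = velocities.shape[0]
    v = velocities.reshape(n, -1)              # flatten atom/Cartesian axes
    # autocorrelation via FFT (Wiener-Khinchin), averaged over components
    f = np.fft.rfft(v, n=2 * n, axis=0)
    acf = np.fft.irfft(f * np.conj(f), axis=0)[:n].mean(axis=1)
    acf /= acf[0]                              # normalize so that C(0) = 1
    freqs = np.fft.rfftfreq(n, d=dt)
    spectrum = np.abs(np.fft.rfft(acf))
    return freqs, spectrum
```

Restricting the velocities entering the correlation function to molecules near the interface, and weighting them appropriately, is what makes the cited approach surface sensitive.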

In another study, it was found that a DeePMD potential relying only on local atomic energies can be applied to the liquid/vapor interface.245 However, the explicit inclusion of long-range interactions was shown to be beneficial, confirming the results of previous studies carried out for empirical potentials.246 The effect of long-range interactions was tested for a water molecule moving away from the liquid/vapor interface using an extension of DeePMD, including long-range electrostatics.97 The case of curved liquid/vapor interfaces has been addressed as well. For instance, DeePMD has been employed to investigate the formation of bubbles in metastable water.247 Furthermore, it was shown that the free energy of water dissociation at the liquid/vapor interface of droplets and films deviates from the bulk, leading to an enrichment of hydronium cations at the interface and a depletion of hydroxide anions.248 

Beyond pure water, MLPs have been used in numerous simulations of electrolyte solutions.249 Already in 1998, a feed-forward neural network was employed to represent the three-body interaction energies in H2O–Al3+–H2O clusters with the aim of improving the force field description of Al3+ ions dissolved in bulk water.56 This work represents an important milestone in the incorporation of permutation symmetry into structural descriptors. Later, HDNNPs have enabled the construction of full-dimensional DFT-quality PESs for aqueous NaOH solutions over the entire solubility range.250 In this work, it has been found that as the NaOH concentration increases, the primary mechanism for proton transfer shifts from being acceptor-driven, influenced by the pre-solvation of hydroxide ions, to donor-driven, controlled by the pre-solvation of water molecules. In addition, with increasing concentration, octahedral coordination geometries become less favored relative to trigonal prism geometries.251 A novel water exchange mechanism has been identified around Na+(aq) ions in basic (high pH) solutions.252 Studies comparing classical and ring-polymer molecular dynamics based on the HDNNP revealed that nuclear quantum effects significantly reduce proton transfer barriers, thus increasing proton transfer rates. 
This leads to an enhanced diffusion coefficient, especially for OH−, and a shorter mean residence time of molecules in the first hydration shell around Na+ at high NaOH concentrations.253 Moreover, elevated temperatures in concentrated NaOH solutions amplify both the contributions of proton transfer to ionic conductivity and deviations from the Nernst–Einstein relation.254 Further applications of HDNNPs include investigations of fluoride and sulfate ions in solution.107 Employing similar methodologies and training on revPBE-D3 data, the dissolution mechanisms of NaCl in water have also been addressed.255 Another example is the use of HDNNPs to study zinc ion hydration in water,256 with molecular dynamics simulations matching both the experimentally observed zinc–water radial distribution function and the x-ray absorption near edge structure spectrum. Moreover, HDNNP-based studies have revealed the impact of surface stratification on the interfacial water structure in electrolyte solutions.257 Equivariant neural network potentials have also been employed to study various electrolyte solutions,258 including aqueous lithium chloride259 and aqueous sodium chloride.260 A genetic algorithm has also been utilized to study hydrated zinc(II) ion clusters.261 In addition, microhydrated sodium ions with a few water molecules have been studied for both the potential energy and the dipole moment employing PIPs.262 
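
The Nernst–Einstein estimate against which such deviations are measured follows directly from the self-diffusion coefficients of the ions; a minimal sketch in consistent units is shown below (names and the reduced-unit test values are illustrative).

```python
def nernst_einstein_conductivity(species, volume, k_T, e=1.602176634e-19):
    """Ideal (Nernst-Einstein) ionic conductivity from self-diffusion data.

    species : list of (count, charge_number, diffusion_coefficient) tuples
    volume  : simulation-cell volume
    k_T     : thermal energy k_B * T
    e       : elementary charge (set to 1 for reduced units)

    Correlated ion motion makes the true conductivity deviate from this
    estimate; the ratio sigma_true / sigma_NE quantifies that deviation.
    """
    return e**2 / (volume * k_T) * sum(n * z**2 * D for n, z, D in species)
```

Proton transfer contributes an additional, structural transport channel that the self-diffusion coefficients of intact ions do not capture, which is one reason the deviations grow at high concentration and temperature.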

DeePMD potentials have been used to study sodium chloride, potassium chloride, and sodium bromide at various concentrations.263 These studies revealed that the structural changes due to the ions are confined to the immediate vicinity of the ions, where they disrupt the network of hydrogen bonds. Beyond these regions, the distribution of oxygen atoms relative to one another remains largely unchanged compared to pure water. In a related study, the dielectric permittivity of sodium chloride solutions has also been investigated.264 Using DeePMD potentials, the uptake of N2O5 into aqueous aerosols has been examined, a process that is challenging to study experimentally due to the fast reaction kinetics of N2O5.265 Furthermore, the diffusivity of water in aqueous cesium iodide and sodium chloride solutions has been examined using a DeePMD framework trained on DFT data using the revPBE-D3 functional.266 Such simulations addressing the characteristic behavior of different ions are not readily accessible through traditional force field-based molecular dynamics simulations due to the less ion-specific description of ion–water interactions in such force fields.
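
Structural comparisons of this kind rest on radial distribution functions; a minimal g(r) evaluation for oxygen positions in a cubic periodic cell might look as follows (a sketch, not code from the cited studies).

```python
import numpy as np

def radial_distribution(positions, box, r_max, n_bins=200):
    """O-O radial distribution function g(r) in a cubic periodic box.

    positions : (N, 3) array of oxygen coordinates
    box       : cubic box edge length (r_max should not exceed box / 2)
    """
    n = len(positions)
    # minimum-image convention for all pair separations
    d = positions[:, None, :] - positions[None, :, :]
    d -= box * np.round(d / box)
    r = np.sqrt((d**2).sum(-1))[np.triu_indices(n, k=1)]
    hist, edges = np.histogram(r, bins=n_bins, range=(0.0, r_max))
    rho = n / box**3                      # number density
    shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    # normalize pair counts by the ideal-gas expectation per shell
    g = hist / (0.5 * n * rho * shell)
    return 0.5 * (edges[1:] + edges[:-1]), g
```

For an uncorrelated (ideal-gas) configuration, g(r) fluctuates around one; the ion-induced perturbations discussed above show up as deviations confined to the first one or two hydration shells.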

Solid–liquid interfaces are of high interest for catalysis and electrochemistry. Due to the very different bonding in liquid water and in crystalline surfaces, such as metals or oxides, constructing unified atomistic potentials that can describe all subsystems of solid–liquid interfaces with balanced high accuracy presents a substantial challenge for empirical potentials. Moreover, in many cases, water is not only in contact with the surface but can also dissociate and recombine at a much higher rate than in the bulk liquid. Consequently, the use of reactive potentials, which can describe the making and breaking of bonds, is mandatory. MLPs are ideally suited for this purpose.

In 2014, a HDNNP for a thin water layer on top of 55-atom CuAu alloy clusters with varying stoichiometries, as well as a slab model, was reported and used to study the effect of water on the stability of different interface compositions by Monte Carlo simulations.267 While this work still exhibited a rather large error of about 12 meV/atom, the accuracy of HDNNPs for solid–liquid interfaces has improved significantly in the following years. For instance, in studies of water at various surfaces of copper268,269 and zinc oxide,270,271 energy RMSEs of less than 1 meV/atom could be reached. Detailed convergence tests with respect to the required system size showed that the thickness of the liquid phase between the slab surfaces needed to decouple the two surfaces by bulk-like water is at least about 35–40 Å.268 Such a size is beyond reach in ab initio molecular dynamics but necessary to ensure that the central water molecules have bulk-like environments in their local vicinity and do not experience any significant influence from the altered water structure near the interfaces or from the surfaces directly.

While it has been found using HDNNPs that water does not spontaneously react with defect-free surfaces of certain metals, such as copper, on nanosecond time scales,268 fast dissociation and recombination processes leading to the formation of surface hydroxides have been observed on zinc oxide surfaces.270,271 These processes are often governed by the surrounding hydrogen bond networks, significantly influencing the free energy barriers of proton transfer processes.270 This phenomenon has been confirmed for water at TiO2 surfaces using DeePMD potentials,272,273 which have also been employed to explore the impact of slab thickness.274 Furthermore, the use of HDNNPs for computing anharmonic frequencies has been suggested as a method to elucidate the role of hydrogen bonds in surface processes.275 

Depending on the specific surface geometry, proton transfer events can lead to various topologies of proton transport networks along the surface, which can be either one-dimensional or two-dimensional. This has been demonstrated for several surfaces of zinc oxide276 and the lithium intercalation compound LiMn2O4277 using HDNNPs. Surface defects, often stabilized by solvation compared to the vacuum interface, have also been a subject of study. The mobility of adatoms has been found to vary significantly across different low-index surfaces of copper.269 Investigations into the defective Zr7O8N4/H2O and pristine ZrO2/H2O interfaces using neural network potentials278 revealed a bilayer water structure for Zr7O8N4 and a monolayer structure for ZrO2. Oxygen vacancies on the Zr7O8N4 surface have been suggested as active sites for the oxygen reduction reaction. Furthermore, neural networks have been used to identify different oxidation states of transition metal ions at oxide–water interfaces,277 which enables the characterization of electronic structures relevant for catalytic applications.

Due to its importance in catalysis, the TiO2–water interface has so far been the most intensely studied interface using MLPs.107,272–274,279–282 Investigations into the water coverage on the anatase (101) TiO2 surface using the DeePMD potential279 have shown that higher water coverage prompts significant reorganization of the water monolayer at O2c sites, leading to the formation of a two-dimensional hydrogen bond network with closely linked pairs of water molecules on neighboring Ti5c and O2c sites. Other DeePMD-based studies have examined the impact of water dissociation on thermal transport at the TiO2–water interface.280 While previous research on TiO2–water interfaces mainly focused on the anatase (101) and rutile (110) TiO2 surfaces, recent MLP-based studies281 have explored seven different TiO2 surfaces using three distinct functionals: SCAN, PBE, and optB88-vdW. These studies found that water dissociation is more likely on the anatase (100), anatase (110), rutile (001), and rutile (011) surfaces, while molecular adsorption is the primary process on the anatase (101) and rutile (100) surfaces. Moreover, simulations for rutile (110) showed that the slab thickness significantly influences the results, with thicker slabs favoring molecular adsorption. DeePMD has also been used to study amorphous TiO2 (a-TiO2) to compare its behavior with that of well-studied crystalline TiO2 at aqueous interfaces.282 These studies demonstrated that water molecules on the a-TiO2 surface do not exhibit the distinct layering typical of the aqueous interface of crystalline TiO2, resulting in an approximately tenfold increase in water diffusion at the interface.

Other cases of employing MLPs to study solid–water interfaces include the use of HDNNPs for the Pt(111)–water interface to investigate the interaction between water and hydroxylated metal surfaces283,284 and for the hematite–water interface,285 revealing solvation dynamics at various time scales. DeePMD has been utilized for studying the TiS2/water interface286 to examine the influence of TiS2 surface termination on the structure of interfacial water. Moreover, DeePMD has been applied to the IrO2–water interface, exploring the hydration structure, proton transfer mechanisms, and acid–base characteristics,287 as well as to the GaP(110)–water interface,288 which has been shown to require about 12 ns to reach equilibrium, a duration not achievable with traditional AIMD simulations. DeePMD has also been used for the construction of a potential aimed at studying ice nucleation at the microcline feldspar surface using the SCAN functional289 and for investigating the impact of water dissociation on thermal transport at the Cu–water interface.290 Apart from HDNNPs and DeePMD, an equivariant graph neural network has been employed to study the oxygen reduction reaction at the Au(100)–water interface,291 and on-the-fly learning kernel-based regression has been applied to investigate water adsorption on MgO and Fe3O4 surfaces, including surface reconstructions.292 

Confined water, which exhibits properties notably differing from bulk water, has also been studied using MLPs. Some examples include HDNNPs for water confined between two-dimensional boron nitride sheets293 and MoS2107 and water between graphite layers using DeePMD294,295 and committees of HDNNPs;296 the latter method was also applied to confinement within a graphene-like material.297 Furthermore, MD simulations have been used to investigate water in single-walled carbon and boron nitride nanotubes,107,298 finding a fivefold reduction in friction in carbon tubes compared to boron nitride, attributed to strong hydrogen–nitrogen interactions.298 Ion concentration profiles under nanoconfinement299 have also been studied using neural network potentials, focusing on the effects of channel widths, ion molarity, and ion types.

Apart from studies of pure water, electrolytes, and solid–liquid interfaces, which have been the focus of MLP-based atomistic simulations for several years, the use of MLPs for aqueous systems has increasingly diversified and now covers essentially all fields of simulations involving water. An exhaustive coverage of the related literature is beyond the scope of this Perspective; in this section, we simply point interested readers to several typical applications of MLPs in these rapidly growing fields.

A prominent use of atomistic simulations is to study chemical reactions of organic molecules in solution. Examples for the application of MLPs are corrections to quantum mechanical/molecular mechanical (QM/MM) simulations of SN2 reactions in water,300 the solvation of protein fragments,93 the decomposition of urea in water,301 the computation of free energy profiles of reactions of organic molecules,302 and enzyme reactions.303 Further studies include the quantum dynamics of an electron solvated in water304 and the excited state of CH3NNCH3 surrounded by several water molecules.305 

Still, the all-atom description of chemical processes in solution can be demanding, and, consequently, simplified MLPs have been proposed as well. For example, the solvated alanine dipeptide306,307 and the folding/unfolding of chignolin306,308 have been studied using the coarse-grained CGNet potential. Moreover, a DeepPot-SE model has been used to describe molecules under the influence of an implicit solvent.309 

Additional applications of MLPs for aqueous systems include the study of diffusion in hydrogen hydrates,310 the determination of vibrational frequency shifts of the formic acid C=O stretch and the MeCN C≡N stretch in water,311 retinoic acid in water,312 graph-convolutional neural networks for benchmarking sets of solutes and chemical reactions in water,313 hydration dynamics and IR spectroscopy of 4-fluorophenol,314 zinc protein studies,315 conformational shifts of stacked heteroaromatics,316 and solvation free energy prediction of organic molecules in redox flow batteries.317 

In recent years, MLPs have reached a high level of maturity, and, currently, a transition from proof-of-concept and benchmark studies to practical simulations of a wide range of complex systems is taking place. Therefore, it can be anticipated that MLPs will allow us to overcome the limitations of conventional methods, such as empirical potentials in terms of accuracy and ab initio molecular dynamics in terms of efficiency, paving the way for simulations of extended systems with unprecedented accuracy. MLPs have demonstrated this capability already for a broad spectrum of aqueous systems, ranging from neutral and protonated water clusters to bulk liquid water and ice and liquid/vapor interfaces, and from electrolyte solutions to complex solid–water interfaces (see Fig. 1). In all these studies, MLPs have enabled simulations with first-principles accuracy that previously would have been prohibitively demanding, as evidenced by the rapidly growing number of publications in the field shown in Fig. 2.

FIG. 2.

Overview of the number of articles published per year for different types of MLPs applied to water and aqueous systems as discussed in this Perspective. These are, in order of first use for these systems, neural network potentials (NNPs) based on simple feed-forward neural networks, Gaussian process regression (GPR), high-dimensional neural network potentials (HDNNPs), deep potential molecular dynamics (DeePMD), support vector regression (SVR), random forest (RF), embedded atom neural networks (EANNs), graph neural networks (GNNs), Gaussian-moment neural networks (GMNNs), comprehensive genetic algorithm (CGA), and neuroevolution potentials (NEPs).


MLPs bridge the gap between two traditional approaches in atomistic simulation, ab initio and force-field-based MD simulations, offering advantages over both. MLPs excel by enabling significantly longer length and time scales compared to AIMD simulations, thanks to their computational efficiency. This extension is crucial for accurately computing properties, such as equations of state across a broad range of parameters, kinetics of phase transitions, and interface mobility, and to study supercooled water as well as solvation dynamics on long time scales. In comparison with classical force fields, MLPs can potentially match the accuracy of the underlying electronic structure calculations, which usually surpass the accuracy of traditional force fields. Furthermore, MLPs are reactive since they do not rely on predefined chemical bonds, unlike most traditional force fields. This reactivity is crucial for correctly modeling systems with complex interactions, such as solid–liquid interfaces and aqueous solutions, where water dissociation, recombination, and proton transfer occur. It is also essential for accurately representing the entire range of acid/base chemistry, which fundamentally relies on proton transfer processes.

Over the past two decades, progress in MLPs for aqueous systems has focused on different frontiers. While right from the start the highly flexible functional form of ML algorithms has enabled a numerically very accurate representation of the electronic structure reference data, a severe challenge in the early years has been the very limited number of degrees of freedom that could be considered. Only the development of modern descriptors for the atomic environments allowed us to extend MLPs to condensed systems such as liquid water with all their associated degrees of freedom. Recently, message passing neural networks75 have become a promising alternative to the use of predefined descriptors, opening many new exciting possibilities for the construction of MLPs with higher accuracy, based on less data, in particular if equivariant features are used. Another frontier that has increasingly received attention in recent years is the incorporation of physical concepts into hitherto purely mathematical machine learning potentials, with many developments specifically aiming for improved descriptions of long-range electrostatic interactions, van der Waals forces, long-range charge transfer,52 and electron densities.318 This inclusion of physically meaningful terms, not only in the total energy expression but also in the form of novel descriptors,231,319 will further increase the accuracy and transferability of the potentials.
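
To make the notion of a predefined atomic-environment descriptor concrete, a radial symmetry function of the Behler–Parrinello type can be sketched as follows; the parameter values in the test are illustrative.

```python
import numpy as np

def radial_symmetry_function(r_ij, eta, r_s, r_c):
    """Behler-Parrinello-type radial symmetry function for one atom.

    r_ij : distances from the central atom to its neighbors
    eta  : Gaussian width parameter
    r_s  : Gaussian center (radial shift)
    r_c  : cutoff radius; contributions vanish smoothly at r_c
    """
    r = np.asarray(r_ij, dtype=float)
    # smooth cutoff: 0.5 [cos(pi r / r_c) + 1] inside r_c, zero outside
    fc = np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)
    return float(np.sum(np.exp(-eta * (r - r_s) ** 2) * fc))
```

Because the descriptor is a sum over neighbors, it is invariant under permutation of like atoms and, depending only on distances, under rotation and translation, which is precisely the symmetry information message-passing architectures learn instead from the data.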

To effectively simulate chemical processes in the aqueous phase, the underlying potential function must rely on accurate electronic structure calculations, since the chosen reference method represents a natural limit for the accuracy of MLPs. While coupled cluster accuracy has already been achieved in MLPs for liquid water,121,229,230 which would have been unthinkable with conventional empirical potentials, reaching this gold standard for more complex systems, such as solid–liquid interfaces, is very challenging. Therefore, DFT will likely remain the dominant method for the reference electronic structure calculations of many systems in the foreseeable future, in particular for those involving solids. Generalized gradient approximation (GGA) functionals have been the most commonly employed in the study of aqueous systems, as they offer a good compromise between computational cost and accuracy. Increasingly powerful computing resources, however, now enable the use of more advanced and computationally expensive functionals, such as meta-GGAs (particularly SCAN) and hybrid functionals, a level of accuracy that has remained essentially inaccessible to on-the-fly ab initio MD to date. Consequently, complex aqueous systems can now be investigated with a previously unattainable level of accuracy, enabling predictive simulations. Moreover, MLPs can be employed to evaluate the accuracy of the underlying level of theory with respect to experimental values in a broad range of scenarios. This extends beyond the traditional comparison at standard temperature and pressure, allowing for a comprehensive evaluation that may provide insightful guidance for the development of theoretical methods.

The functional flexibility is a key property of MLPs, but it is a double-edged sword: on the positive side, this flexibility enables the accurate approximation of the PES based on the reference data. On the negative side, however, such flexibility severely limits the extrapolation capabilities of MLPs to chemical environments not adequately sampled during the training process. In fact, extrapolation to unfamiliar environments can lead to unphysical structures and completely wrong simulation results. Therefore, the construction and validation of MLPs have to be done with great care to ensure that all relevant local environments are included in the training set. Moreover, it is much more challenging than in the case of simpler empirical potentials to provide “boxed” MLPs for general usage, since not only the underlying parameters but also the range of validity is crucial information for successful applications. For instance, when studying solid–water interfaces, it becomes necessary to train the MLP not only on the bulk material and bulk water separately but also on systems that include all relevant interface configurations. Thus, increasing the complexity of the system also increases the number of local environments that must be included in the training data. Although AIMD simulations are sometimes used to generate initial training sets, they are usually insufficient since they often fail to capture less frequently visited structures. Fortunately, this challenge has been largely overcome in recent years by incorporating active learning into the generation of the reference data. An alternative, recently emerging approach focuses on establishing foundation models to be used as a starting point for MLP development.237 This might improve the transferability and contribute to the development of more accurate MLPs.
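
Active learning commonly uses the disagreement among a committee of MLPs as an uncertainty proxy: structures on which the committee members differ strongly are sent to the reference electronic structure method and added to the training set. A minimal query criterion might look as follows (a sketch; the array shapes and any threshold choice are assumptions).

```python
import numpy as np

def committee_disagreement(force_predictions):
    """Active-learning query criterion from a committee of MLPs.

    force_predictions : array of shape (n_models, n_atoms, 3) holding the
    forces predicted by each committee member for one candidate structure.

    Returns the maximum per-atom standard deviation across the committee;
    structures exceeding a chosen threshold are flagged for recomputation
    with the reference method.
    """
    p = np.asarray(force_predictions, dtype=float)
    std = p.std(axis=0)                      # (n_atoms, 3) spread over models
    per_atom = np.linalg.norm(std, axis=1)   # combine Cartesian components
    return float(per_atom.max())
```

Since the committee members agree closely in well-sampled regions and diverge where the PES is poorly constrained, iterating this criterion concentrates the expensive reference calculations exactly on the rarely visited environments that AIMD-generated training sets tend to miss.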

While, in this Perspective, we have focused on the construction and application of accurate and efficient interatomic potentials, machine learning approaches can be useful in several other ways for the atomistic simulation of aqueous systems and, more generally, of materials and biomolecular systems.47 For instance, neural networks have been employed to classify local structures in liquid water and various forms of ice.197,198 The accurate identification of molecular structures with high spatial resolution is important, for instance, in the study of crystallization, melting, crystal growth, and the formation and migration of defects. Another challenge in molecular simulations lies in reducing or even completely removing correlations in the statistical sampling of configuration space as they occur in sequential sampling methods, such as molecular dynamics and Markov chain Monte Carlo simulations. Here, normalizing flows320,321 or other generative methods can play an important role for sampling the equilibrium distribution and also for creating new stable crystal structures.322 Moreover, machine learning approaches have been suggested as a way to discover reaction coordinates and enhance the sampling of rare transitions occurring in complex molecular systems.323,324 There is little doubt that in the years to come, further new ML and AI tools will be applied to the computational investigation of matter at the atomistic level, creating new opportunities for studying complex aqueous systems.

Machine learning and, more generally, artificial intelligence are currently revolutionizing the way we do science, providing new opportunities not achievable with traditional approaches. In the field of computational materials science, the advent of accurate, flexible, and efficient MLPs has dramatically increased the time and length scales accessible to atomistic simulations of materials with ab initio accuracy. In this Perspective, we have provided an overview of the key concepts of such machine-learned potentials with a focus on their application to water and aqueous systems. The broad spectrum of systems successfully explored with these techniques, including water clusters, bulk liquid water and ice, the liquid/vapor interface, electrolyte solutions, and solid–liquid interfaces, underscores the flexibility, efficiency, and high level of maturity these methods have reached in recent years.

Since machine-learned potentials accurately reproduce the underlying reference data obtained with electronic structure methods, while requiring a much lower computational effort in applications, they now make it possible to compute complex materials properties, such as phase diagrams. In this way, they not only allow us to gain new insights into a variety of systems, but they also provide a way to truly test the theoretical description underlying the reference data and to reveal its possible limitations. This facilitates the generation of high-quality reference data in the future as a basis for truly predictive ML-based computer simulations of complex materials.

In summary, modern MLPs have created new opportunities for the investigation of aqueous systems that would have been unimaginable with conventional methods for the foreseeable future. Carefully trained and validated MLPs can be employed to study complex reactive aqueous systems accurately across large time and length scales, without imposing ad hoc empirical constraints. While predicting the future of this rapidly evolving field is challenging, the remarkable progress made to date suggests that we can expect exciting new developments and some surprising breakthroughs in the years to come.

J.B. and A.O. are grateful for the funding by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Grant No. TRR/CRC 247 (A10, Project No. 388390466) and under Germany’s Excellence Strategy—Grant No. EXC 2033 RESOLV (Project No. 390677874). C.D. and P.M.D.H. acknowledge the funding from the Austrian Science Foundation FWF through the Projects Doctoral College Advanced Functional Materials (Grant No. DOC 85-N) and the SFB TACO (Grant No. F-81).

The authors have no conflicts to disclose.

All authors contributed equally to this work.

Amir Omranpour: Visualization (equal); Writing – original draft (equal); Writing – review & editing (equal). Pablo Montero de Hijes: Visualization (equal); Writing – original draft (equal); Writing – review & editing (equal). Jörg Behler: Conceptualization (equal); Funding acquisition (equal); Supervision (equal); Writing – original draft (equal); Writing – review & editing (equal). Christoph Dellago: Conceptualization (equal); Funding acquisition (equal); Supervision (equal); Writing – original draft (equal); Writing – review & editing (equal).

Data sharing is not applicable to this article as no new data were created or analyzed in this study.

(
John Wiley & Sons, Ltd.
,
1975
), pp.
1
101
.
16.
B.
Guillot
, “
A reappraisal of what we have learnt during three decades of computer simulations on water
,”
J. Mol. Liq.
101
(
1–3
),
219
260
(
2002
), part of the Special Issue: Molecular Liquids. Water at the New Millenium.
17.
C.
Vega
,
J. L. F.
Abascal
,
M. M.
Conde
, and
J. L.
Aragones
, “
What ice can teach us about water interactions: A critical comparison of the performance of different water models
,”
Faraday Discuss.
141
,
251
276
(
2009
).
18.
A. V.
Onufriev
and
S.
Izadi
, “
Water models for biomolecular simulations
,”
Wiley Interdiscip. Rev.: Comput. Mol. Sci.
8
(
2
),
e1347
(
2018
).
19.
G. A.
Cisneros
,
K. T.
Wikfeldt
,
L.
Ojamäe
,
J.
Lu
,
Y.
Xu
,
H.
Torabifard
,
A. P.
Bartók
,
G.
Csányi
,
V.
Molinero
, and
F.
Paesani
, “
Modeling molecular interactions in water: From pairwise to many-body potential energy functions
,”
Chem. Rev.
116
(
13
),
7501
7528
(
2016
).
20.
T. E.
Gartner
III
,
K. M.
Hunter
,
E.
Lambros
,
A.
Caruso
,
M.
Riera
,
G. R.
Medders
,
A. Z.
Panagiotopoulos
,
P. G.
Debenedetti
, and
F.
Paesani
, “
Anomalies and local structure of liquid water from boiling to the supercooled regime as predicted by the many-body MB-pol model
,”
J. Phys. Chem. Lett.
13
(
16
),
3652
3658
(
2022
).
21.
J. L. F.
Abascal
,
E.
Sanz
,
R.
García Fernández
, and
C.
Vega
, “
A potential model for the study of ices and amorphous water: TIP4P/Ice
,”
J. Chem. Phys.
122
(
23
),
234511
(
2005
).
22.
S. L.
Bore
,
P. M.
Piaggi
,
R.
Car
, and
F.
Paesani
, “
Phase diagram of the TIP4P/Ice water model by enhanced sampling simulations
,”
J. Chem. Phys.
157
(
5
),
054504
(
2022
).
23.
P.
Montero de Hijes
,
E.
Sanz
,
L.
Joly
,
C.
Valeriani
, and
F.
Caupin
, “
Viscosity and self-diffusion of supercooled and stretched water from molecular dynamics simulations
,”
J. Chem. Phys.
149
(
9
),
094503
(
2018
).
24.
F. H.
Stillinger
and
C. W.
David
, “
Polarization model for water and its ionic dissociation products
,”
J. Chem. Phys.
69
(
4
),
1473
1484
(
1978
).
25.
L.
Ojamäe
,
I.
Shavitt
, and
S. J.
Singer
, “
Potential models for simulations of the solvated proton in water
,”
J. Chem. Phys.
109
(
13
),
5547
5564
(
1998
).
26.
M.
Raju
,
S.-Y.
Kim
,
A. C. T.
van Duin
, and
K. A.
Fichthorn
, “
ReaxFF reactive force field study of the dissociation of water on titania surfaces
,”
J. Phys. Chem. C
117
,
10558
(
2013
).
27.
U. W.
Schmitt
and
G. A.
Voth
, “
Multistate empirical valence bond model for proton transport in water
,”
J. Phys. Chem. B
102
,
5547
5551
(
1998
).
28.
D.
Marx
, “
Proton transfer 200 years after von Grotthuss: Insights from ab initio simulations
,”
ChemPhysChem
7
,
1848
1870
(
2006
).
29.
M. E.
Tuckerman
,
K.
Laasonen
,
M.
Sprik
, and
M.
Parrinello
, “
Ab initio simulations of water and water ions
,”
J. Phys.: Condens. Matter
6
,
A93
A100
(
1994
).
30.
M. J.
Gillan
,
D.
Alfè
, and
A.
Michaelides
, “
Perspective: How good is DFT for water?
,”
J. Chem. Phys.
144
(
13
),
130901
(
2016
).
31.
J. G.
Brandenburg
,
A.
Zen
,
D.
Alfè
, and
A.
Michaelides
, “
Interaction between water and carbon nanostructures: How good are current density functional approximations?
,”
J. Chem. Phys.
151
(
16
),
164702
(
2019
).
32.
A. P.
Gaiduk
,
F.
Gygi
, and
G.
Galli
, “
Density and compressibility of liquid water and ice from first-principles simulations with hybrid functionals
,”
J. Phys. Chem. Lett.
6
(
15
),
2902
2908
(
2015
).
33.
M.
Del Ben
,
M.
Schönherr
,
J.
Hutter
, and
J.
VandeVondele
, “
Bulk liquid water at ambient temperature and pressure from MP2 theory
,”
J. Phys. Chem. Lett.
4
,
3753
(
2013
).
34.
M.
Del Ben
,
J.
Hutter
, and
J.
VandeVondele
, “
Probing the structural and dynamical properties of liquid water with models including non-local electron correlation
,”
J. Chem. Phys.
143
,
054506
(
2015
).
35.
B. J.
Braams
and
J. M.
Bowman
, “
Permutationally invariant potential energy surfaces in high dimensionality
,”
Int. Rev. Phys. Chem.
28
,
577
606
(
2009
).
36.
G. R.
Medders
,
V.
Babin
, and
F.
Paesani
, “
Development of a ‘first-principles’ water potential with flexible monomers. III. Liquid phase properties
,”
J. Chem. Theory Comput.
10
(
8
),
2906
2910
(
2014
).
37.
E.
Palos
,
S.
Dasgupta
,
E.
Lambros
, and
F.
Paesani
, “
Data-driven many-body potentials from density functional theory for aqueous phase chemistry
,”
Chem. Phys. Rev.
4
,
011301
(
2023
).
38.
Q.
Yu
,
C.
Qu
,
P. L.
Houston
,
A.
Nandi
,
P.
Pandey
,
R.
Conte
, and
J. M.
Bowman
, “
A status report on ‘gold standard’ machine-learned potentials for water
,”
J. Phys. Chem. Lett.
14
,
8077
8087
(
2023
).
39.
C. M.
Handley
and
P. L. A.
Popelier
, “
Potential energy surfaces fitted by artificial neural networks
,”
J. Phys. Chem. A
114
,
3371
3383
(
2010
).
40.
J.
Behler
, “
Neural network potential-energy surfaces in chemistry: A tool for large-scale simulations
,”
Phys. Chem. Chem. Phys.
13
,
17930
17955
(
2011
).
41.
J.
Behler
, “
Perspective: Machine learning potentials for atomistic simulations
,”
J. Chem. Phys.
145
,
170901
(
2016
).
42.
O. T.
Unke
,
S.
Chmiela
,
H. E.
Sauceda
,
M.
Gastegger
,
I.
Poltavsky
,
K. T.
Schütt
,
A.
Tkatchenko
, and
K.-R.
Müller
, “
Machine learning force fields
,”
Chem. Rev.
121
(
16
),
10142
10186
(
2021
).
43.
P.
Friederich
,
F.
Häse
,
J.
Proppe
, and
A.
Aspuru-Guzik
, “
Machine-learned potentials for next-generation matter simulations
,”
Nat. Mater.
20
,
750
761
(
2021
).
44.
J.
Behler
and
G.
Csányi
, “
Machine learning potentials for extended systems: A perspective
,”
Eur. Phys. J. B
94
,
142
(
2021
).
45.
V. L.
Deringer
,
M. A.
Caro
, and
G.
Csányi
, “
Machine learning interatomic potentials as emerging tools for materials science
,”
Adv. Mater.
31
,
1902765
(
2019
).
46.
P. O.
Dral
, “
Quantum chemistry in the age of machine learning
,”
J. Phys. Chem. Lett.
11
,
2336
2347
(
2020
).
47.
F.
Noé
,
A.
Tkatchenko
,
K.-R.
Müller
, and
C.
Clementi
, “
Machine learning for molecular simulation
,”
Annu. Rev. Phys. Chem.
71
,
361
390
(
2020
).
48.
C. M.
Handley
and
J.
Behler
, “
Next generation interatomic potentials for condensed systems
,”
Eur. Phys. J. B
87
,
152
(
2014
).
49.
J.
Behler
, “
Four generations of high-dimensional neural network potentials
,”
Chem. Rev.
121
(
16
),
10037
10072
(
2021
).
50.
V. L.
Deringer
,
A. P.
Bartók
,
N.
Bernstein
,
D. M.
Wilkins
,
M.
Ceriotti
, and
G.
Csányi
, “
Gaussian process regression for materials and molecules
,”
Chem. Rev.
121
,
10073
10141
(
2021
).
51.
E.
Kocer
,
T. W.
Ko
, and
J.
Behler
, “
Neural network potentials: A concise overview of methods
,”
Annu. Rev. Phys. Chem.
73
,
163
186
(
2022
).
52.
T. W.
Ko
,
J. A.
Finkler
,
S.
Goedecker
, and
J.
Behler
, “
General-purpose machine learning potentials capturing nonlocal charge transfer
,”
Acc. Chem. Res.
54
,
808
817
(
2021
).
53.
S.
Käser
,
L. I.
Vazquez-Salazar
,
M.
Meuwly
, and
K.
Töpfer
, “
Neural network potentials for chemistry: Concepts, applications and prospects
,”
Digital Discovery
2
,
28
(
2023
).
54.
A. M.
Tokita
and
J.
Behler
, “
Tutorial: How to train a neural network potential
,”
J. Chem. Phys.
159
,
121501
(
2023
).
55.
T. B.
Blank
,
S. D.
Brown
,
A. W.
Calhoun
, and
D. J.
Doren
, “
Neural network models of potential energy surfaces
,”
J. Chem. Phys.
103
,
4129
4137
(
1995
).
56.
H.
Gassner
,
M.
Probst
,
A.
Lauenstein
, and
K.
Hermansson
, “
Representation of intermolecular potential functions by neural networks
,”
J. Phys. Chem. A
102
,
4596
4605
(
1998
).
57.
S.
Lorenz
,
A.
Groß
, and
M.
Scheffler
, “
Representing high-dimensional potential-energy surfaces for reactions at surfaces by neural networks
,”
Chem. Phys. Lett.
395
,
210
215
(
2004
).
58.
J.
Behler
,
S.
Lorenz
, and
K.
Reuter
, “
Representing molecule-surface interactions with symmetry-adapted neural networks
,”
J. Chem. Phys.
127
,
014705
(
2007
).
59.
A.
Brown
,
B. J.
Braams
,
K.
Christoffel
,
Z.
Jin
, and
J. M.
Bowman
, “
Classical and quasiclassical spectral analysis of CH5+ using an ab initio potential energy surface
,”
J. Chem. Phys.
119
,
8790
8793
(
2003
).
60.
J.
Behler
and
M.
Parrinello
, “
Generalized neural-network representation of high-dimensional potential-energy surfaces
,”
Phys. Rev. Lett.
98
,
146401
(
2007
).
61.
J.
Behler
, “
First principles neural network potentials for reactive simulations of large molecular and condensed systems
,”
Angew. Chem., Int. Ed.
56
,
12828
(
2017
).
62.
J.
Behler
, “
Constructing high-dimensional neural network potentials: A tutorial review
,”
Int. J. Quantum Chem.
115
,
1032
1050
(
2015
).
63.
J.
Behler
, “
Representing potential energy surfaces by high-dimensional neural network potentials
,”
J. Phys.: Condens. Matter
26
(
18
),
183001
(
2014
).
64.
J.
Behler
, “
Atom-centered symmetry functions for constructing high-dimensional neural network potentials
,”
J. Chem. Phys.
134
,
074106
(
2011
).
65.
A. P.
Bartók
,
R.
Kondor
, and
G.
Csányi
, “
On representing chemical environments
,”
Phys. Rev. B
87
,
184115
(
2013
).
66.
W.
Pronobis
,
A.
Tkatchenko
, and
K.-R.
Müller
, “
Many-body descriptors for predicting molecular properties with machine learning: Analysis of pairwise and three-body interactions in molecules
,”
J. Chem. Theory Comput.
14
(
6
),
2991
3003
(
2018
).
67.
F.
Musil
,
A.
Grisafi
,
A. P.
Bartók
,
C.
Ortner
,
G.
Csányi
, and
M.
Ceriotti
, “
Physics-inspired structural representations for molecules and materials
,”
Chem. Rev.
121
(
16
),
9759
9815
(
2021
).
68.
R.
Drautz
, “
Atomic cluster expansion for accurate and transferable interatomic potentials
,”
Phys. Rev. B
99
,
014104
(
2019
).
69.
J. S.
Smith
,
O.
Isayev
, and
A. E.
Roitberg
, “
ANI-1: An extensible neural network potential with DFT accuracy at force field computational cost
,”
Chem. Sci.
8
,
3192
3203
(
2017
).
70.
L.
Zhang
,
J.
Han
,
H.
Wang
,
R.
Car
, and
W.
E
, “
Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics
,”
Phys. Rev. Lett.
120
,
143001
(
2018
).
71.
H.
Wang
,
L.
Zhang
,
J.
Han
, and
W.
E
, “
DeePMD-kit: A deep learning package for many-body potential energy representation and molecular dynamics
,”
Comput. Phys. Commun.
228
,
178
184
(
2018
).
72.
J.
Han
,
L.
Zhang
,
R.
Car
, and
W.
E
, “
Deep potential: A general representation of a many-body potential energy surface
,”
Commun. Comput. Phys.
23
,
629
639
(
2018
).
73.
Y.
Zhang
,
C.
Hu
, and
B.
Jiang
, “
Embedded atom neural network potentials: Efficient and accurate machine learning with a physically inspired representation
,”
J. Phys. Chem. Lett.
10
,
4962
4967
(
2019
).
74.
V.
Zaverkin
,
D.
Holzmüller
,
I.
Steinwart
, and
J.
Kästner
, “
Fast and sample-efficient interatomic neural network potentials for molecules and materials based on Gaussian moments
,”
J. Chem. Theory Comput.
17
,
6658
6670
(
2021
).
75.
J.
Gilmer
,
S. S.
Schoenholz
,
P. F.
Riley
,
O.
Vinyals
, and
G. E.
Dahl
, “
Neural message passing for quantum chemistry
,” in
Proceedings of the 34th International Conference on Machine Learning
, edited by
D.
Precup
and
Y. W.
Teh
(
PMLR
,
2017
), Vol.
70
, pp.
1263
1272
.
76.
K. T.
Schütt
,
F.
Arbabzadah
,
S.
Chmiela
,
K. R.
Müller
, and
A.
Tkatchenko
, “
Quantum-chemical insights from deep tensor neural networks
,”
Nat. Commun.
8
,
13890
(
2017
).
77.
K. T.
Schütt
,
H. E.
Sauceda
,
P.-J.
Kindermans
,
A.
Tkatchenko
, and
K. R.
Müller
, “
SchNet—A deep learning architecture for molecules and materials
,”
J. Chem. Phys.
148
,
241722
(
2018
).
78.
R.
Zubatyuk
,
J. S.
Smith
,
J.
Leszczynski
, and
O.
Isayev
, “
Accurate and transferable multitask prediction of chemical properties with an atoms-in-molecules neural network
,”
Sci. Adv.
5
,
eaav6490
(
2019
).
79.
K.
Schütt
,
O.
Unke
, and
M.
Gastegger
, “
Equivariant message passing for the prediction of tensorial properties and molecular spectra
,” in
Proceedings of the 38th International Conference on Machine Learning
, edited by
M.
Meila
and
T.
Zhang
(
PMLR
,
2021
), Vol.
139
, pp.
9377
9388
.
80.
S.
Batzner
,
A.
Musaelian
,
L.
Sun
,
M.
Geiger
,
J. P.
Mailoa
,
M.
Kornbluth
,
N.
Molinari
,
T. E.
Smidt
, and
B.
Kozinsky
, “
E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials
,”
Nat. Commun.
13
(
1
),
2453
(
2022
).
81.
A.
Musaelian
,
S.
Batzner
,
A.
Johansson
,
L.
Sun
,
C. J.
Owen
,
M.
Kornbluth
, and
B.
Kozinsky
, “
Learning local equivariant representations for large-scale atomistic dynamics
,”
Nat. Commun.
14
,
579
(
2023
).
82.
I.
Batatia
,
D. P.
Kovacs
,
G.
Simm
,
C.
Ortner
, and
G.
Csanyi
, “
Mace: Higher order equivariant message passing neural networks for fast and accurate force fields
,” in
Advances in Neural Information Processing Systems
, edited by
S.
Koyejo
,
S.
Mohamed
,
A.
Agarwal
,
D.
Belgrave
,
K.
Cho
, and
A.
Oh
(
Curran Associates, Inc.
,
2022
), Vol.
35
, pp.
11423
11436
.
83.
A. P.
Bartók
,
M. C.
Payne
,
R.
Kondor
, and
G.
Csányi
, “
Gaussian approximation potentials: The accuracy of quantum mechanics, without the electrons
,”
Phys. Rev. Lett.
104
,
136403
(
2010
).
84.
A. V.
Shapeev
, “
Moment tensor potentials: A class of systematically improvable interatomic potentials
,”
Multiscale Model. Simul.
14
,
1153
1173
(
2016
).
85.
A. P.
Thompson
,
L. P.
Swiler
,
C. R.
Trott
,
S. M.
Foiles
, and
G. J.
Tucker
, “
Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials
,”
J. Comput. Phys.
285
,
316
330
(
2015
).
86.
S.
Yue
,
M. C.
Muniz
,
M. F.
Calegari Andrade
,
L.
Zhang
,
R.
Car
, and
A. Z.
Panagiotopoulos
, “
When do short-range atomistic machine-learning models fall short?
,”
J. Chem. Phys.
154
,
034111
(
2021
).
87.
S.
Houlding
,
S. Y.
Liem
, and
P. L. A.
Popelier
, “
A polarizable high-rank quantum topological electrostatic potential developed using neural networks: Molecular dynamics simulations on the hydrogen fluoride dimer
,”
Int. J. Quantum Chem.
107
,
2817
2827
(
2007
).
88.
M. J. L.
Mills
and
P. L. A.
Popelier
, “
Polarisable multipolar electrostatics from the machine learning method kriging: An application to alanine
,”
Theor. Chem. Acc.
131
,
1137
(
2012
).
89.
C. M.
Handley
,
G. I.
Hawe
,
D. B.
Kell
, and
P. L. A.
Popelier
, “
Optimal construction of a fast and accurate polarisable water potential based on multipole moments trained by machine learning
,”
Phys. Chem. Chem. Phys.
11
,
6365
6376
(
2009
).
90.
N.
Artrith
,
T.
Morawietz
, and
J.
Behler
, “
High-dimensional neural-network potentials for multicomponent systems: Applications to zinc oxide
,”
Phys. Rev. B
83
,
153101
(
2011
).
91.
T.
Morawietz
,
V.
Sharma
, and
J.
Behler
, “
A neural network potential-energy surface for the water dimer based on environment-dependent atomic energies and charges
,”
J. Chem. Phys.
136
,
064103
(
2012
).
92.
K.
Yao
,
J. E.
Herr
,
D. W.
Toth
,
R.
Mckintyre
, and
J.
Parkhill
, “
The TensorMol-0.1 model chemistry: A neural network augmented with long-range physics
,”
Chem. Sci.
9
,
2261
2269
(
2018
).
93.
O. T.
Unke
and
M.
Meuwly
, “
PhysNet: A neural network for predicting energies, forces, dipole moments, and partial charges
,”
J. Chem. Theory Comput.
15
,
3678
3693
(
2019
).
94.
O. T.
Unke
,
S.
Chmiela
,
M.
Gastegger
,
K. T.
Schütt
,
H. E.
Sauceda
, and
K.-R.
Müller
, “
SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects
,”
Nat. Commun.
12
(
1
),
7273
(
2021
).
95.
T.
Bereau
,
D.
Andrienko
, and
O. A.
von Lilienfeld
, “
Transferable atomic multipole machine learning models for small organic molecules
,”
J. Chem. Theory Comput.
11
,
3225
3233
(
2015
).
96.
A. E.
Sifain
,
N.
Lubbers
,
B. T.
Nebgen
,
J. S.
Smith
,
A. Y.
Lokhov
,
O.
Isayev
,
A. E.
Roitberg
,
K.
Barros
, and
S.
Tretiak
, “
Discovering a transferable charge assignment model using machine learning
,”
J. Phys. Chem. Lett.
9
,
4495
4501
(
2018
).
97.
L.
Zhang
,
H.
Wang
,
M. C.
Muniz
,
A. Z.
Panagiotopoulos
,
R.
Car
, and
W.
E
, “
A deep potential model with long-range electrostatic interactions
,”
J. Chem. Phys.
156
(
12
),
124107
(
2022
).
98.
M.
Gastegger
,
J.
Behler
, and
P.
Marquetand
, “
Machine learning molecular dynamics for the simulation of infrared spectra
,”
Chem. Sci.
8
,
6924
6935
(
2017
).
99.
S. A.
Ghasemi
,
A.
Hofstetter
,
S.
Saha
, and
S.
Goedecker
, “
Interatomic potentials for ionic systems with density functional accuracy based on charge densities obtained by a neural network
,”
Phys. Rev. B
92
,
045131
(
2015
).
100.
A. K.
Rappe
and
W. A.
Goddard
III
, “
Charge equilibration for molecular dynamics simulations
,”
J. Phys. Chem.
95
,
3358
3363
(
1991
).
101.
X.
Xie
,
K. A.
Persson
, and
D. W.
Small
, “
Incorporating electronic information into machine learning potential energy surfaces via approaching the ground-state electronic energy as a function of atom-based electronic populations
,”
J. Chem. Theory Comput.
16
,
4256
4270
(
2020
).
102.
T. W.
Ko
,
J. A.
Finkler
,
S.
Goedecker
, and
J.
Behler
, “
A fourth-generation high-dimensional neural network potential with accurate electrostatics including non-local charge transfer
,”
Nat. Commun.
12
,
398
(
2021
).
103.
L. D.
Jacobson
,
J. M.
Stevenson
,
F.
Ramezanghorbani
,
D.
Ghoreishi
,
K.
Leswing
,
E. D.
Harder
, and
R.
Abel
, “
Transferable neural network potential energy surfaces for closed-shell organic molecules: Extension to ions
,”
J. Chem. Theory Comput.
18
,
2354
2366
(
2022
).
104.
C.
Schran
,
J.
Behler
, and
D.
Marx
, “
Automated fitting of neural network potentials at coupled cluster accuracy: Protonated water clusters as testing ground
,”
J. Chem. Theory Comput.
16
,
88
99
(
2020
).
105.
J. S.
Smith
,
B.
Nebgen
,
N.
Lubbers
,
O.
Isayev
, and
A. E.
Roitberg
, “
Less is more: Sampling chemical space with active learning
,”
J. Chem. Phys.
148
,
241733
(
2018
).
106.
L.
Zhang
,
D.-Y.
Lin
,
H.
Wang
,
R.
Car
, and
W.
E
, “
Active learning of uniformly accurate interatomic potentials for materials simulation
,”
Phys. Rev. Mater.
3
,
023804
(
2019
).
107.
C.
Schran
,
F. L.
Thiemann
,
P.
Rowe
,
E. A.
Müller
,
O.
Marsalek
, and
A.
Michaelides
, “
Machine learning potentials for complex aqueous systems made simple
,”
Proc. Natl. Acad. Sci. U. S. A.
118
(
38
),
e2110077118
(
2021
).
108.
K.
Tai No
,
B.
Ha Chang
,
S.
Yeon Kim
,
M.
Shik Jhon
, and
H. A.
Scheraga
, “
Description of the potential energy surface of the water dimer with an artificial neural network
,”
Chem. Phys. Lett.
271
,
152
156
(
1997
).
109.
S.
Manzhos
,
X.
Wang
,
R.
Dawes
, and
T.
Carrington
, Jr.
, “
A nested molecule-independent neural network approach for high-quality potential fits
,”
J. Phys. Chem. A
110
,
5295
5304
(
2006
).
110.
H. T. T.
Nguyen
and
H. M.
Le
, “
Modified feed-forward neural network structures and combined-function-derivative approximations incorporating exchange symmetry for potential energy surface fitting
,”
J. Phys. Chem. A
116
,
4629
4638
(
2012
).
111.
X.
Huang
,
B. J.
Braams
, and
J. M.
Bowman
, “
Ab initio potential energy and dipole moment surfaces for H5O2+
,”
J. Chem. Phys.
122
,
044308
(
2005
).
112.
X.
Huang
,
B. J.
Braams
, and
J. M.
Bowman
, “
Ab initio potential energy and dipole moment surfaces of (H2O)2
,”
J. Phys. Chem. A
110
,
445
451
(
2006
).
113.
X.
Huang
,
B. J.
Braams
,
J. M.
Bowman
et al, “
New ab initio potential energy surface and the vibration-rotation-tunneling levels of (H2O)2 and (D2O)2
,”
J. Chem. Phys.
128
,
034312
(
2008
).
114.
Y.
Wang
,
B. C.
Shepler
,
B. J.
Braams
, and
J. M.
Bowman
, “
Full-dimensional, ab initio potential energy and dipole moment surfaces for water
,”
J. Chem. Phys.
131
,
054511
(
2009
).
115.
A.
Shank
,
Y.
Wang
,
A.
Kaledin
,
B. J.
Braams
, and
J. M.
Bowman
, “
Accurate ab initio and ‘hybrid’ potential energy surfaces, intramolecular vibrational energies, and classical ir spectrum of the water dimer
,”
J. Chem. Phys.
130
,
144314
(
2009
).
116.
J. M.
Bowman
,
B. J.
Braams
,
S.
Carter
,
C.
Chen
,
G.
Czakó
,
B.
Fu
,
X.
Huang
,
E.
Kamarchik
,
A. R.
Sharma
,
B. C.
Shepler
,
Y.
Wang
, and
Z.
Xie
, “
Ab-initio-based potential energy surfaces for complex molecules and molecular complexes
,”
J. Phys. Chem. Lett.
1
,
1866
1874
(
2010
).
117.
Y.
Wang
and
J. M.
Bowman
, “
Towards an ab initio flexible potential for water, and post-harmonic quantum vibrational analysis of water clusters
,”
Chem. Phys. Lett.
491
,
1
10
(
2010
).
118.
Z.
Xie
and
J. M.
Bowman
, “
Permutationally invariant polynomial basis for molecular energy surface fitting via monomial symmetrization
,”
J. Chem. Theory Comput.
6
,
26
(
2010
).
119.
Y.
Wang
,
X.
Huang
,
B. C.
Shepler
,
B. J.
Braams
, and
J. M.
Bowman
, “
Flexible, ab initio potential, and dipole moment surfaces for water. I. Tests and applications for clusters up to the 22-mer
,”
J. Chem. Phys.
134
,
094509
(
2011
).
120.
Q.
Yu
and
J. M.
Bowman
, “
Classical, thermostated ring polymer, and quantum VSCF/VCI calculations of IR spectra of H7O3+ and H9O4+ (eigen) and comparison with experiment
,”
J. Phys. Chem. A
123
,
1399
1409
(
2019
).
121.
Q.
Yu
,
C.
Qu
,
P. L.
Houston
,
R.
Conte
,
A.
Nandi
, and
J. M.
Bowman
, “
q-AQUA: A many-body CCSD(T) water potential, including four-body interactions, demonstrates the quantum nature of water from clusters to the liquid phase
,”
J. Phys. Chem. Lett.
13
,
5068
5074
(
2022
).
122.
C.
Qu
,
Q.
Yu
,
R.
Conte
,
P. L.
Houston
,
A.
Nandi
, and
J. M.
Bomwan
, “
A Δ-machine learning approach for force fields, illustrated by a CCSD(T) 4-body correction to the MB-pol water potential
,”
Digital Discovery
1
(
5
),
658
664
(
2022
).
123.
T.
Morawietz
and
J.
Behler
, “
A density-functional theory-based neural network potential for water clusters including van der Waals corrections
,”
J. Phys. Chem. A
117
,
7356
(
2013
).
124.
T.
Morawietz
and
J.
Behler
, “
A full-dimensional neural network potential-energy surface for water clusters up to the hexamer
,”
Z. Phys. Chem.
227
,
1559
1581
(
2013
).
125.
S.
Kondati Natarajan
,
T.
Morawietz
, and
J.
Behler
, “
Representing the potential-energy surface of protonated water clusters by high-dimensional neural network potentials
,”
Phys. Chem. Chem. Phys.
17
,
8356
(
2015
).
126.
H.
Zhou
,
Y.-J.
Feng
,
C.
Wang
,
T.
Huang
,
Y.-R.
Liu
,
S.
Jiang
,
C.-Y.
Wang
, and
W.
Huang
, “
A high-accuracy machine-learning water model for exploring water nanocluster structures
,”
Nanoscale
13
(
28
),
12212
12222
(
2021
).
127.
C.
Schran
,
F.
Brieuc
, and
D.
Marx
, “
Converged colored noise path integral molecular dynamics study of the Zundel cation down to ultralow temperatures at coupled cluster accuracy
,”
J. Chem. Theory Comput.
14
,
5068
5078
(
2018
).
128.
C.
Schran
and
D.
Marx
, “
Quantum nature of the hydrogen bond from ambient conditions down to ultra-low temperatures
,”
Phys. Chem. Chem. Phys.
21
,
24967
24975
(
2019
).
129.
C.
Schran
,
F.
Brieuc
, and
D.
Marx
, “
Transferability of machine learning potentials: Protonated water neural network potential applied to the protonated water hexamer
,”
J. Chem. Phys.
154
,
051101
(
2021
).
130.
R.
Beckmann
,
F.
Brieuc
,
C.
Schran
, and
D.
Marx
, “
Infrared spectra at coupled cluster accuracy from neural network representations
,”
J. Chem. Theory Comput.
18
(
9
),
5492
5501
(
2022
).
131.
C.
Schran
,
F.
Uhl
,
J.
Behler
, and
D.
Marx
, “
High-dimensional neural network potentials for solvation: The case of protonated water clusters in helium
,”
J. Chem. Phys.
148
,
102310
(
2017
).
132.
C. M.
Handley
and
P. L. A.
Popelier
, “
Dynamically polarizable water potential based on multipole moments trained by machine learning
,”
J. Chem. Theory Comput.
5
,
1474
1489
(
2009
).
133.
S. J.
Davie
,
N.
Di Pasquale
, and
P. L.
Popelier
, “
Incorporation of local structure into kriging models for the prediction of atomistic properties in the water decamer
,”
J. Comput. Chem.
37
(
27
),
2409
2422
(
2016
).
134.
Z. E.
Hughes
,
E.
Ren
,
J. C.
Thacker
,
B. C.
Symons
,
A. F.
Silva
, and
P. L.
Popelier
, “
A FFLUX water model: Flexible, polarizable and with a multipolar description of electrostatics
,”
J. Comput. Chem.
41
(
7
),
619
628
(
2020
).
135.
A.
Grisafi
,
D. M.
Wilkins
,
G.
Csányi
, and
M.
Ceriotti
, “
Symmetry-adapted machine learning for tensorial properties of atomistic systems
,”
Phys. Rev. Lett.
120
(
3
),
036002
(
2018
).
136.
V. H. A.
Nguyen
and
A.
Lunghi
, “
Predicting tensorial molecular properties with equivariant machine learning models
,”
Phys. Rev. B
105
(
16
),
165131
(
2022
).
137.
J.
Schuhmacher
,
G.
Mazzola
,
F.
Tacchino
,
O.
Dmitriyeva
,
T.
Bui
,
S.
Huang
, and
I.
Tavernelli
, “
Extending the reach of quantum computing for materials science with machine learning potentials
,”
AIP Adv.
12
(
11
),
115321
(
2022
).
138.
S.
Bose
,
D.
Dhawan
,
S.
Nandi
,
R. R.
Sarkar
, and
D.
Ghosh
, “
Machine learning prediction of interaction energies in rigid water clusters
,”
Phys. Chem. Chem. Phys.
20
(
35
),
22987
22996
(
2018
).
139.
A. F.
Silva
,
L. J.
Duarte
, and
P. L.
Popelier
, “
Contributions of IQA electron correlation in understanding the chemical bond and non-covalent interactions
,”
Struct. Chem.
31
(
2
),
507
519
(
2020
).
140.
L.
Cheng
,
J.
Sun
,
J. E.
Deustua
,
V. C.
Bhethanabotla
, and
T. F.
Miller
, “
Molecular-orbital-based machine learning for open-shell and multi-reference systems with kernel addition Gaussian process regression
,”
J. Chem. Phys.
157
(
15
),
154105
(
2022
).
141.
L.
Cheng
,
M.
Welborn
,
A. S.
Christensen
, and
T. F.
Miller
, “
A universal density matrix functional from molecular orbital-based machine learning: Transferability across organic molecules
,”
J. Chem. Phys.
150
(
13
),
131103
(
2019
).
142.
J. P.
Coe
, “
Machine learning configuration interaction
,”
J. Chem. Theory Comput.
14
(
11
),
5739
5749
(
2018
).
143.
J. P.
Coe
, “
Machine learning configuration interaction for ab initio potential energy curves
,”
J. Chem. Theory Comput.
15
(
11
),
6179
6189
(
2019
).
144.
F.
Lu
,
L.
Cheng
,
R. J.
DiRisio
,
J. M.
Finney
,
M. A.
Boyer
,
P.
Moonkaen
,
J.
Sun
,
S. J.
Lee
,
J. E.
Deustua
,
T. F.
Miller
III
, and
A. B.
McCoy
, “
Fast near ab initio potential energy surfaces using machine learning
,”
J. Phys. Chem. A
126
(
25
),
4013
4024
(
2022
).
145.
M.
Welborn
,
L.
Cheng
, and
T. F.
Miller
III
, “
Transferability in machine learning for electronic structure via the molecular orbital basis
,”
J. Chem. Theory Comput.
14
(
9
),
4772
4779
(
2018
).
146.
R. J.
DiRisio
,
F.
Lu
, and
A. B.
McCoy
, “
GPU-accelerated neural network potential energy surfaces for diffusion Monte Carlo
,”
J. Phys. Chem. A
125
(
26
),
5849
5859
(
2021
).
147.
K. H.
Cho
,
K. T.
No
, and
H. A.
Scheraga
, “
A polarizable force field for water using an artificial neural network
,”
J. Mol. Struct.
641
,
77
91
(
2002
).
148.
W. L.
Jorgensen
,
J.
Chandrasekhar
,
J. D.
Madura
,
R. W.
Impey
, and
M. L.
Klein
, “
Comparison of simple potential functions for simulating liquid water
,”
J. Chem. Phys.
79
,
926
935
(
1983
).
149.
A. P.
Bartók
,
M. J.
Gillan
,
F. R.
Manby
, and
G.
Csányi
, “
Machine-learning approach for one- and two-body corrections to density functional theory: Applications to molecular and condensed water
,”
Phys. Rev. B
88
,
054104
(
2013
).
150.
D.
Alfè
,
A. P.
Bartók
,
G.
Csányi
, and
M. J.
Gillan
, “
Communication: Energy benchmarking with quantum Monte Carlo for water nano-droplets and bulk liquid water
,”
J. Chem. Phys.
138
,
221102
(
2013
).
151.
T.
Morawietz
,
A.
Singraber
,
C.
Dellago
, and
J.
Behler
, “
How van der Waals interactions determine the unique properties of water
,”
Proc. Natl. Acad. Sci. U. S. A.
113
,
8368
(
2016
).
152.
A.
Singraber
,
T.
Morawietz
,
J.
Behler
, and
C.
Dellago
, “
Density anomaly of water at negative pressures from first principles
,”
J. Phys.: Condens. Matter
30
,
254005
(
2018
).
153.
P.
Montero de Hijes
,
S.
Romano
,
A.
Gorfer
, and
C.
Dellago
, “
The kinetics of the ice–water interface from ab initio machine learning simulations
,”
J. Chem. Phys.
158
(
20
),
204706
(
2023
).
154.
B.
Cheng
,
J.
Behler
, and
M.
Ceriotti
, “
Nuclear quantum effects in water at the triple point: Using theory as a link between experiments
,”
J. Phys. Chem. Lett.
7
,
2210
2215
(
2016
).
155.
V.
Kapil
,
J.
Behler
, and
M.
Ceriotti
, “
High order path integrals made easy
,”
J. Chem. Phys.
145
,
234103
(
2016
).
156. V. Kapil, D. M. Wilkins, J. Lan, and M. Ceriotti, “Inexpensive modeling of quantum dynamics using path integral generalized Langevin equation thermostats,” J. Chem. Phys. 152, 124104 (2020).
157. Y. Yao and Y. Kanai, “Temperature dependence of nuclear quantum effects on liquid water via artificial neural network model based on SCAN meta-GGA functional,” J. Chem. Phys. 153 (4), 044114 (2020).
158. C. Li and G. A. Voth, “Using machine learning to greatly accelerate path integral ab initio molecular dynamics,” J. Chem. Theory Comput. 18, 599–604 (2022).
159. B. Cheng, E. A. Engel, J. Behler, C. Dellago, and M. Ceriotti, “Ab initio thermodynamics of liquid and solid water,” Proc. Natl. Acad. Sci. U. S. A. 116, 1110–1115 (2019).
160. Y. Yao and Y. Kanai, “Nuclear quantum effect and its temperature dependence in liquid water from random phase approximation via artificial neural network,” J. Phys. Chem. Lett. 12 (27), 6354–6362 (2021).
161. T. Morawietz, O. Marsalek, S. R. Pattenaude, L. M. Streacker, D. Ben-Amotz, and T. E. Markland, “The interplay of structure and dynamics in the Raman spectrum of liquid water over the full frequency and temperature range,” J. Phys. Chem. Lett. 9, 851–857 (2018).
162. T. Morawietz, A. S. Urbina, P. K. Wise, X. Wu, W. Lu, D. Ben-Amotz, and T. E. Markland, “Hiding in the crowd: Spectral signatures of overcoordinated hydrogen-bond environments,” J. Phys. Chem. Lett. 10, 6067–6073 (2019).
163. Y. Zhang, S. Ye, J. Zhang, C. Hu, J. Jiang, and B. Jiang, “Efficient and accurate simulations of vibrational and electronic spectra with symmetry-preserving neural network models for tensorial properties,” J. Phys. Chem. B 124 (33), 7284–7290 (2020).
164. L. Zhang, H. Wang, and W. E, “Adaptive coupling of a deep neural network potential to a classical force field,” J. Chem. Phys. 149, 154107 (2018).
165. L. Zhang, J. Han, H. Wang, R. Car, and W. E, “DeePCG: Constructing coarse-grained models via deep neural networks,” J. Chem. Phys. 149, 034101 (2018).
166. C. Andreani, G. Romanelli, A. Parmentier, R. Senesi, A. I. Kolesnikov, H.-Y. Ko, M. F. Calegari Andrade, and R. Car, “Hydrogen dynamics in supercritical water probed by neutron scattering and computer simulations,” J. Phys. Chem. Lett. 11, 9461–9467 (2020).
167. H.-Y. Ko, L. Zhang, B. Santra, H. Wang, W. E, R. A. DiStasio, Jr., and R. Car, “Isotope effects in liquid water via deep potential molecular dynamics,” Mol. Phys. 117 (22), 3269–3281 (2019).
168. J. Xu, C. Zhang, L. Zhang, M. Chen, B. Santra, and X. Wu, “Isotope effects in molecular structures and electronic properties of liquid water via deep potential molecular dynamics based on the SCAN functional,” Phys. Rev. B 102, 214113 (2020).
169. J. Liu, J. Lan, and X. He, “Toward high-level machine learning potential for water based on quantum fragmentation and neural networks,” J. Phys. Chem. A 126 (24), 3926–3936 (2022).
170. G. M. Sommers, M. F. Calegari Andrade, L. Zhang, H. Wang, and R. Car, “Raman spectrum and polarizability of liquid water from deep neural networks,” Phys. Chem. Chem. Phys. 22, 10592–10602 (2020).
171. L. Zhang, M. Chen, X. Wu, H. Wang, W. E, and R. Car, “Deep neural network for the dielectric response of insulators,” Phys. Rev. B 102, 041121 (2020).
172. C. Zhang, F. Tang, M. Chen, J. Xu, L. Zhang, D. Y. Qiu, J. P. Perdew, M. L. Klein, and X. Wu, “Modeling liquid water by climbing up Jacob’s ladder in density functional theory facilitated by using deep neural network potentials,” J. Phys. Chem. B 125, 11444–11456 (2021).
173. A. Krishnamoorthy, K.-i. Nomura, N. Baradwaj, K. Shimamura, P. Rajak, A. Mishra, S. Fukushima, F. Shimojo, R. Kalia, A. Nakano, and P. Vashishta, “Dielectric constant of liquid water determined with neural network quantum molecular dynamics,” Phys. Rev. Lett. 126, 216403 (2021).
174. Y. Shi, C. C. Doyle, and T. L. Beck, “Condensed phase water molecular multipole moments from deep neural network models trained on ab initio simulation data,” J. Phys. Chem. Lett. 12, 10310–10317 (2021).
175. P. Schienbein, “Spectroscopy from machine learning by accurately representing the atomic polar tensor,” J. Chem. Theory Comput. 19 (3), 705–712 (2023).
176. K. Inoue, Y. Litman, D. M. Wilkins, Y. Nagata, and M. Okuno, “Is unified understanding of vibrational coupling of water possible? Hyper-Raman measurement and machine learning spectra,” J. Phys. Chem. Lett. 14 (12), 3063–3068 (2023).
177. P. M. Piaggi, A. Z. Panagiotopoulos, P. G. Debenedetti, and R. Car, “Phase equilibrium of water with hexagonal and cubic ice using the SCAN functional,” J. Chem. Theory Comput. 17, 3065–3077 (2021).
178. L. Zhang, H. Wang, R. Car, and W. E, “Phase diagram of a deep potential water model,” Phys. Rev. Lett. 126, 236001 (2021).
179. A. Reinhardt and B. Cheng, “Quantum-mechanical exploration of the phase diagram of water,” Nat. Commun. 12 (1), 588 (2021).
180. B. Cheng, M. Bethkenhagen, C. J. Pickard, and S. Hamel, “Phase behaviours of superionic water at planetary conditions,” Nat. Phys. 17 (11), 1228–1232 (2021).
181. D. Tisi, L. Zhang, R. Bertossa, H. Wang, R. Car, and S. Baroni, “Heat transport in liquid water from first-principles and deep neural network simulations,” Phys. Rev. B 104, 224202 (2021).
182. K. Xu, Y. Hao, T. Liang, P. Ying, J. Xu, J. Wu, and Z. Fan, “Accurate prediction of heat conductivity of water by a neuroevolution potential,” J. Chem. Phys. 158 (20), 204114 (2023).
183. C. Malosso, L. Zhang, R. Car, S. Baroni, and D. Tisi, “Viscosity in water from first-principles and deep-neural-network simulations,” npj Comput. Mater. 8 (1), 139 (2022).
184. Z. Chen, M. L. Berrens, K.-T. Chan, Z. Fan, and D. Donadio, “Thermodynamics of water and ice from a fast and scalable first-principles neuroevolution potential,” J. Chem. Eng. Data 69, 128 (2023).
185. D. Lu, H. Wang, M. Chen, L. Lin, R. Car, W. E, W. Jia, and L. Zhang, “86 PFLOPS deep potential molecular dynamics simulation of 100 million atoms with ab initio accuracy,” Comput. Phys. Commun. 259, 107624 (2021).
186. X.-M. Bai and M. Li, “Test of classical nucleation theory via molecular-dynamics simulation,” J. Chem. Phys. 122 (22), 224510 (2005).
187. E. Sanz, C. Vega, J. Espinosa, R. Caballero-Bernal, J. Abascal, and C. Valeriani, “Homogeneous ice nucleation at moderate supercooling from molecular simulation,” J. Am. Chem. Soc. 135 (40), 15008–15017 (2013).
188. P. M. Piaggi, J. Weis, A. Z. Panagiotopoulos, P. G. Debenedetti, and R. Car, “Homogeneous ice nucleation in an ab initio machine-learning model of water,” Proc. Natl. Acad. Sci. U. S. A. 119 (33), e2207294119 (2022).
189. M. Chen, L. Tan, H. Wang, L. Zhang, and H. Niu, “Imperfectly coordinated water molecules pave the way for homogeneous ice nucleation,” arXiv:2304.12665 (2023).
190. T. E. Gartner, L. Zhang, P. M. Piaggi, R. Car, A. Z. Panagiotopoulos, and P. G. Debenedetti, “Signatures of a liquid–liquid transition in an ab initio deep neural network model for water,” Proc. Natl. Acad. Sci. U. S. A. 117 (42), 26040–26046 (2020).
191. T. E. Gartner III, P. M. Piaggi, R. Car, A. Z. Panagiotopoulos, and P. G. Debenedetti, “Liquid-liquid transition in water from first principles,” Phys. Rev. Lett. 129 (25), 255702 (2022).
192. P. M. Piaggi, T. E. Gartner, R. Car, and P. G. Debenedetti, “Melting curves of ice polymorphs in the vicinity of the liquid–liquid critical point,” J. Chem. Phys. 159, 054502 (2023).
193. P. M. Piaggi and R. Car, “Enhancing the formation of ionic defects to study the ice Ih/XI transition with molecular dynamics simulations,” Mol. Phys. 119 (19–20), e1916634 (2021).
194. A. O. Atsango, T. Morawietz, O. Marsalek, and T. E. Markland, “Developing machine-learned potentials to simultaneously capture the dynamics of excess protons and hydroxide ions in classical and path integral simulations,” J. Chem. Phys. 159 (7), 074101 (2023).
195. L. Liu, Y. Tian, X. Yang, and C. Liu, “Mechanistic insights into water autoionization through metadynamics simulation enhanced by machine learning,” Phys. Rev. Lett. 131 (15), 158001 (2023).
196. M. Calegari Andrade, R. Car, and A. Selloni, “Probing the self-ionization of liquid water with ab initio deep potential molecular dynamics,” Proc. Natl. Acad. Sci. U. S. A. 120 (46), e2302468120 (2023).
197. P. Geiger and C. Dellago, “Neural networks for local structure detection in polymorphic systems,” J. Chem. Phys. 139, 164105 (2013).
198. M. Fulford, M. Salvalaglio, and C. Molteni, “DeepIce: A deep neural network approach to identify ice and water molecules,” J. Chem. Inf. Model. 59, 2141–2149 (2019).
199. J. Huang, G. Huang, and S. Li, “A machine learning model to classify dynamic processes in liquid water,” ChemPhysChem 23 (1), e202100599 (2022).
200. B. Monserrat, J. G. Brandenburg, E. A. Engel, and B. Cheng, “Liquid water contains the building blocks of diverse ice phases,” Nat. Commun. 11 (1), 5757 (2020).
201. F. Guidarelli Mattioli, F. Sciortino, and J. Russo, “Are neural network potentials trained on liquid states transferable to crystal nucleation? A test on ice nucleation in the mW water model,” J. Phys. Chem. B 127 (17), 3894–3901 (2023).
202. A. Torres, L. S. Pedroza, M. Fernandez-Serra, and A. R. Rocha, “Using neural network force fields to ascertain the quality of ab initio simulations of liquid water,” J. Phys. Chem. B 125, 10772–10778 (2021).
203. P. Montero de Hijes, C. Dellago, R. Jinnouchi, B. Schmiedmayer, and G. Kresse, “Comparing machine learning potentials for water: Kernel-based regression and Behler-Parrinello neural networks,” J. Chem. Phys. 160, 114107 (2024).
204. M. S. Gomes-Filho, A. Torres, A. Reily Rocha, and L. S. Pedroza, “Size and quality of quantum mechanical data set for training neural network force fields for liquid water,” J. Phys. Chem. B 127 (6), 1422–1428 (2023).
205. Z. Li, K. Meidani, P. Yadav, and A. Barati Farimani, “Graph neural networks accelerated molecular dynamics,” J. Chem. Phys. 156 (14), 144103 (2022).
206. G. Imbalzano, A. Anelli, D. Giofre, S. Klees, J. Behler, and M. Ceriotti, “Automatic selection of atomic fingerprints and reference configurations for machine-learning potentials,” J. Chem. Phys. 148, 241730 (2018).
207. F. Guidarelli Mattioli, F. Sciortino, and J. Russo, “A neural network potential with self-trained atomic fingerprints: A test with the mW water potential,” J. Chem. Phys. 158 (10), 104501 (2023).
208. T. A. Young, T. Johnston-Wood, V. L. Deringer, and F. Duarte, “A transferable active-learning strategy for reactive molecular force fields,” Chem. Sci. 12 (32), 10944–10955 (2021).
209. T. K. Patra, T. D. Loeffler, H. Chan, M. J. Cherukara, B. Narayanan, and S. K. R. S. Sankaranarayanan, “A coarse-grained deep neural network model for liquid water,” Appl. Phys. Lett. 115, 193101 (2019).
210. T. D. Loeffler, T. K. Patra, H. Chan, and S. K. Sankaranarayanan, “Active learning a coarse-grained neural network model for bulk water from sparse training data,” Mol. Syst. Des. Eng. 5 (5), 902–910 (2020).
211. C. Scherer, R. Scheid, D. Andrienko, and T. Bereau, “Kernel-based machine learning for efficient simulations of molecular liquids,” J. Chem. Theory Comput. 16 (5), 3194–3204 (2020).
212. S. Thaler and J. Zavadlav, “Learning neural network potentials from experimental data via differentiable trajectory reweighting,” Nat. Commun. 12 (1), 6884 (2021).
213. F. Musil, I. Zaporozhets, F. Noé, C. Clementi, and V. Kapil, “Quantum dynamics using path integral coarse-graining,” J. Chem. Phys. 157 (18), 181102 (2022).
214. T. D. Loose, P. G. Sahrmann, T. S. Qu, and G. A. Voth, “Coarse-graining with equivariant neural networks: A path toward accurate and data-efficient models,” J. Phys. Chem. B 127, 10564 (2023).
215. H. Chan, M. J. Cherukara, B. Narayanan, T. D. Loeffler, C. Benmore, S. K. Gray, and S. K. R. S. Sankaranarayanan, “Machine learning coarse grained models for water,” Nat. Commun. 10 (1), 379 (2019).
216. T. D. Loeffler, H. Chan, K. Sasikumar, B. Narayanan, M. J. Cherukara, S. Gray, and S. K. R. S. Sankaranarayanan, “Teaching an old dog new tricks: Machine learning an improved TIP3P potential model for liquid–vapor phase phenomena,” J. Phys. Chem. C 123 (36), 22643–22655 (2019).
217. H.-f. Ye, J. Wang, Y.-g. Zheng, H.-w. Zhang, and Z. Chen, “Machine learning for reparameterization of four-site water models: TIP4P-BG and TIP4P-BGT,” Phys. Chem. Chem. Phys. 23 (17), 10164–10173 (2021).
218. J. Wang, Y. Zheng, H. Zhang, and H. Ye, “Machine learning-generated TIP4P-BGWT model for liquid and supercooled water,” J. Mol. Liq. 367, 120459 (2022).
219. B. Han, C. M. Isborn, and L. Shi, “Incorporating polarization and charge transfer into a point-charge model for water using machine learning,” J. Phys. Chem. Lett. 14 (16), 3869–3877 (2023).
220. Y. Zhai, A. Caruso, S. L. Bore, Z. Luo, and F. Paesani, “A ‘short blanket’ dilemma for a state-of-the-art neural network potential for water: Reproducing experimental properties or the physics of the underlying many-body interactions?,” J. Chem. Phys. 158, 084111 (2023).
221. M. C. Muniz, R. Car, and A. Z. Panagiotopoulos, “Neural network water model based on the MB-pol many-body potential,” J. Phys. Chem. B 127, 9165 (2023).
222. S. Thaler, G. Doehner, and J. Zavadlav, “Scalable Bayesian uncertainty quantification for neural network potentials: Promise and pitfalls,” J. Chem. Theory Comput. 19, 4520 (2023).
223. T. T. Nguyen, E. Székely, G. Imbalzano, J. Behler, G. Csányi, M. Ceriotti, A. W. Götz, and F. Paesani, “Comparison of permutationally invariant polynomials, neural networks, and Gaussian approximation potentials in representing water interactions through many-body expansions,” J. Chem. Phys. 148, 241725 (2018).
224. H. Wang and W. Yang, “Force field for water based on neural network,” J. Phys. Chem. Lett. 9, 3232–3240 (2018).
225. L. Yang, J. Li, F. Chen, and K. Yu, “A transferrable range-separated force field for water: Combining the power of both physically-motivated models and machine learning techniques,” J. Chem. Phys. 157 (21), 214108 (2022).
226. B. C. Symons and P. L. Popelier, “Application of quantum chemical topology force field FFLUX to condensed matter simulations: Liquid water,” J. Chem. Theory Comput. 18 (9), 5577–5588 (2022).
227. A. Konovalov, B. C. Symons, and P. L. Popelier, “On the many-body nature of intramolecular forces in FFLUX and its implications,” J. Comput. Chem. 42 (2), 107–116 (2021).
228. V. Zaverkin, D. Holzmüller, R. Schuldt, and J. Kästner, “Predicting properties of periodic systems from cluster data: A case study of liquid water,” J. Chem. Phys. 156, 114103 (2022).
229. J. Daru, H. Forbert, J. Behler, and D. Marx, “Coupled cluster molecular dynamics of condensed phase systems enabled by machine learning potentials: Liquid water benchmark,” Phys. Rev. Lett. 129, 226001 (2022).
230. M. S. Chen, J. Lee, H.-Z. Ye, T. C. Berkelbach, D. R. Reichman, and T. E. Markland, “Data-efficient machine learning potentials from transfer learning of periodic correlated electronic structure methods: Liquid water at AFQMC, CCSD, and CCSD(T) accuracy,” J. Chem. Theory Comput. 19, 4510 (2023).
231. A. Grisafi and M. Ceriotti, “Incorporating long-range physics in atomic-scale machine learning,” J. Chem. Phys. 151 (20), 204105 (2019).
232. A. Gao and R. C. Remsing, “Self-consistent determination of long-range electrostatics in neural network potentials,” Nat. Commun. 13 (1), 1572 (2022).
233. H. S. Dhattarwal, A. Gao, and R. C. Remsing, “Dielectric saturation in water from a long-range machine learning model,” J. Phys. Chem. B 127 (16), 3663–3671 (2023).
234. T. Plé, L. Lagardère, and J.-P. Piquemal, “Force-field-enhanced neural network interactions: from local equivariant embedding to atom-in-molecule properties and long-range effects,” Chem. Sci. 14, 12554–12569 (2023).
235. D. P. Kovacs, I. Batatia, E. S. Arany, and G. Csanyi, “Evaluation of the MACE force field architecture: From medicinal chemistry to materials science,” J. Chem. Phys. 159, 044118 (2023).
236. D. P. Kovács, J. H. Moore, N. J. Browning, I. Batatia, J. T. Horton, V. Kapil, I.-B. Magdău, D. J. Cole, and G. Csányi, “MACE-OFF23: Transferable machine learning force fields for organic molecules,” arXiv:2312.15211 (2023).
237. I. Batatia, P. Benner, Y. Chiang, A. M. Elena, D. P. Kovács, J. Riebesell, X. R. Advincula, M. Asta, W. J. Baldwin, N. Bernstein et al., “A foundation model for atomistic materials chemistry,” arXiv:2401.00096 (2023).
238. X. Fu, Z. Wu, W. Wang, T. Xie, S. Keten, R. Gomez-Bombarelli, and T. Jaakkola, “Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations,” arXiv:2210.07237 (2022).
239. O. Wohlfahrt, C. Dellago, and M. Sega, “Ab initio structure and thermodynamics of the RPBE-D3 water/vapor interface by neural-network molecular dynamics,” J. Chem. Phys. 153 (14), 144710 (2020).
240. E. D. Donkor, A. Laio, and A. Hassanali, “Do machine-learning atomic descriptors and order parameters tell the same story? The case of liquid water,” J. Chem. Theory Comput. 19, 4596 (2023).
241. M. Bonn, Y. Nagata, and E. H. Backus, “Molecular structure and dynamics of water at the water–air interface studied with surface-specific vibrational spectroscopy,” Angew. Chem., Int. Ed. 54 (19), 5560–5576 (2015).
242. T. Ohto, K. Usui, T. Hasegawa, M. Bonn, and Y. Nagata, “Toward ab initio molecular dynamics modeling for sum-frequency generation spectra; an efficient algorithm based on surface-specific velocity-velocity correlation function,” J. Chem. Phys. 143 (12), 124702 (2015).
243. S. Shepherd, J. Lan, D. M. Wilkins, and V. Kapil, “Efficient quantum vibrational spectroscopy of water with high-order path integrals: From bulk to interfaces,” J. Phys. Chem. Lett. 12, 9108–9114 (2021).
244. Y. Litman, J. Lan, Y. Nagata, and D. M. Wilkins, “Fully first-principles surface spectroscopy with machine learning,” J. Phys. Chem. Lett. 14 (36), 8175–8182 (2023).
245. S. P. Niblett, M. Galib, and D. T. Limmer, “Learning intermolecular forces at liquid–vapor interfaces,” J. Chem. Phys. 155 (16), 164101 (2021).
246. M. Sega and C. Dellago, “Long-range dispersion effects on the water/vapor interface simulated using the most common models,” J. Phys. Chem. B 121 (15), 3798 (2017).
247. I. Sanchez-Burgos, M. C. Muniz, J. R. Espinosa, and A. Z. Panagiotopoulos, “A deep potential model for liquid–vapor equilibrium and cavitation rates of water,” J. Chem. Phys. 158, 184504 (2023).
248. M. de la Puente and D. Laage, “How the acidity of water droplets and films is controlled by the air–water interface,” J. Am. Chem. Soc. 145, 25186–25194 (2023).
249. Y. Shao, L. Knijff, F. M. Dietrich, K. Hermansson, and C. Zhang, “Modelling bulk electrolytes and electrolyte interfaces with atomistic machine learning,” Batteries Supercaps 4, 585–595 (2021).
250. M. Hellström and J. Behler, “Concentration-dependent proton transfer mechanisms in aqueous NaOH solutions: From acceptor-driven to donor-driven and back,” J. Phys. Chem. Lett. 7, 3302–3306 (2016).
251. M. Hellström and J. Behler, “Structure of aqueous NaOH solutions: Insights from neural-network-based molecular dynamics simulations,” Phys. Chem. Chem. Phys. 19, 82 (2017).
252. M. Hellström and J. Behler, “Proton-transfer-driven water exchange mechanism in the Na+ solvation shell,” J. Phys. Chem. B 121, 4184 (2017).
253. M. Hellström, M. Ceriotti, and J. Behler, “Nuclear quantum effects in sodium hydroxide solutions from neural network molecular dynamics simulations,” J. Phys. Chem. B 122, 10158–10171 (2018).
254. Y. Shao, M. Hellström, A. Yllö, J. Mindemark, K. Hermansson, J. Behler, and C. Zhang, “Temperature effects on the ionic conductivity in concentrated alkaline electrolyte solutions,” Phys. Chem. Chem. Phys. 22, 10426 (2020).
255. N. O’Neill, C. Schran, S. J. Cox, and A. Michaelides, “Crumbling crystals: On the dissolution mechanism of NaCl in water,” arXiv:2211.04345 (2022).
256. M. Xu, T. Zhu, and J. Z. Zhang, “Molecular dynamics simulation of zinc ion in water with an ab initio based neural network potential,” J. Phys. Chem. A 123, 6587–6595 (2019).
257. Y. Litman, K.-Y. Chiang, T. Seki, Y. Nagata, and M. Bonn, “Surface stratification determines the interfacial water structure of simple electrolyte solutions,” Nat. Chem. 16, 644–650 (2024).
258. J. Zhang, J. Pagotto, and T. T. Duignan, “Towards predictive design of electrolyte solutions by accelerating ab initio simulation with neural networks,” J. Mater. Chem. A 10 (37), 19560–19571 (2022).
259. J. Zhang, J. Pagotto, T. Gould, and T. T. Duignan, “Accurate, fast and generalisable first principles simulation of aqueous lithium chloride,” arXiv:2310.12535 (2023).
260. S. Baker, J. Pagotto, T. T. Duignan, and A. J. Page, “High-throughput aqueous electrolyte structure prediction using IonSolvR and equivariant graph neural network potentials,” J. Phys. Chem. Lett. 14 (42), 9508–9515 (2023).
261. P. Wang, Y. Su, R. Shi, X. Huang, and J. Zhao, “Structures and spectroscopic properties of hydrated zinc(II) ion clusters [Zn2+(H2O)n (n = 1−8)] by ab initio study,” J. Cluster Sci. 34 (3), 1625–1632 (2023).
262. E. Kamarchik, Y. Wang, and J. M. Bowman, “Quantum vibrational analysis and infrared spectra of microhydrated sodium ions using an ab initio potential,” J. Chem. Phys. 134, 114311 (2011).
263. C. Zhang, S. Yue, A. Z. Panagiotopoulos, M. L. Klein, and X. Wu, “Dissolving salt is not equivalent to applying a pressure on water,” Nat. Commun. 13, 822 (2022).
264. C. Zhang, S. Yue, A. Z. Panagiotopoulos, M. L. Klein, and X. Wu, “Why dissolving salt in water decreases its dielectric permittivity,” Phys. Rev. Lett. 131 (7), 076801 (2023).
265. M. Galib and D. T. Limmer, “Reactive uptake of N2O5 by atmospheric aerosol is dominated by interfacial processes,” Science 371 (6532), 921–925 (2021).
266. N. V. S. Avula, M. L. Klein, and S. Balasubramanian, “Understanding the anomalous diffusion of water in aqueous electrolytes using machine learned potentials,” J. Phys. Chem. Lett. 14, 9500–9507 (2023).
267. N. Artrith and A. M. Kolpak, “Understanding the composition and activity of electrocatalytic nanoalloys in aqueous solvents: A combination of DFT and accurate neural network potentials,” Nano Lett. 14, 2670–2676 (2014).
268. S. K. Natarajan and J. Behler, “Neural network molecular dynamics simulations of solid-liquid interfaces: Water at low-index copper surfaces,” Phys. Chem. Chem. Phys. 18, 28704 (2016).
269. S. Kondati Natarajan and J. Behler, “Self-diffusion of surface defects at copper-water interfaces,” J. Phys. Chem. C 121, 4368 (2017).
270. V. Quaranta, M. Hellström, and J. Behler, “Proton-transfer mechanisms at the water–ZnO interface: The role of presolvation,” J. Phys. Chem. Lett. 8, 1476 (2017).
271. V. Quaranta, J. Behler, and M. Hellström, “Structure and dynamics of the liquid-water/zinc-oxide interface from machine learning potential simulations,” J. Phys. Chem. C 123, 1293 (2019).
272. M. F. Calegari Andrade, H.-Y. Ko, L. Zhang, R. Car, and A. Selloni, “Free energy of proton transfer at the water–TiO2 interface from ab initio deep potential molecular dynamics,” Chem. Sci. 11, 2335–2341 (2020).
273. B. Wen, M. F. Calegari Andrade, L.-M. Liu, and A. Selloni, “Water dissociation at the water–rutile TiO2(110) interface from ab initio-based deep neural network simulations,” Proc. Natl. Acad. Sci. U. S. A. 120, e2212250120 (2023).
274. Y.-B. Zhuang, R.-H. Bi, and J. Cheng, “Resolving the odd-even oscillation of water dissociation at rutile TiO2(110)-water interface by machine learning accelerated molecular dynamics,” J. Chem. Phys. 157, 164701 (2022).
275. V. Quaranta, M. Hellström, J. Behler, J. Kullgren, P. Mitev, and K. Hermansson, “Maximally resolved anharmonic OH vibrational spectrum of the water/ZnO(10-10) interface from a high-dimensional neural network potential,” J. Chem. Phys. 148, 241720 (2018).
276. M. Hellström, V. Quaranta, and J. Behler, “One-dimensional vs. two-dimensional proton transport processes at solid-liquid zinc-oxide-water interfaces,” Chem. Sci. 10, 1232 (2019).
277. M. Eckhoff and J. Behler, “Insights into lithium manganese oxide-water interfaces using machine learning potentials,” J. Chem. Phys. 155, 244703 (2021).
278. A. Nakanishi, S. Kasamatsu, J. Haruyama, and O. Sugino, “Structural analysis of zirconium oxynitride/water interface using neural network potential,” arXiv:2307.11296 (2023).
279. C. R. O’Connor, M. F. Calegari Andrade, A. Selloni, and G. A. Kimmel, “Elucidating the water–anatase TiO2(101) interface structure using infrared signatures and molecular dynamics,” J. Chem. Phys. 159 (10), 104707 (2023).
280. Z. Li, J. Wang, C. Yang, L. Liu, and J.-Y. Yang, “Thermal transport across TiO2–H2O interface involving water dissociation: Ab initio-assisted deep potential molecular dynamics,” J. Chem. Phys. 159, 144701 (2023).
281. Z. Zeng, F. Wodaczek, K. Liu, F. Stein, J. Hutter, J. Chen, and B. Cheng, “Mechanistic insight on water dissociation on pristine low-index TiO2 surfaces from machine learning molecular dynamics simulations,” Nat. Commun. 14 (1), 6131 (2023).
282. Z. Ding and A. Selloni, “Modeling the aqueous interface of amorphous TiO2 using deep potential molecular dynamics,” J. Chem. Phys. 159 (2), 024706 (2023).
283. A. E. G. Mikkelsen, J. Schiøtz, T. Vegge, and K. W. Jacobsen, “Is the water/Pt(111) interface ordered at room temperature?,” J. Chem. Phys. 155, 224701 (2021).
284. A. E. G. Mikkelsen, H. H. Kristoffersen, J. Schiøtz, T. Vegge, H. A. Hansen, and K. W. Jacobsen, “Structure and energetics of liquid water-hydroxyl layers on Pt(111),” Phys. Chem. Chem. Phys. 24, 9885–9890 (2022).
285. P. Schienbein and J. Blumberger, “Nanosecond solvation dynamics of the hematite/liquid water interface at hybrid DFT accuracy using committee neural network potentials,” Phys. Chem. Chem. Phys. 24 (25), 15365–15375 (2022).
286. L. Li, M. F. Calegari Andrade, R. Car, A. Selloni, and E. A. Carter, “Characterizing structure-dependent TiS2/water interfaces using deep-neural-network-assisted molecular dynamics,” J. Phys. Chem. C 127, 9750 (2023).
287. A. S. Raman and A. Selloni, “Acid–base chemistry of a model IrO2 catalytic interface,” J. Phys. Chem. Lett. 14, 7787–7794 (2023).
288. X.-T. Fan, X.-J. Wen, Y.-B. Zhuang, and J. Cheng, “Molecular insight into the GaP(110)-water interface using machine learning accelerated molecular dynamics,” J. Energy Chem. 82, 239 (2023).
289. P. M. Piaggi, A. Selloni, A. Panagiotopoulos, R. Car, and P. G. Debenedetti, “A first-principles machine-learning force field for heterogeneous ice nucleation on microcline feldspar,” Faraday Discuss. 249, 98 (2024).
290. Z. Li, X. Tan, Z. Fu, L. Liu, and J.-Y. Yang, “Thermal transport across copper–water interfaces according to deep potential molecular dynamics,” Phys. Chem. Chem. Phys. 25 (9), 6746–6756 (2023).
291. X. Yang, A. Bhowmik, T. Vegge, and H. A. Hansen, “Neural network potentials for accelerated metadynamics of oxygen reduction kinetics at Au–water interfaces,” Chem. Sci. 14 (14), 3913–3922 (2023).
292. X. Li, W. Paier, and J. Paier, “Machine learning in computational surface science and catalysis: Case studies on water and metal–oxide interfaces,” Front. Chem. 8, 601029 (2020).
293. H. Ghorbanfekr, J. Behler, and F. M. Peeters, “Insights into water permeation through hBN nanocapillaries by ab initio machine learning molecular dynamics simulations,” J. Phys. Chem. Lett. 11, 7363 (2020).
294. W. Zhao, H. Qiu, and W. Guo, “A deep neural network potential for water confined in graphene nanocapillaries,” J. Phys. Chem. C 126, 10546 (2022).
295. D. Liu, J. Wu, and D. Lu, “Transferability evaluation of the deep potential model for simulating water-graphene confined system,” J. Chem. Phys. 159 (4), 044712 (2023).
296. V. Kapil, C. Schran, A. Zen, J. Chen, C. J. Pickard, and A. Michaelides, “The first-principles phase diagram of monolayer nanoconfined water,” Nature 609 (7927), 512–516 (2022).
297. P. Ravindra, X. R. Advincula, C. Schran, A. Michaelides, and V. Kapil, “A quasi-one-dimensional hydrogen-bonded monolayer ice phase,” arXiv:2312.01340 (2023).
298. F. L. Thiemann, C. Schran, P. Rowe, E. A. Müller, and A. Michaelides, “Water flow in single-wall nanotubes: Oxygen makes it slip, hydrogen makes it stick,” ACS Nano 16, 10775–10782 (2022).
299. Z. Cao, Y. Wang, C. Lorsung, and A. Barati Farimani, “Neural network predicts ion concentration profiles under nanoconfinement,” J. Chem. Phys. 159 (9), 094702 (2023).
300. L. Shen and W. Yang, “Molecular dynamics simulations with quantum mechanics/molecular mechanics and adaptive neural networks,” J. Chem. Theory Comput. 14, 1442–1455 (2018).
301. M. Yang, L. Bonati, D. Polino, and M. Parrinello, “Using metadynamics to build neural network potentials for reactive events: The case of urea decomposition in water,” Catal. Today 387, 143–149 (2022).
302. T. Devergne, T. Magrino, F. Pietrucci, and A. M. Saitta, “Combining machine learning approaches and accurate ab initio enhanced sampling methods for prebiotic chemical reactions in solution,” J. Chem. Theory Comput. 18, 5410–5421 (2022).
303. X. Pan, J. Yang, R. Van, E. Epifanovsky, J. Ho, J. Huang, J. Pu, Y. Mei, K. Nam, and Y. Shao, “Machine-learning-assisted free energy simulation of solution-phase and enzyme reactions,” J. Chem. Theory Comput. 17, 5745–5758 (2021).
304. J. Lan, V. Kapil, P. Gasparotto, M. Ceriotti, M. Iannuzzi, and V. V. Rybkin, “Simulating the ghost: Quantum dynamics of the solvated electron,” Nat. Commun. 12 (1), 766 (2021).
305. W.-K. Chen, W.-H. Fang, and G. Cui, “Integrating machine learning with the multilayer energy-based fragment method for excited states of large systems,” J. Phys. Chem. Lett. 10 (24), 7836–7841 (2019).
306. J. Wang, S. Olsson, C. Wehmeyer, A. Pérez, N. E. Charron, G. De Fabritiis, F. Noé, and C. Clementi, “Machine learning of coarse-grained molecular dynamics force fields,” ACS Cent. Sci. 5 (5), 755–767 (2019).
307. B. E. Husic, N. E. Charron, D. Lemm, J. Wang, A. Pérez, M. Majewski, A. Krämer, Y. Chen, S. Olsson, G. de Fabritiis et al., “Coarse graining molecular dynamics with graph neural networks,” J. Chem. Phys. 153 (19), 194101 (2020).
308. A. Krämer, A. E. Durumeric, N. E. Charron, Y. Chen, C. Clementi, and F. Noé, “Statistically optimal force aggregation for coarse-graining molecular dynamics,” J. Phys. Chem. Lett. 14 (17), 3970–3979 (2023).
309. S. Yao, R. Van, X. Pan, J. H. Park, Y. Mao, J. Pu, Y. Mei, and Y. Shao, “Machine learning based implicit solvent model for aqueous-solution alanine dipeptide molecular dynamics simulations,” RSC Adv. 13 (7), 4565–4577 (2023).
310. J. R. Cendagorta, H. Shen, Z. Bacic, and M. E. Tuckerman, “Enhanced sampling path integral methods using neural network potential energy surfaces with application to diffusion in hydrogen hydrates,” Adv. Theory Simul. 4, 2000258 (2021).
311. J. Yang, Y. Cong, Y. Li, and H. Li, “Machine learning approach based on a range-corrected deep potential model for efficient vibrational frequency computation,” J. Chem. Theory Comput. 19 (18), 6366–6374 (2023).
312. L. Böselt, M. Thürlemann, and S. Riniker, “Machine learning in QM/MM molecular dynamics simulations of condensed-phase systems,” J. Chem. Theory Comput. 17 (5), 2641–2658 (2021).
313. A. Hofstetter, L. Böselt, and S. Riniker, “Graph-convolutional neural networks for (QM)ML/MM molecular dynamics simulations,” Phys. Chem. Chem. Phys. 24 (37), 22497–22512 (2022).
314. S. M. Salehi, S. Käser, K. Töpfer, P. Diamantis, R. Pfister, P. Hamm, U. Rothlisberger, and M. Meuwly, “Hydration dynamics and IR spectroscopy of 4-fluorophenol,” Phys. Chem. Chem. Phys. 24 (42), 26046–26060 (2022).
315. M. Xu, T. Zhu, and J. Z. Zhang, “Automatically constructed neural network potentials for molecular dynamics simulation of zinc proteins,” Front. Chem. 9, 692200 (2021).
316. J. R. Loeffler, M. L. Fernández-Quintero, F. Waibl, P. K. Quoika, F. Hofer, M. Schauperl, and K. R. Liedl, “Conformational shifts of stacked heteroaromatics: Vacuum vs. water studied by machine learning,” Front. Chem. 9, 641610 (2021).
317. P. Gao, X. Yang, Y.-H. Tang, M. Zheng, A. Andersen, V. Murugesan, A. Hollas, and W. Wang, “Graphical Gaussian process regression model for aqueous solvation free energy prediction of organic molecules in redox flow batteries,” Phys. Chem. Chem. Phys. 23 (43), 24892–24904 (2021).
318. A. Fabrizio, A. Grisafi, B. Meyer, M. Ceriotti, and C. Corminboeuf, “Electron density learning of non-covalent systems,” Chem. Sci. 10, 9424–9432 (2019).
319. T. W. Ko, J. A. Finkler, S. Goedecker, and J. Behler, “Accurate fourth-generation machine learning potentials by electrostatic embedding,” J. Chem. Theory Comput. 19, 3567–3579 (2023).
320. F. Noé, S. Olsson, J. Köhler, and H. Wu, “Boltzmann generators: Sampling equilibrium states of many-body systems with deep learning,” Science 365 (6457), eaaw1147 (2019).
321. P. Wirnsberger, G. Papamakarios, B. Ibarz, S. Racanière, A. J. Ballard, A. Pritzel, and C. Blundell, “Normalizing flows for atomic solids,” Mach. Learn.: Sci. Technol. 3, 025009 (2022).
322. C. Zeni, R. Pinsler, D. Zügner, A. Fowler, M. Horton, X. Fu, S. Shysheya, J. Crabbé, L. Sun, J. Smith et al., “MatterGen: A generative model for inorganic materials design,” arXiv:2312.03687 (2023).
323. Z. Zou, E. R. Beyerle, S.-T. Tsai, and P. Tiwary, “Driving and characterizing nucleation of urea and glycine polymorphs in water,” Proc. Natl. Acad. Sci. U. S. A. 120 (7), e2216099120 (2023).
324. H. Jung, R. Covino, A. Arjun, C. Leitold, C. Dellago, P. G. Bolhuis, and G. Hummer, “Machine-guided path sampling to discover mechanisms of molecular self-organization,” Nat. Comput. Sci. 3 (4), 334–345 (2023).