We present a method for optimizing transition state theory dividing surfaces with support vector machines. The resulting dividing surfaces require no a priori information or intuition about reaction mechanisms. To generate optimal dividing surfaces, we apply a cycle of machine learning and refinement of the surface by molecular dynamics sampling. We demonstrate that the machine-learned surfaces contain the relevant low-energy saddle points. The mechanisms of reactions may be extracted from the machine-learned surfaces in order to identify unexpected chemically relevant processes. Furthermore, we show that the machine-learned surfaces significantly increase the transmission coefficient for an adatom exchange involving many coupled degrees of freedom on a (100) surface when compared to a distance-based dividing surface.
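The core idea of the abstract — a kernel support vector machine trained on configurations labeled by basin, whose decision boundary acts as the dividing surface — can be sketched in a few lines. The following is an illustrative reconstruction on a synthetic two-basin system, not the authors' implementation; the use of scikit-learn, the RBF kernel parameters, and the toy basin geometry are all assumptions for demonstration.

```python
# Illustrative sketch (not the authors' code): train a kernel SVM whose
# zero-level decision boundary serves as a dividing surface between a
# reactant basin and a product basin in a toy 2D configuration space.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Sample configurations around two basins: reactant near x = -1,
# product near x = +1 (stand-ins for MD-sampled geometries).
reactant = rng.normal(loc=[-1.0, 0.0], scale=0.3, size=(200, 2))
product = rng.normal(loc=[+1.0, 0.0], scale=0.3, size=(200, 2))
X = np.vstack([reactant, product])
y = np.array([0] * 200 + [1] * 200)  # basin labels

# RBF-kernel SVM: the set where decision_function == 0 is the
# machine-learned dividing surface separating the basins.
clf = SVC(kernel="rbf", C=10.0, gamma=1.0).fit(X, y)

# Basin centers are classified into their respective basins, and the
# midpoint between them lies close to the decision boundary.
print(clf.predict([[-1.0, 0.0], [1.0, 0.0]]))
print(clf.decision_function([[0.0, 0.0]]))
```

In the paper's scheme this classification step alternates with molecular dynamics sampling near the current boundary, iteratively refining the surface toward the low-energy saddle region; the sketch above shows only a single training pass.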
Research Article | May 1, 2012
Optimizing transition states via kernel-based machine learning
Zachary D. Pozun,1,2 Katja Hansen,1,3,a) Daniel Sheppard,1,2,b) Matthias Rupp,1,3,c) Klaus-Robert Müller,1,3,4 and Graeme Henkelman1,2,d)
1Institute for Pure and Applied Mathematics, University of California, Los Angeles, Los Angeles, California 90095-7121, USA
2Department of Chemistry and Biochemistry and the Institute for Computational Engineering and Sciences, The University of Texas at Austin, Austin, Texas 78712-0165, USA
3Machine Learning Group, Computer Science Department, Technische Universität Berlin, Germany
4Department of Brain and Cognitive Engineering, Korea University, Anam-dong, Seongbuk-gu, Seoul 136-713, Korea
a) Also at Theory Department, Fritz Haber Institute of the Max Planck Society, Berlin, Germany.
b) Present address: Theoretical Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545, USA.
c) Present address: Institute of Pharmaceutical Sciences, ETH Zurich, Zurich, Switzerland.
d) Electronic mail: [email protected].
J. Chem. Phys. 136, 174101 (2012)
Article history: Received January 25, 2012; Accepted April 11, 2012.
Citation
Zachary D. Pozun, Katja Hansen, Daniel Sheppard, Matthias Rupp, Klaus-Robert Müller, Graeme Henkelman; Optimizing transition states via kernel-based machine learning. J. Chem. Phys. 7 May 2012; 136 (17): 174101. https://doi.org/10.1063/1.4707167