{{Short description|Interatomic potentials constructed by machine learning programs}}
'''Machine-learned interatomic potentials''' ('''MLIPs'''), or simply '''machine learning potentials''' ('''MLPs'''), are [[interatomic potential]]s constructed by [[machine learning]] programs.
Such machine learning potentials promised to fill the gap between [[density functional theory]], a highly accurate but computationally intensive modelling method, and empirically derived or intuitively approximated potentials, which are far lighter computationally but substantially less accurate. Improvements in [[artificial intelligence]] technology have heightened the accuracy of MLPs while lowering their computational cost, expanding the role of machine learning in fitting potentials.<ref name="ML">{{cite journal|last1=Kocer|last2=Ko|last3=Behler|first1=Emir|first2=Tsz Wai|first3=Jorg|journal=Annual Review of Physical Chemistry|title=Neural Network Potentials: A Concise Overview of Methods|date=2022|volume=73|pages=163–86|doi=10.1146/annurev-physchem-082720-034254 |pmid=34982580 |bibcode=2022ARPC...73..163K |doi-access=free|arxiv=2107.03727}}</ref><ref>{{cite journal|last1=Blank|first1=TB|last2=Brown|first2=SD|last3=Calhoun|first3=AW|last4=Doren|first4=DJ|date=1995|title=Neural network models of potential energy surfaces|journal=Journal of Chemical Physics|volume=103|number=10|pages=4129–37|doi=10.1063/1.469597 |bibcode=1995JChPh.103.4129B }}</ref>
Almost all neural network potentials take atomic coordinates as input and output potential energies. For some, these atomic coordinates are converted into atom-centered symmetry functions. From this data, a separate atomic neural network is trained for each element; each atomic network is evaluated whenever that element occurs in the given structure, and the results are then pooled together at the end. This process – in particular, the atom-centered symmetry functions, which encode translational, rotational, and permutational invariances – has greatly improved machine learning potentials by significantly constraining the neural network search space. Other models use a similar process but emphasize bonds over atoms, using pair symmetry functions and training one network per atom pair.<ref name="ML"/><ref>{{cite journal|last1=Behler|first1=J|last2=Parrinello|first2=M|title=Generalized neural-network representation of high-dimensional potential-energy surfaces|date=2007|journal=Physical Review Letters|volume=98|issue=14|doi=10.1103/PhysRevLett.98.146401|pmid=17501293|bibcode=2007PhRvL..98n6401B}}</ref>
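The following is a minimal illustrative sketch in Python with NumPy, not drawn from any published implementation: the fixed-weight two-layer "atomic networks" and the single radial descriptor are placeholder assumptions. It shows the atom-centered scheme described above: each atom's neighbor distances are reduced to a symmetry-function descriptor, a per-element network maps that descriptor to an atomic energy, and the atomic energies are pooled by summation.

<syntaxhighlight lang="python">
import numpy as np

def cutoff(r, r_c=6.0):
    # Cosine cutoff f_c(r) = 0.5*(cos(pi*r/r_c) + 1) for r < r_c, else 0.
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def radial_g2(distances, eta=0.5, r_s=0.0, r_c=6.0):
    # Behler–Parrinello radial symmetry function: a cutoff-weighted sum of
    # Gaussians over neighbor distances. Invariant to translation, rotation,
    # and permutation of identical neighbors.
    d = np.asarray(distances, dtype=float)
    return float(np.sum(np.exp(-eta * (d - r_s) ** 2) * cutoff(d, r_c)))

rng = np.random.default_rng(0)

def make_atomic_net(n_in=1, n_hidden=8):
    # Toy fixed-weight two-layer network standing in for a trained atomic net.
    W1, b1 = rng.normal(size=(n_hidden, n_in)), rng.normal(size=n_hidden)
    W2, b2 = rng.normal(size=n_hidden), rng.normal()
    return lambda g: float(W2 @ np.tanh(W1 @ g + b1) + b2)

element_nets = {"H": make_atomic_net(), "O": make_atomic_net()}  # one net per element

def total_energy(atoms):
    # atoms: list of (element, neighbor distances). Each atomic energy comes
    # from that element's network; pooling is a plain sum over atoms.
    return sum(element_nets[el](np.array([radial_g2(nb)])) for el, nb in atoms)

# Water-like toy configuration (distances in angstroms, illustrative only).
print(total_energy([("O", [0.96, 0.96]), ("H", [0.96, 1.51]), ("H", [0.96, 1.51])]))
</syntaxhighlight>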
Other models learn their own descriptors rather than using predetermined symmetry-dictating functions. These models, called [[Graph neural network#Message passing layers|message-passing neural networks]] (MPNNs), are graph neural networks. Treating molecules as three-dimensional [[Graph (discrete mathematics)|graphs]] (where atoms are nodes and bonds are edges), the model takes feature vectors describing the atoms as input and iteratively updates these vectors as information about neighboring atoms is processed through message functions and [[convolution]]s. These feature vectors are then used to predict the final potentials. The flexibility of this method often results in stronger, more generalizable models. In 2017, the first-ever MPNN model, a deep tensor neural network, was used to calculate the properties of small organic molecules.
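A minimal sketch of the message-passing idea, again in Python with NumPy (the random weights, triangle graph, and linear readout are illustrative assumptions, not a published architecture): node feature vectors are repeatedly updated from transformed neighbor features, then pooled into a permutation-invariant representation from which an energy is read out.

<syntaxhighlight lang="python">
import numpy as np

def message_passing(features, adjacency, W_msg, W_upd, steps=3):
    # Each step: aggregate transformed neighbor features (the "messages"),
    # then update every node's state. Summing nodes at the end makes the
    # readout invariant to atom ordering.
    h = features
    for _ in range(steps):
        messages = adjacency @ (h @ W_msg)   # sum of neighbors' messages
        h = np.tanh(h @ W_upd + messages)    # node-state update
    return h.sum(axis=0)                     # permutation-invariant readout

# Toy molecule: 3 atoms with 4-dimensional features on a triangle graph.
rng = np.random.default_rng(1)
h0 = rng.normal(size=(3, 4))                 # initial per-atom feature vectors
A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)  # bonds as edges
readout = message_passing(h0, A, rng.normal(size=(4, 4)), rng.normal(size=(4, 4)))
energy = readout @ rng.normal(size=4)        # linear head predicts the potential
print(energy)
</syntaxhighlight>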
== Gaussian Approximation Potential (GAP) ==
One popular class of machine-learned interatomic potentials is the Gaussian Approximation Potential (GAP),<ref>{{Cite journal |last1=Bartók |first1=Albert P. |last2=Payne |first2=Mike C. |last3=Kondor |first3=Risi |last4=Csányi |first4=Gábor |date=2010-04-01 |title=Gaussian Approximation Potentials: The Accuracy of Quantum Mechanics, without the Electrons |url=https://link.aps.org/doi/10.1103/PhysRevLett.104.136403 |journal=Physical Review Letters |volume=104 |issue=13 |pages=136403 |doi=10.1103/PhysRevLett.104.136403|pmid=20481899 |arxiv=0910.1019 |bibcode=2010PhRvL.104m6403B }}</ref><ref>{{Cite journal |last1=Bartók |first1=Albert P. |last2=De |first2=Sandip |last3=Poelking |first3=Carl |last4=Bernstein |first4=Noam |last5=Kermode |first5=James R. |last6=Csányi |first6=Gábor |last7=Ceriotti |first7=Michele |date=December 2017 |title=Machine learning unifies the modeling of materials and molecules |journal=Science Advances |language=en |volume=3 |issue=12 |pages=e1701816 |doi=10.1126/sciadv.1701816 |issn=2375-2548 |pmc=5729016 |pmid=29242828|arxiv=1706.00179 |bibcode=2017SciA....3E1816B }}</ref><ref>{{Cite web |title=Gaussian approximation potential – Machine learning atomistic simulation of materials and molecules |url=https://gap-ml.org/ |access-date=2024-04-04 |language=en-US}}</ref> which combines compact descriptors of local atomic environments<ref>{{Cite journal |last1=Bartók |first1=Albert P. |last2=Kondor |first2=Risi |last3=Csányi |first3=Gábor |date=2013-05-28 |title=On representing chemical environments |url=https://link.aps.org/doi/10.1103/PhysRevB.87.184115 |journal=Physical Review B |volume=87 |issue=18 |pages=184115 |doi=10.1103/PhysRevB.87.184115|arxiv=1209.3140 |bibcode=2013PhRvB..87r4115B }}</ref> with Gaussian process regression<ref>{{Cite book |last1=Rasmussen |first1=Carl Edward |title=Gaussian processes for machine learning |last2=Williams |first2=Christopher K. I. |date=2008 |publisher=MIT Press |isbn=978-0-262-18253-9 |edition=3. print |series=Adaptive computation and machine learning |___location=Cambridge, Mass.}}</ref> to machine learn the [[potential energy surface]] of a given system. To date, the GAP framework has been used to successfully develop a number of MLIPs for various systems, including for elemental systems such as [[carbon]], [[silicon]], and [[tungsten]], as well as for multicomponent systems such as [[GeSbTe|Ge<sub>2</sub>Sb<sub>2</sub>Te<sub>5</sub>]] and austenitic [[stainless steel]].
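GAP itself pairs compact environment descriptors (such as the smooth overlap of atomic positions, SOAP) with sparse Gaussian process regression; the sketch below, in Python with NumPy, substitutes a generic squared-exponential kernel and synthetic descriptor vectors purely to illustrate the underlying kernel-regression step, in which predicted energies are kernel-weighted combinations of reference energies.

<syntaxhighlight lang="python">
import numpy as np

def rbf_kernel(X1, X2, length=1.0):
    # Generic squared-exponential kernel between descriptor vectors
    # (a stand-in for GAP's descriptor-based kernel).
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * d2 / length**2)

def gp_predict(X_train, y_train, X_test, noise=1e-6):
    # Gaussian process posterior mean: solve (K + noise*I) alpha = y once,
    # then predict new energies as kernel-weighted sums over training data.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf_kernel(X_test, X_train) @ alpha

# Synthetic data: 5 reference environments with 3-dimensional descriptors
# and reference (e.g., DFT) energies; all values are illustrative only.
rng = np.random.default_rng(2)
X_train, y_train = rng.normal(size=(5, 3)), rng.normal(size=5)
X_test = rng.normal(size=(2, 3))
print(gp_predict(X_train, y_train, X_test))  # predicted energies
</syntaxhighlight>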
==References==
{{Reflist}}
[[Category:Machine learning]]