Machine-learned interatomic potential

Still other models learn their own descriptors rather than relying on predetermined, symmetry-dictating functions. These models, called message-passing neural networks (MPNNs), are [[graph neural network]]s. Treating molecules as three-dimensional [[Graph (discrete mathematics)|graphs]] in which atoms are nodes and bonds are edges, the model takes feature vectors describing the atoms as input and iteratively updates them as information about neighboring atoms is passed through message functions and convolutions. The final feature vectors are then used to predict the potential. This approach gives the model more flexibility and often yields more accurate and more generalizable potentials. In 2017, the first MPNN model, a deep tensor neural network, was used to calculate the properties of small organic molecules. The technology was later commercialized, leading to the release of [https://matlantis.com/ Matlantis] in 2022, which obtains properties from both the forward and backward passes of the network. Matlantis can simulate 72 elements, handle up to 20,000 atoms at a time, and is reported to run up to 20,000,000 times faster than [[density functional theory]] with nearly indistinguishable accuracy.<ref>{{cite journal|last1=Schutt|first1=KT|last2=Arbabzadah|first2=F|last3=Chmiela|first3=S|last4=Muller|first4=KR|last5=Tkatchenko|first5=A|date=2017|title=Quantum-chemical insights from deep tensor neural networks|journal=Nature Communications|volume=8 |page=13890 |doi=10.1038/ncomms13890 |pmid=28067221 |pmc=5228054 |bibcode=2017NatCo...813890S }}</ref><ref name="ML"/><ref>{{cite journal|journal=Nature Communications|title=Towards universal neural network potential for material discovery applicable to arbitrary combinations of 45 elements|last1=Takamoto|first1=So|last2=Shinagawa|first2=Chikashi|last3=Motoki|first3=Daisuke|last4=Nakago|first4=Kosuke|volume=13|date=May 30, 2022|issue=1 |page=2991 |doi=10.1038/s41467-022-30687-9 |pmid=35637178 |pmc=9151783 |bibcode=2022NatCo..13.2991T }}</ref><ref>{{cite web|url=https://matlantis.com/|title=Matlantis}}</ref>
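The message-passing update can be illustrated with a minimal sketch in Python; the feature dimensions, weight matrices, neighbor list, and readout used here are illustrative placeholders rather than the architecture of any published MPNN.

<syntaxhighlight lang="python">
import numpy as np

def message_passing_step(h, edges, W_msg, W_upd):
    """One illustrative message-passing update over atomic feature vectors."""
    messages = np.zeros_like(h)
    for i, j in edges:
        # aggregate a message from neighboring atom j into atom i
        messages[i] += np.tanh(h[j] @ W_msg)
    # combine each atom's current features with its aggregated messages
    return np.tanh((h + messages) @ W_upd)

def readout_energy(h, w_out):
    """Sum per-atom contributions to give a total potential energy."""
    return float(np.sum(h @ w_out))

# Toy system: 3 atoms in a chain 0-1-2, with 4-dimensional feature vectors
rng = np.random.default_rng(0)
h = rng.normal(size=(3, 4))                  # initial atomic feature vectors
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]     # bonds, listed in both directions
W_msg = rng.normal(size=(4, 4))
W_upd = rng.normal(size=(4, 4))
w_out = rng.normal(size=4)

for _ in range(3):                           # three rounds of message passing
    h = message_passing_step(h, edges, W_msg, W_upd)
print(readout_energy(h, w_out))
</syntaxhighlight>

In a trained model the weight matrices are learned from reference data such as density functional theory energies and forces, rather than drawn at random as in this sketch.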
 
== Gaussian Approximation Potential (GAP) ==
One popular class of machine-learned interatomic potentials is the Gaussian Approximation Potential (GAP),<ref>{{Cite journal |last=Bartók |first=Albert P. |last2=Payne |first2=Mike C. |last3=Kondor |first3=Risi |last4=Csányi |first4=Gábor |date=2010-04-01 |title=Gaussian Approximation Potentials: The Accuracy of Quantum Mechanics, without the Electrons |url=https://link.aps.org/doi/10.1103/PhysRevLett.104.136403 |journal=Physical Review Letters |volume=104 |issue=13 |pages=136403 |doi=10.1103/PhysRevLett.104.136403}}</ref><ref>{{Cite journal |last=Bartók |first=Albert P. |last2=De |first2=Sandip |last3=Poelking |first3=Carl |last4=Bernstein |first4=Noam |last5=Kermode |first5=James R. |last6=Csányi |first6=Gábor |last7=Ceriotti |first7=Michele |date=2017-12 |title=Machine learning unifies the modeling of materials and molecules |url=https://www.science.org/doi/10.1126/sciadv.1701816 |journal=Science Advances |language=en |volume=3 |issue=12 |doi=10.1126/sciadv.1701816 |issn=2375-2548 |pmc=5729016 |pmid=29242828}}</ref><ref>{{Cite web |title=Gaussian approximation potential – Machine learning atomistic simulation of materials and molecules |url=https://gap-ml.org/ |access-date=2024-04-04 |language=en-US}}</ref> which combines compact descriptors of local atomic environments<ref>{{Cite journal |last=Bartók |first=Albert P. |last2=Kondor |first2=Risi |last3=Csányi |first3=Gábor |date=2013-05-28 |title=On representing chemical environments |url=https://link.aps.org/doi/10.1103/PhysRevB.87.184115 |journal=Physical Review B |volume=87 |issue=18 |pages=184115 |doi=10.1103/PhysRevB.87.184115}}</ref> with Gaussian process regression<ref>{{Cite book |last=Rasmussen |first=Carl Edward |title=Gaussian processes for machine learning |last2=Williams |first2=Christopher K. I. |date=2008 |publisher=MIT Press |isbn=978-0-262-18253-9 |edition=3. print |series=Adaptive computation and machine learning |___location=Cambridge, Mass.}}</ref> to learn the [[potential energy surface]] of a given system. The GAP framework has been used to develop MLIPs for a range of systems, including elemental systems such as carbon,<ref>{{Cite journal |last=Deringer |first=Volker L. |last2=Csányi |first2=Gábor |date=2017-03-03 |title=Machine learning based interatomic potential for amorphous carbon |url=https://link.aps.org/doi/10.1103/PhysRevB.95.094203 |journal=Physical Review B |volume=95 |issue=9 |pages=094203 |doi=10.1103/PhysRevB.95.094203}}</ref> silicon,<ref>{{Cite journal |last=Bartók |first=Albert P. |last2=Kermode |first2=James |last3=Bernstein |first3=Noam |last4=Csányi |first4=Gábor |date=2018-12-14 |title=Machine Learning a General-Purpose Interatomic Potential for Silicon |url=https://link.aps.org/doi/10.1103/PhysRevX.8.041048 |journal=Physical Review X |volume=8 |issue=4 |pages=041048 |doi=10.1103/PhysRevX.8.041048}}</ref> phosphorus,<ref>{{Cite journal |last=Deringer |first=Volker L. |last2=Caro |first2=Miguel A. |last3=Csányi |first3=Gábor |date=2020-10-29 |title=A general-purpose machine-learning force field for bulk and nanostructured phosphorus |url=https://www.nature.com/articles/s41467-020-19168-z |journal=Nature Communications |language=en |volume=11 |issue=1 |pages=5461 |doi=10.1038/s41467-020-19168-z |issn=2041-1723 |pmc=7596484 |pmid=33122630}}</ref> and tungsten,<ref>{{Cite journal |last=Szlachta |first=Wojciech J. |last2=Bartók |first2=Albert P. |last3=Csányi |first3=Gábor |date=2014-09-24 |title=Accuracy and transferability of Gaussian approximation potential models for tungsten |url=https://link.aps.org/doi/10.1103/PhysRevB.90.104108 |journal=Physical Review B |volume=90 |issue=10 |pages=104108 |doi=10.1103/PhysRevB.90.104108}}</ref> as well as multicomponent systems such as Ge<sub>2</sub>Sb<sub>2</sub>Te<sub>5</sub><ref>{{Cite journal |last=Mocanu |first=Felix C. |last2=Konstantinou |first2=Konstantinos |last3=Lee |first3=Tae Hoon |last4=Bernstein |first4=Noam |last5=Deringer |first5=Volker L. |last6=Csányi |first6=Gábor |last7=Elliott |first7=Stephen R. |date=2018-09-27 |title=Modeling the Phase-Change Memory Material, Ge<sub>2</sub>Sb<sub>2</sub>Te<sub>5</sub>, with a Machine-Learned Interatomic Potential |url=https://pubs.acs.org/doi/10.1021/acs.jpcb.8b06476 |journal=The Journal of Physical Chemistry B |language=en |volume=122 |issue=38 |pages=8998–9006 |doi=10.1021/acs.jpcb.8b06476 |issn=1520-6106}}</ref> and the austenitic [[stainless steel]] Fe<sub>7</sub>Cr<sub>2</sub>Ni.<ref>{{Cite journal |last=Shenoy |first=Lakshmi |last2=Woodgate |first2=Christopher D. |last3=Staunton |first3=Julie B. |last4=Bartók |first4=Albert P. |last5=Becquart |first5=Charlotte S. |last6=Domain |first6=Christophe |last7=Kermode |first7=James R. |date=2024-03-22 |title=Collinear-spin machine learned interatomic potential for Fe<sub>7</sub>Cr<sub>2</sub>Ni alloy |url=https://link.aps.org/doi/10.1103/PhysRevMaterials.8.033804 |journal=Physical Review Materials |volume=8 |issue=3 |pages=033804 |doi=10.1103/PhysRevMaterials.8.033804}}</ref>
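The regression step at the core of GAP can be illustrated with a minimal sketch in Python, fitting energies with a simple squared-exponential kernel over placeholder descriptor vectors; production GAP models typically use descriptors such as those cited above together with sparse Gaussian process regression and are fit to reference quantum-mechanical data.

<syntaxhighlight lang="python">
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0):
    """Squared-exponential kernel between two sets of descriptor vectors."""
    sq_dist = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * sq_dist / length_scale ** 2)

def gp_fit(X_train, y_train, noise=1e-3):
    """Solve for the Gaussian process regression weights."""
    K = rbf_kernel(X_train, X_train)
    return np.linalg.solve(K + noise * np.eye(len(X_train)), y_train)

def gp_predict(X_new, X_train, alpha):
    """Predict energies for new descriptor vectors."""
    return rbf_kernel(X_new, X_train) @ alpha

# Toy data: 20 training "environments" with 5-dimensional placeholder descriptors
rng = np.random.default_rng(1)
X_train = rng.normal(size=(20, 5))
y_train = np.sin(X_train).sum(axis=1)   # stand-in for reference (e.g. DFT) energies

alpha = gp_fit(X_train, y_train)
X_new = rng.normal(size=(3, 5))
print(gp_predict(X_new, X_train, alpha))
</syntaxhighlight>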
 
==References==