Ising model

The '''Ising model''' (or '''Lenz–Ising model'''), named after the physicists [[Ernst Ising]] and [[Wilhelm Lenz]], is a [[mathematical models in physics|mathematical model]] of [[ferromagnetism]] in [[statistical mechanics]]. The model consists of [[discrete variables]] that represent [[Nuclear magnetic moment|magnetic dipole moments of atomic "spins"]] that can be in one of two states (+1 or −1). The spins are arranged in a [[Graph (abstract data type)|graph]], usually a [[lattice (group)|lattice]] (where the local structure repeats periodically in all directions), allowing each spin to interact with its neighbors. Neighboring spins that agree have a lower energy than those that disagree; the system tends to the lowest energy but heat disturbs this tendency, thus creating the possibility of different structural phases. The model allows the identification of [[phase transition]]s as a simplified model of reality. The two-dimensional [[square-lattice Ising model]] is one of the simplest statistical models to show a [[phase transition]].<ref>See {{harvtxt|Gallavotti|1999}}, Chapters VI-VII.</ref>
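
In the most common convention, each pair of neighboring spins <math>i, j</math> contributes an energy <math>-J\sigma_i\sigma_j</math>, so that in the absence of an external field the energy of a spin configuration <math>\sigma</math> is<math display="block">H(\sigma) = -J\sum_{\langle i\,j\rangle} \sigma_i\sigma_j,</math>where the sum runs over pairs of adjacent sites and <math>J>0</math> corresponds to the ferromagnetic case, in which aligned neighbors are energetically favored.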
 
The Ising model was invented by the physicist {{harvs|txt|authorlink=Wilhelm Lenz|first=Wilhelm|last=Lenz|year=1920}}, who gave it as a problem to his student Ernst Ising. The one-dimensional Ising model was solved exactly by {{harvtxt|Ising|1925}} himself in his 1924 thesis;<ref>[http://www.hs-augsburg.de/~harsch/anglica/Chronology/20thC/Ising/isi_fm00.html Ernst Ising, ''Contribution to the Theory of Ferromagnetism'']</ref> it has no phase transition. The two-dimensional square-lattice Ising model is much harder and was only given an analytic description much later, by {{harvs|txt|authorlink=Lars Onsager|first=Lars |last=Onsager|year=1944}}. It is usually solved by a [[Transfer-matrix method (statistical mechanics)|transfer-matrix method]], although there also exists a very simple approach relating the model to a non-interacting fermionic [[quantum field theory]].<ref>{{Cite journal |last1=Samuel |first1=Stuart|date=1980 |title=The use of anticommuting variable integrals in statistical mechanics. I. The computation of partition functions|url=https://doi.org/10.1063/1.524404 |journal=Journal of Mathematical Physics |language=en |volume=21|issue=12 |pages= 2806–2814 |doi=10.1063/1.524404|url-access=subscription }}</ref>
 
In dimensions greater than four, the phase transition of the Ising model is described by [[mean-field theory]]. The Ising model for greater dimensions was also explored with respect to various tree topologies in the late 1970s, culminating in an exact solution of the zero-field, time-independent {{harvtxt|Barth|1981}} model for closed Cayley trees of arbitrary branching ratio, and thereby, arbitrarily large dimensionality within tree branches. The solution to this model exhibited a new, unusual phase transition behavior, along with non-vanishing long-range and nearest-neighbor spin-spin correlations, deemed relevant to large neural networks as one of its possible {{pslink|Ising model|applications|nopage=y}}.
=== Artificial neural network ===
{{Main|Hopfield network}}
The Ising model was instrumental in the development of the [[Hopfield network]]. The original Ising model is an equilibrium model. In 1963, [[Roy J. Glauber]] studied the Ising model evolving in time, as a process of relaxation towards thermal equilibrium ([[Glauber dynamics]]), thereby adding the component of time.<ref name=":222">{{cite journal |last1=Glauber |first1=Roy J. |date=February 1963 |title=Time-Dependent Statistics of the Ising Model |url=https://aip.scitation.org/doi/abs/10.1063/1.1703954 |journal=Journal of Mathematical Physics |volume=4 |issue=2 |pages=294–307 |doi=10.1063/1.1703954 |access-date=2021-03-21|url-access=subscription }}</ref> Kaoru Nakano in 1971<ref name="Nakano1971">{{cite book |last1=Nakano |first1=Kaoru |title=Pattern Recognition and Machine Learning |date=1971 |isbn=978-1-4615-7568-9 |pages=172–186 |chapter=Learning Process in a Model of Associative Memory |doi=10.1007/978-1-4615-7566-5_15}}</ref><ref name="Nakano1972">{{cite journal |last1=Nakano |first1=Kaoru |date=1972 |title=Associatron-A Model of Associative Memory |journal=IEEE Transactions on Systems, Man, and Cybernetics |volume=SMC-2 |issue=3 |pages=380–388 |doi=10.1109/TSMC.1972.4309133}}</ref> and [[Shun'ichi Amari]] in 1972<ref name="Amari19722">{{cite journal |last1=Amari |first1=Shun-Ichi |date=1972 |title=Learning patterns and pattern sequences by self-organizing nets of threshold elements |journal=IEEE Transactions |volume=C |issue=21 |pages=1197–1206}}</ref> proposed to modify the weights of an Ising model by the [[Hebbian theory|Hebbian learning]] rule as a model of associative memory. The same idea was published by {{ill|William A. Little (physicist)|lt=William A. Little|de|William A. Little}} in 1974,<ref name="little74">{{cite journal |last=Little |first=W. A. |year=1974 |title=The Existence of Persistent States in the Brain |journal=Mathematical Biosciences |volume=19 |issue=1–2 |pages=101–120 |doi=10.1016/0025-5564(74)90031-5}}</ref> who was cited by Hopfield in his 1982 paper.
 
The [[Spin glass#Sherrington–Kirkpatrick model|Sherrington–Kirkpatrick model]] of spin glass, published in 1975,<ref>{{Cite journal |last1=Sherrington |first1=David |last2=Kirkpatrick |first2=Scott |date=1975-12-29 |title=Solvable Model of a Spin-Glass |url=https://link.aps.org/doi/10.1103/PhysRevLett.35.1792 |journal=Physical Review Letters |volume=35 |issue=26 |pages=1792–1796 |bibcode=1975PhRvL..35.1792S |doi=10.1103/PhysRevLett.35.1792 |issn=0031-9007|url-access=subscription }}</ref> is the Hopfield network with random initialization. Sherrington and Kirkpatrick found that it is highly likely for the energy function of the SK model to have many local minima. In the 1982 paper, Hopfield applied this recently developed theory to study the Hopfield network with binary activation functions.<ref name="Hopfield1982">{{cite journal |last1=Hopfield |first1=J. J. |date=1982 |title=Neural networks and physical systems with emergent collective computational abilities |journal=Proceedings of the National Academy of Sciences |volume=79 |issue=8 |pages=2554–2558 |bibcode=1982PNAS...79.2554H |doi=10.1073/pnas.79.8.2554 |pmc=346238 |pmid=6953413 |doi-access=free}}</ref> In a 1984 paper he extended this to continuous activation functions.<ref name=":03">{{cite journal |last1=Hopfield |first1=J. J. |date=1984 |title=Neurons with graded response have collective computational properties like those of two-state neurons |journal=Proceedings of the National Academy of Sciences |volume=81 |issue=10 |pages=3088–3092 |bibcode=1984PNAS...81.3088H |doi=10.1073/pnas.81.10.3088 |pmc=345226 |pmid=6587342 |doi-access=free}}</ref> It became a standard model for the study of neural networks through statistical mechanics.<ref>{{Cite book |last1=Engel |first1=A. |title=Statistical mechanics of learning |last2=Broeck |first2=C. van den |date=2001 |publisher=Cambridge University Press |isbn=978-0-521-77307-2 |___location=Cambridge, UK; New York, NY}}</ref><ref>{{Cite journal |last1=Seung |first1=H. S. |last2=Sompolinsky |first2=H. |last3=Tishby |first3=N. |date=1992-04-01 |title=Statistical mechanics of learning from examples |url=https://journals.aps.org/pra/abstract/10.1103/PhysRevA.45.6056 |journal=Physical Review A |volume=45 |issue=8 |pages=6056–6091 |bibcode=1992PhRvA..45.6056S |doi=10.1103/PhysRevA.45.6056 |pmid=9907706|url-access=subscription }}</ref>
 
===Sea ice===
==== Renormalization ====
 
When there is no external field, we can derive a functional equation satisfied by <math>f(\beta, 0) = f(\beta)</math> using renormalization.<ref>{{Cite journal |last1=Maris |first1=Humphrey J. |last2=Kadanoff |first2=Leo P. |date=June 1978 |title=Teaching the renormalization group |url=https://pubs.aip.org/aapt/ajp/article/46/6/652-657/1045608 |journal=American Journal of Physics |language=en |volume=46 |issue=6 |pages=652–657 |doi=10.1119/1.11224 |bibcode=1978AmJPh..46..652M |issn=0002-9505|url-access=subscription }}</ref> Specifically, let <math>Z_N(\beta, J)</math> be the partition function with <math>N</math> sites. Grouping the even-numbered spins, we have<math display="block">Z_N(\beta, J) = \sum_{\sigma} e^{K \sigma_2(\sigma_1 + \sigma_3)}e^{K \sigma_4(\sigma_3 + \sigma_5)}\cdots</math>where <math>K := \beta J</math>. Summing over each of <math>\sigma_2, \sigma_4, \cdots</math> gives<math display="block">Z_N(\beta, J) = \sum_{\sigma} (2\cosh(K(\sigma_1 + \sigma_3))) \cdot (2\cosh(K(\sigma_3 + \sigma_5))) \cdots</math>Since the cosh function is even, we can solve <math>Ae^{K'\sigma_1\sigma_3} = 2\cosh(K(\sigma_1+\sigma_3))</math> with <math display="inline">A = 2\sqrt{\cosh(2K)}, K' = \frac 12 \ln\cosh(2K)</math>. This yields a self-similarity relation:<math display="block">\frac 1N \ln Z_N(K) = \frac 12 \ln\left(2\sqrt{\cosh(2K)}\right) + \frac 12 \frac{1}{N/2} \ln Z_{N/2}(K')</math>Taking the limit <math>N \to \infty</math>, we obtain<math display="block">f(\beta) = \frac 12 \ln\left(2\sqrt{\cosh(2K)}\right) + \frac 12 f(\beta')</math>where <math>\beta' J = \frac 12 \ln\cosh(2\beta J)</math>.
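
The constants <math>A</math> and <math>K'</math> follow from evaluating both sides at the two possible values of the product <math>\sigma_1\sigma_3</math>:<math display="block">\sigma_1 = \sigma_3:\quad Ae^{K'} = 2\cosh(2K), \qquad\qquad \sigma_1 = -\sigma_3:\quad Ae^{-K'} = 2.</math>Multiplying and dividing these two equations gives <math>A^2 = 4\cosh(2K)</math> and <math>e^{2K'} = \cosh(2K)</math>, which are the values quoted above.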
 
When <math>\beta</math> is small, we have <math>f(\beta)\approx \ln 2</math>, so we can numerically evaluate <math>f(\beta)</math> by iterating the functional equation until <math>K</math> is small.
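
The iteration can be carried out directly; the following short routine is an illustrative sketch (the function name, tolerance, and example value of <math>K</math> are arbitrary choices, not taken from the references above). It accumulates the constant term of the functional equation with weight <math>1/2^n</math> at the <math>n</math>-th step and uses <math>f \approx \ln 2</math> once <math>K</math> has become negligible; the result can be compared with the known one-dimensional answer <math>f(\beta) = \ln(2\cosh(\beta J))</math>.
<syntaxhighlight lang="python">
import math

def f_per_site(K, tol=1e-12):
    """Free energy per site f = lim (1/N) ln Z_N of the zero-field 1D Ising model,
    estimated by iterating the decimation recursion K' = (1/2) ln cosh(2K)."""
    total, weight = 0.0, 1.0
    while K > tol:
        # constant term of the functional equation, weighted by (1/2)^n
        total += weight * 0.5 * math.log(2.0 * math.sqrt(math.cosh(2.0 * K)))
        K = 0.5 * math.log(math.cosh(2.0 * K))   # renormalized coupling
        weight *= 0.5
    return total + weight * math.log(2.0)        # remaining f(K ~ 0) ~ ln 2

K = 1.3                                  # example value of K = beta*J
print(f_per_site(K))                     # iterative estimate
print(math.log(2.0 * math.cosh(K)))      # exact 1D result ln(2 cosh K)
</syntaxhighlight>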
* {{Citation | last = Lenz | first = W. | author-link = Wilhelm Lenz | year = 1920 | title = Beiträge zum Verständnis der magnetischen Eigenschaften in festen Körpern | journal = Physikalische Zeitschrift | volume = 21 | pages = 613–615 }}
* {{Citation | last1=McCoy | first1=Barry M. | last2=Wu | first2=Tai Tsun | year=1973 | title=The Two-Dimensional Ising Model | publisher=Harvard University Press | ___location=Cambridge, Massachusetts | isbn=0-674-91440-6 }}
*{{Citation | last1=Montroll | first1=Elliott W. | last2=Potts | first2=Renfrey B. | last3=Ward | first3=John C. | author-link3=John Clive Ward | title=Correlations and spontaneous magnetization of the two-dimensional Ising model | url=http://link.aip.org/link/?JMAPAQ%2F4%2F308%2F1 | doi=10.1063/1.1703955 | mr=0148406 | year=1963 | journal=[[Journal of Mathematical Physics]] | issn=0022-2488 | volume=4 | pages=308–322 | bibcode=1963JMP.....4..308M | issue=2 | url-status=dead | archive-url=https://archive.today/20130112095848/http://link.aip.org/link/?JMAPAQ/4/308/1 | archive-date=2013-01-12 | access-date=2009-10-25 | url-access=subscription }}
*{{Citation | last1=Onsager | first1=Lars | author-link1= Lars Onsager|title=Crystal statistics. I. A two-dimensional model with an order-disorder transition | doi=10.1103/PhysRev.65.117 | mr=0010315 | year=1944 | journal= Physical Review | series = Series II | volume=65 | pages=117–149|bibcode = 1944PhRv...65..117O | issue=3–4 }}
*{{Citation |last=Onsager |first=Lars |author-link=Lars Onsager|title=Discussion|journal=Supplemento al Nuovo Cimento | volume=6|page=261|year=1949}}