{{Short description|A paradigm in machine learning}}
'''Unsupervised learning''' is a paradigm in [[machine learning]] where, in contrast to [[supervised learning]], algorithms learn patterns exclusively from unlabeled data.
Other methods in the supervision spectrum are [[Reinforcement learning|reinforcement learning]], where the machine is given only a numerical performance score as guidance, [[Weak supervision|weak or semi-supervision]], where a small portion of the data is tagged, and [[Self-supervised learning|self-supervision]], where the supervisory signal is generated from the data itself.
== Neural networks ==
=== Energy ===
An energy function is a macroscopic measure of a network's activation state. In Boltzmann machines, it plays the role of the cost function. This analogy with physics is inspired by Ludwig Boltzmann's analysis of a gas's macroscopic energy from the microscopic probabilities of particle motion <math>p \propto e^{-E/kT}</math>, where k is the Boltzmann constant and T is temperature. In the [[Restricted Boltzmann machine|RBM]] network the relation is <math> p = e^{-E} / Z </math>,<ref name="Hinton2010" /> where <math>p</math> and <math>E</math> vary over every possible activation pattern and <math>\textstyle{Z = \sum_{\scriptscriptstyle{\text{All Patterns}}} e^{-E(\text{pattern})}}</math>. More precisely, <math>p(a) = e^{-E(a)} / Z</math>, where <math>a</math> is an activation pattern of all neurons (visible and hidden). Hence, some early neural networks bear the name Boltzmann machine. Paul Smolensky calls <math>-E\,</math> the ''Harmony''; a network seeks low energy, which is high Harmony.
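The relation <math>p(a) = e^{-E(a)} / Z</math> can be checked by brute force on a network small enough to enumerate. The following sketch is illustrative only: the tiny RBM, its random weights, and names such as <code>bias_v</code> and <code>bias_h</code> are assumptions, not part of the article. It lists every activation pattern of a 3-visible, 2-hidden RBM, computes its energy, and normalizes by the partition function <math>Z</math>.

<syntaxhighlight lang="python">
import itertools
import numpy as np

# Toy RBM with 3 visible and 2 hidden binary units. The weights and
# biases are random placeholders (an assumption for illustration);
# in a trained network they would be learned from data.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(3, 2))   # visible-hidden couplings w_ij
bias_v = rng.normal(scale=0.5, size=3)   # visible biases
bias_h = rng.normal(scale=0.5, size=2)   # hidden biases

def energy(v, h):
    """Standard RBM energy: E(v, h) = -bias_v.v - bias_h.h - v.W.h."""
    return -(bias_v @ v) - (bias_h @ h) - (v @ W @ h)

# Enumerate every activation pattern (v, h) of all neurons and
# accumulate the partition function Z = sum over patterns of exp(-E).
patterns = [(np.array(v), np.array(h))
            for v in itertools.product([0, 1], repeat=3)
            for h in itertools.product([0, 1], repeat=2)]
Z = sum(np.exp(-energy(v, h)) for v, h in patterns)

# p(pattern) = exp(-E(pattern)) / Z; the probabilities sum to 1, and
# low-energy (high-Harmony) patterns are the most probable.
for v, h in patterns:
    p = np.exp(-energy(v, h)) / Z
    print(v, h, f"E = {energy(v, h):+.3f}, p = {p:.4f}")
</syntaxhighlight>

Running the sketch shows the defining property of the distribution: the probabilities over all patterns sum to one, and the lowest-energy patterns receive the highest probability.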
=== Networks ===