Unsupervised learning

{{Short description|A paradigm in machine learning}}
'''Unsupervised learning''' is a paradigm in [[machine learning]] where, in contrast to [[supervised learning]] and [[semi-supervised learning]], algorithms learn patterns exclusively from unlabeled data. The hope is that, through mimicry, which is an important mode of learning in people, the machine is forced to build a concise internal representation of its world and can then generate imaginative content from it.
 
Other methods in the spectrum of supervision include [[Reinforcement Learning]], where the machine is given only a numerical performance score as guidance; [[Weak_supervision|weak or semi-supervision]], where a small portion of the data is tagged; and [[Self-supervised_learning|self-supervision]].
== Neural networks ==
 
 
=== Energy ===
An energy function is a macroscopic measure of a network's activation state. In Boltzmann machines, it plays the role of the cost function. The analogy with physics is inspired by Ludwig Boltzmann's analysis of a gas's macroscopic energy from the microscopic probabilities of particle motion <math>p \propto e^{-E/kT}</math>, where k is the Boltzmann constant and T is temperature. In the [[Restricted_Boltzmann_machine|RBM]] network the relation is <math> p = e^{-E} / Z </math>,<ref name="Hinton2010" /> where <math>p</math> and <math>E</math> vary over every possible activation pattern and <math>\textstyle{Z = \sum_{\scriptscriptstyle{\text{All Patterns}}} e^{-E(\text{pattern})}}</math>. More precisely, <math>p(a) = e^{-E(a)} / Z</math>, where <math>a</math> is an activation pattern of all neurons (visible and hidden). Hence, some early neural networks bear the name Boltzmann machine. Paul Smolensky calls <math>-E\,</math> the ''Harmony''; a network seeks low energy, which is high Harmony.
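The relation <math>p(a) = e^{-E(a)} / Z</math> can be checked numerically on a toy network. The following Python sketch (the network size, weights, and variable names are illustrative assumptions, not taken from any published model) enumerates every joint activation pattern of a tiny RBM, computes the standard RBM energy <math>E(v,h) = -b^\top v - c^\top h - v^\top W h</math>, and verifies that the Boltzmann probabilities sum to one over all patterns:

```python
import numpy as np

# Illustrative toy RBM: 3 visible units, 2 hidden units (sizes chosen
# small enough that all 2^(3+2) = 32 activation patterns can be listed).
rng = np.random.default_rng(0)
n_visible, n_hidden = 3, 2
W = rng.normal(size=(n_visible, n_hidden))  # visible-hidden weights
b = rng.normal(size=n_visible)              # visible biases
c = rng.normal(size=n_hidden)               # hidden biases

def energy(v, h):
    """Standard RBM energy E(v, h) = -b.v - c.h - v.W.h."""
    return -(b @ v + c @ h + v @ W @ h)

# Enumerate every joint activation pattern a = (v, h) of binary neurons.
patterns = [(np.array(v), np.array(h))
            for v in np.ndindex(*(2,) * n_visible)
            for h in np.ndindex(*(2,) * n_hidden)]

# Partition function Z = sum over all patterns of e^{-E(pattern)}.
Z = sum(np.exp(-energy(v, h)) for v, h in patterns)

# Boltzmann probabilities p(a) = e^{-E(a)} / Z.
probs = [np.exp(-energy(v, h)) / Z for v, h in patterns]

print(f"sum of p over all {len(patterns)} patterns = {sum(probs):.6f}")
```

Because <math>Z</math> normalizes over every pattern, the printed sum is 1 regardless of the weights; low-energy (high-Harmony) patterns simply receive the largest share of that probability mass.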
 
=== Networks ===