First, assign arbitrary initial values to the unknown parameters {'''S'''<sub>m</sub>}. Then compute the association variables f(m|n),
:<big>f(m|n) = r(m) l('''X'''(n)|m) / ∑<sub>m'=1..M</sub> r(m') l('''X'''(n)|m').</big> (3)
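For illustration, equation (3) can be computed directly. The sketch below assumes one-dimensional signals and Gaussian partial likelihoods l('''X'''(n)|m) with mean parameters and a shared, known variance; these modeling choices are illustrative assumptions, not part of the general formulation.

```python
import numpy as np

def associations(X, means, sigma, r):
    """Association variables f(m|n) per Eq. (3), assuming Gaussian
    partial likelihoods l(X(n)|m) with means M_m and shared variance."""
    d2 = (X[:, None] - means[None, :]) ** 2               # squared distances, shape (N, M)
    l = np.exp(-d2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    weighted = r[None, :] * l                             # r(m) l(X(n)|m)
    return weighted / weighted.sum(axis=1, keepdims=True) # normalize over models m

# Four signals near two hypothetical object models at 0 and 3
X = np.array([0.1, 0.2, 2.9, 3.1])
f = associations(X, means=np.array([0.0, 3.0]), sigma=0.5, r=np.array([0.5, 0.5]))
# each row of f sums to 1, so f(m|n) behaves like a posterior over models
```

Because of the normalization, each signal's associations sum to one over the models, which is what gives f(m|n) its Bayesian-posterior form.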
The equation for f(m|n) looks like the Bayes formula for a posteriori probabilities; if, as a result of learning, the l(n|m) become conditional likelihoods, the f(m|n) become Bayesian probabilities that signal n originated from object m. The dynamic logic of the Modeling Fields (MF) is defined as follows:
:<big>df(m|n)/dt = f(m|n) ∑<sub>m'=1..M</sub> {[δ<sub>mm'</sub> − f(m'|n)] [∂ln l(n|m')/∂'''M'''<sub>m'</sub>] ∂'''M'''<sub>m'</sub>/∂'''S'''<sub>m'</sub>} d'''S'''<sub>m'</sub>/dt,</big> (4)
:<big>d'''S'''<sub>m</sub>/dt = ∑<sub>n=1..N</sub> f(m|n)[∂ln l(n|m)/∂'''M'''<sub>m</sub>]∂'''M'''<sub>m</sub>/∂'''S'''<sub>m</sub>,</big> (5)
The following theorem was proved:
''Theorem''. Equations (3), (4), and (5) define a convergent dynamic MF system whose stationary states are given by max{'''S'''<sub>m</sub>} L.
It follows that the stationary states of an MF system are the maximum-similarity states satisfying the knowledge instinct. When partial similarities are specified as probability density functions (pdf), or likelihoods, the stationary values of the parameters {S<sub>m</sub>} are asymptotically unbiased and efficient estimates of these parameters<ref>Cramer, H. (1946). Mathematical Methods of Statistics, Princeton University Press, Princeton NJ.</ref>. The computational complexity of dynamic logic is linear in N.
Practically, when solving the equations through successive iterations, f(m|n) can be recomputed at every iteration using equation (3).
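The successive-iteration procedure can be sketched as follows. For the same illustrative Gaussian-mean models as above (an assumption, not the general case), the stationary point of equation (5) is the f-weighted mean of the signals, so each iteration recomputes f(m|n) from equation (3) and then moves each parameter '''S'''<sub>m</sub> to that stationary point.

```python
import numpy as np

def mf_iterate(X, means, sigma, r, steps=50):
    """Successive-iteration solution of Eqs. (3) and (5), assuming 1-D
    Gaussian models with mean parameters S_m and a fixed, known variance."""
    means = means.astype(float).copy()
    for _ in range(steps):
        d2 = (X[:, None] - means[None, :]) ** 2
        l = np.exp(-d2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
        w = r[None, :] * l
        f = w / w.sum(axis=1, keepdims=True)                  # Eq. (3)
        # stationary point of Eq. (5) for Gaussian means:
        # S_m = sum_n f(m|n) X(n) / sum_n f(m|n)
        means = (f * X[:, None]).sum(axis=0) / f.sum(axis=0)
    return means

# Signals drawn near two hypothetical objects at 0.15 and 3.0;
# initial parameter values are deliberately wrong
X = np.array([0.1, 0.2, 2.9, 3.1])
est = mf_iterate(X, means=np.array([1.0, 2.0]), sigma=0.5, r=np.array([0.5, 0.5]))
# est converges to approximately [0.15, 3.0]
```

In this Gaussian special case the iteration coincides with the familiar expectation-maximization update, and the similarity L (the log-likelihood) is non-decreasing at each step, consistent with the theorem's convergence claim.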
The proof of the above theorem includes a proof that the similarity L increases at each iteration. This has a psychological interpretation: the knowledge instinct is satisfied at each step, the corresponding emotion is positive, and the NMF dynamic-logic system emotionally enjoys learning.