Neural modeling fields: Difference between revisions

A psychological interpretation of maximizing the similarity L is the [[Perlovsky|knowledge instinct]], a drive to learn, to maximize knowledge.
 
==Dynamic logic, Mathematical description==
The learning process consists of estimating model parameters S and associating signals with concepts by maximizing the similarity L. Note that all possible combinations of signals and models are accounted for in the expression for L. This can be seen by expanding the sum and multiplying all the terms; the result is M<sup>N</sup> items, a huge number. This is the number of combinations between all signals (N) and all models (M), and it is the source of the combinatorial complexity of many algorithms used in the past. For example, a popular multiple hypothesis testing algorithm<ref>Singer, R.A., Sea, R.G. and Housewright, R.B. (1974). Derivation and Evaluation of Improved Tracking Filters for Use in Dense Multitarget Environments, IEEE Transactions on Information Theory, IT-20, pp. 423-432.</ref> attempts to maximize similarity L over model parameters and associations between signals and models in two steps. First it takes one of the M<sup>N</sup> items, that is, one particular association between signals and models, and maximizes it over model parameters. Second, the largest item is selected (the best association for the best set of parameters). Such a program inevitably faces a wall of combinatorial complexity: the number of computations is on the order of M<sup>N</sup>.
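As an illustration (not from the original text), the M<sup>N</sup> growth of exhaustive association search can be checked directly; the function name below is hypothetical:

```python
def num_associations(n_signals: int, n_models: int) -> int:
    """Each of N signals can be associated with any of M models,
    giving M**N candidate signal-to-model associations."""
    return n_models ** n_signals

# Exhaustive evaluation is hopeless even for modest problem sizes:
print(num_associations(10, 5))   # 9765625 associations for N=10, M=5
print(num_associations(30, 10))  # 10**30 associations for N=30, M=10
```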
 
NMF solves this problem by using [[dynamic logic (neural)|dynamic logic]]<ref>Perlovsky, L.I. (1996). Mathematical Concepts of Intellect. Proc. World Congress on Neural Networks, San Diego, CA; Lawrence Erlbaum Associates, NJ, pp. 1013-16.</ref><ref>Perlovsky, L.I. (1997). Physical Concepts of Intellect. Proc. Russian Academy of Sciences, 354(3), pp. 320-323.</ref>. An important aspect of dynamic logic is ''matching vagueness or fuzziness of similarity measures to the uncertainty of models''. Initially, parameter values are not known and the uncertainty of the models is high; so is the fuzziness of the similarity measures. In the process of learning, models become more accurate and the similarity measures crisper, and the value of the similarity increases. This is the mechanism of dynamic logic. The mathematics of dynamic logic is described in a separate article.
 
Dynamic logic maximizes similarity L, associates bottom-up signals X with top-down signals M, and finds values of parameters S<sub>m</sub>, without combinatorial complexity, as follows.
 
First, assign any values to unknown parameters, {S<sub>m</sub>}. Then, compute association variables f(m|n),
 
f(m|n) = r(m) l(X(n)|m) / &sum;<sub>m'=1..M</sub> r(m') l(X(n)|m').
 
The equation for f(m|n) looks like the Bayes formula for a posteriori probabilities; if, as a result of learning, l(n|m) become conditional likelihoods, then f(m|n) become Bayesian probabilities for signal n originating from object m. The dynamic logic of the Modeling Fields (MF) is defined as follows:
 
 
df(m|n)/dt = f(m|n) &sum;<sub>m'=1..M</sub> {[&delta;<sub>mm'</sub> - f(m'|n)] [&part;ln l(n|m')/&part;M<sub>m'</sub>] &part;M<sub>m'</sub>/&part;S<sub>m'</sub> dS<sub>m'</sub>/dt},
 
dS<sub>m</sub>/dt = &sum;<sub>n=1..N</sub> f(m|n) [&part;ln l(n|m)/&part;M<sub>m</sub>] &part;M<sub>m</sub>/&part;S<sub>m</sub>.
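A minimal sketch of computing the association variables f(m|n), assuming one-dimensional Gaussian conditional likelihoods l(X(n)|m) (an illustrative choice; the function names are hypothetical, not from the source):

```python
import math

def gaussian(x, mean, sigma):
    """1-D Gaussian pdf, standing in for the conditional likelihood l(X(n)|m)."""
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2) / (math.sqrt(2 * math.pi) * sigma)

def association_variables(X, means, sigmas, r):
    """f(m|n) = r(m) l(X(n)|m) / sum over m' of r(m') l(X(n)|m')."""
    f = []
    for x in X:
        weighted = [r[m] * gaussian(x, means[m], sigmas[m]) for m in range(len(means))]
        total = sum(weighted)
        f.append([w / total for w in weighted])
    return f

f = association_variables(X=[0.1, 1.9], means=[0.0, 2.0], sigmas=[1.0, 1.0], r=[0.5, 0.5])
# Each row of f sums to 1; the signal at 1.9 associates mostly with the model at 2.0.
```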
 
 
The following theorem was proved:
 
''Theorem''. The previous equations define a convergent dynamic NMF system with stationary states defined by max<sub>{S<sub>m</sub>}</sub> L.
 
It follows that the stationary states of an MF system are the maximum-similarity states satisfying the knowledge instinct. When partial similarities are specified as probability density functions (pdf), or likelihoods, the stationary values of the parameters {S<sub>m</sub>} are asymptotically unbiased and efficient estimates of these parameters<ref>Cramer, H. (1946). Mathematical Methods of Statistics, Princeton University Press, Princeton NJ.</ref>. The computational complexity of dynamic logic is linear in N.
 
In practice, when solving the equations through successive iterations, f(m|n) can be recomputed at every iteration step using the first formula above, rather than the incremental (differential) formula.
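The iterative procedure can be sketched as follows, again assuming Gaussian models whose only parameters are their means. The linear vagueness schedule for sigma (from vague to crisp) is an illustrative assumption, not the article's exact prescription:

```python
import math

def gaussian(x, mean, sigma):
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2) / (math.sqrt(2 * math.pi) * sigma)

def dynamic_logic(X, means, r, sigma0=5.0, sigma_min=0.5, iters=50):
    """Alternate between recomputing f(m|n) and updating the model parameters
    (here, Gaussian means S_m), while shrinking the model vagueness sigma."""
    means = list(means)
    for it in range(iters):
        sigma = max(sigma_min, sigma0 * (1.0 - it / iters))  # vague -> crisp
        # Recompute the association variables f(m|n) at every iteration step.
        f = []
        for x in X:
            w = [r[m] * gaussian(x, means[m], sigma) for m in range(len(means))]
            total = sum(w)
            f.append([wi / total for wi in w])
        # Update each mean to the f-weighted average of its associated signals,
        # which maximizes the similarity for Gaussian models.
        for m in range(len(means)):
            den = sum(f[n][m] for n in range(len(X)))
            means[m] = sum(f[n][m] * X[n] for n in range(len(X))) / den
    return means

# Two clusters of signals; the two models converge to the cluster centers.
print(dynamic_logic([0.0, 0.2, -0.1, 5.0, 5.2, 4.9], means=[1.0, 3.0], r=[0.5, 0.5]))
```

The cost per iteration is proportional to N·M, so, in line with the theorem, the overall procedure is linear in the number of signals N rather than combinatorial.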
 
The proof of the above theorem contains a proof that the similarity L increases at each iteration. This has a psychological interpretation: the knowledge instinct is satisfied at each step, the corresponding emotion is positive, and NMF-dynamic logic emotionally enjoys learning.
 
==Example of Dynamic Logic Operations==