==Dynamic logic==
The learning process consists of estimating model parameters S and associating signals with concepts by maximizing the similarity L. Note that all possible combinations of signals and models are accounted for in the expression for L. This can be seen by expanding the sum and multiplying all the terms; the result would contain M<sup>N</sup> items, a huge number. This is the number of combinations between all signals (N) and all models (M), and it is the source of combinatorial complexity.
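The scale of this combinatorial explosion can be illustrated with a small calculation (the sizes M and N below are assumed for illustration, not taken from the article):

```python
# Associating each of N signals with one of M models gives M**N candidate
# combinations, which grows far too fast for exhaustive enumeration.
M = 10    # number of models (assumed example size)
N = 100   # number of signals (assumed example size)
combinations = M ** N
print(combinations == 10 ** 100)  # a googol of combinations
```

Even these modest values of M and N already yield 10<sup>100</sup> combinations, which is why any method that enumerates signal-to-model assignments explicitly is intractable.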
NMF solves this problem by using [[dynamic logic (neural)|dynamic logic]].<ref>Perlovsky, L.I. (1996). Mathematical Concepts of Intellect. Proc. World Congress on Neural Networks, San Diego, CA; Lawrence Erlbaum Associates, NJ, pp. 1013-16.</ref><ref>Perlovsky, L.I. (1997). Physical Concepts of Intellect. Proc. Russian Academy of Sciences, 354(3), pp. 320-323.</ref> An important aspect of dynamic logic is ''matching the vagueness or fuzziness of similarity measures to the uncertainty of the models''. Initially, parameter values are unknown and the uncertainty of the models is high; so is the fuzziness of the similarity measures. As learning proceeds, the models become more accurate, the similarity measure becomes crisper, and the value of the similarity increases. This is the mechanism of dynamic logic. The mathematics of dynamic logic is described in a separate article.
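The vague-to-crisp progression can be sketched numerically. The following is a minimal illustration of the idea under simplifying assumptions (two Gaussian models, a shared uncertainty parameter annealed on a fixed schedule), not the exact NMF algorithm: association weights start fuzzy because the uncertainty is large, and sharpen as the model parameters converge, so no hard combinatorial assignment is ever enumerated.

```python
import numpy as np

rng = np.random.default_rng(0)
# N = 100 signals generated by two hidden sources at -2 and 3 (assumed data).
signals = np.concatenate([rng.normal(-2.0, 0.3, 50),
                          rng.normal(3.0, 0.3, 50)])
means = np.array([-0.5, 0.5])   # M = 2 models, poor initial parameter guess
sigma = 5.0                     # high initial uncertainty -> fuzzy similarity

for _ in range(30):
    # Conditional similarities l(n|m): Gaussian likelihood of each signal
    # under each model, with the current shared uncertainty sigma.
    l = np.exp(-(signals[None, :] - means[:, None]) ** 2 / (2 * sigma ** 2))
    # Association weights f(m|n): every signal is softly shared by all
    # models, instead of being hard-assigned to one of them.
    f = l / l.sum(axis=0, keepdims=True)
    # Re-estimate each model parameter from its weighted signals.
    means = (f * signals).sum(axis=1) / f.sum(axis=1)
    # Anneal the fuzziness: vague early, crisp late (the dynamic-logic step).
    sigma = max(0.3, sigma * 0.85)

print(np.sort(means))  # the model means approach the sources at -2 and 3
```

The cost per iteration is proportional to M&times;N rather than M<sup>N</sup>, which is the computational point of replacing hard combinatorial association with fuzzy association that is gradually sharpened.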