Probabilistic neural network

 
==Layers of PNN==
 
PNNs are often used in classification problems.<ref>http://www.mathworks.in/help/toolbox/nnet/ug/bss38ji-1.html</ref> When an input is presented, the first layer computes the distance from the input vector to each of the training input vectors, producing a vector whose elements indicate how close the input is to each training input. The second layer sums the contributions for each class of inputs and produces its net output as a vector of probabilities. Finally, a compete transfer function applied to the output of the second layer picks the maximum of these probabilities, producing a 1 (positive identification) for that class and a 0 (negative identification) for the non-targeted classes.
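The three steps above (distance layer, class summation, compete transfer) can be sketched in Python as follows. This is a minimal illustration, not a reference implementation; the function name, the Gaussian kernel, and the smoothing parameter ''sigma'' are assumptions not specified in this section.

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.5):
    """Classify one input vector with a simple PNN sketch.

    x:        (n_features,) input vector
    train_X:  (n_samples, n_features) training inputs
    train_y:  (n_samples,) integer class labels
    sigma:    assumed Gaussian smoothing parameter
    """
    # First layer: distance from the input to every training vector,
    # passed through a Gaussian radial basis function.
    dists = np.linalg.norm(train_X - x, axis=1)
    activations = np.exp(-(dists ** 2) / (2 * sigma ** 2))

    # Second layer: sum the contributions for each class and
    # normalise into a vector of probabilities.
    classes = np.unique(train_y)
    sums = np.array([activations[train_y == c].sum() for c in classes])
    probs = sums / sums.sum()

    # Compete transfer function: 1 for the most probable class, 0 elsewhere.
    out = np.zeros_like(probs)
    out[np.argmax(probs)] = 1.0
    return classes, probs, out
```

For example, with two well-separated training classes, an input near the first class yields an output vector of `[1, 0]`.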
 
===Input layer===
 
===Summation layer===
In a PNN there is one pattern neuron for each category of the target variable. The target category of each training case is stored with its hidden neuron; the weighted value coming out of a hidden neuron is fed only to the pattern neuron that corresponds to that hidden neuron's category. The pattern neurons add the values for the class they represent.
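The class-wise accumulation described here can be sketched as a grouped sum, where each hidden neuron's output is routed to the summation neuron of its stored category. The hidden-layer values and labels below are illustrative, not from the article:

```python
import numpy as np

# Hypothetical outputs of the hidden layer for one input, together with
# the target category stored with each hidden neuron.
hidden_out = np.array([0.9, 0.7, 0.1, 0.2, 0.05])
hidden_class = np.array([0, 0, 1, 1, 2])

# Each hidden neuron feeds only the pattern (summation) neuron of its
# own class; np.add.at accumulates values into the matching class slot.
n_classes = 3
class_sums = np.zeros(n_classes)
np.add.at(class_sums, hidden_class, hidden_out)
```

Here `class_sums` holds one accumulated value per category, ready to be normalised into probabilities by the next stage.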
 
===Output layer===