A '''probabilistic neural network''' ('''PNN''')<ref name="pnn-book">{{cite book |last1=Mohebali |first1=Behshad |last2=Tahmassebi |first2=Amirhessam |last3=Meyer-Baese |first3=Anke |last4=Gandomi |first4=Amir H. |title=Probabilistic neural networks: a brief overview of theory, implementation, and application |date=2020 |publisher=Elsevier |pages=347–367 |doi=10.1016/B978-0-12-816514-0.00014-X |s2cid=208119250 }}</ref> is a [[feedforward neural network]], which is widely used in classification and pattern recognition problems. In the PNN algorithm, the parent probability distribution function (PDF) of each class is approximated by a [[Kernel density estimation|Parzen window]] and a non-parametric function. Then, using the PDF of each class, the class probability of a new input is estimated, and Bayes' rule is employed to allocate the new input to the class with the highest posterior probability. This minimizes the probability of misclassification.<ref>{{Cite journal|url=https://www.researchgate.net/publication/312519997|title=Competitive probabilistic neural network|year=2017|doi=10.3233/ICA-170540|last1=Zeinali|first1=Yasha|last2=Story|first2=Brett A.|journal=Integrated Computer-Aided Engineering|volume=24|issue=2|pages=105–118}}</ref> This type of [[artificial neural network]] (ANN) was derived from the [[Bayesian network]]<ref>{{cite web |url=http://herselfsai.com/2007/03/probabilistic-neural-networks.html |title=Probabilistic Neural Networks |access-date=2012-03-22 |url-status=dead |archive-url=https://web.archive.org/web/20101218121158/http://herselfsai.com/2007/03/probabilistic-neural-networks.html |archive-date=2010-12-18 }}</ref> and a statistical algorithm called [[Kernel Fisher discriminant analysis]].<ref>{{cite web |url=http://www.psi.toronto.edu/~vincent/research/presentations/PNN.pdf |title=Archived copy |access-date=2012-03-22 |url-status=dead 
|archive-url=https://web.archive.org/web/20120131053940/http://www.psi.toronto.edu/~vincent/research/presentations/PNN.pdf |archive-date=2012-01-31 }}</ref> It was introduced by D.F. Specht in 1966.<ref>{{Cite journal|last=Specht|first=D. F.|date=1967-06-01|title=Generation of Polynomial Discriminant Functions for Pattern Recognition|journal=IEEE Transactions on Electronic Computers|volume=EC-16|issue=3|pages=308–319|doi=10.1109/PGEC.1967.264667|issn=0367-7508}}</ref><ref name=Specht1990>{{Cite journal | last1 = Specht | first1 = D. F. | doi = 10.1016/0893-6080(90)90049-Q | title = Probabilistic neural networks | journal = Neural Networks | volume = 3 | pages = 109–118 | year = 1990 }}</ref> In a PNN, the operations are organized into a multilayered feedforward network with four layers:
* Input layer
* Hidden layer
* Pattern layer
* Output layer
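The idea described in the lead — a Parzen-window estimate of each class PDF, followed by Bayes' rule — can be sketched in a few lines of NumPy. The Gaussian kernel and the smoothing parameter <code>sigma</code> below are illustrative choices, not a fixed part of the method:

```python
import numpy as np

def parzen_density(x, samples, sigma=0.5):
    """Parzen-window (Gaussian kernel) estimate of a class PDF at point x."""
    d = samples.shape[1]
    sq_dists = np.sum((samples - x) ** 2, axis=1)   # squared Euclidean distances
    norm = (2 * np.pi * sigma ** 2) ** (d / 2)      # Gaussian normalizing constant
    return np.mean(np.exp(-sq_dists / (2 * sigma ** 2))) / norm

def classify(x, classes, priors=None):
    """Bayes' rule: pick the class with the highest posterior probability.

    `classes` maps each label to an (n, d) array of training samples.
    """
    labels = list(classes)
    if priors is None:
        # Default to empirical priors proportional to class sample counts.
        total = sum(len(classes[c]) for c in labels)
        priors = {c: len(classes[c]) / total for c in labels}
    posteriors = {c: priors[c] * parzen_density(x, classes[c]) for c in labels}
    return max(posteriors, key=posteriors.get)
```

A new point is assigned to whichever class has the larger prior-weighted density at that point, which is exactly the decision rule that minimizes misclassification probability under the estimated PDFs.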
PNN is often used in classification problems.<ref>http://www.mathworks.in/help/toolbox/nnet/ug/bss38ji-1.html</ref> When an input is presented, the first layer computes the distance from the input vector to each of the training input vectors. This produces a vector whose elements indicate how close the input is to each training input. The second layer sums the contributions for each class of inputs and produces its net output as a vector of probabilities. Finally, a ''compete'' transfer function on the output of the second layer picks the maximum of these probabilities, and produces a 1 for that class and a 0 for the other classes.

==Layers==

===Input layer===
Each neuron in the input layer represents a predictor variable. For categorical variables, ''N''&nbsp;−&nbsp;1 neurons are used when there are ''N'' categories. The input layer standardizes the range of the values by subtracting the median and dividing by the [[interquartile range]]. The input neurons then feed the values to each of the neurons in the hidden layer.

===Hidden layer===
This layer contains one neuron for each case in the training data set, storing the values of the predictor variables for that case along with its target value. A hidden neuron computes the distance of the test case from the neuron's center point and then applies the radial basis function kernel using a smoothing parameter (sigma).

===Pattern layer===
In a PNN there is one pattern neuron for each category of the target variable. The actual target category of each training case is stored with each hidden neuron; the weighted value coming out of a hidden neuron is fed only to the pattern neuron that corresponds to the hidden neuron's category. The pattern neurons add the values for the class they represent.
===Output layer===
The output layer compares the weighted votes for each target category accumulated in the pattern layer and uses the largest vote to predict the target category.
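The four layers above can be sketched as a minimal NumPy implementation. This is an illustrative sketch, not a reference implementation: the Gaussian kernel and a single smoothing parameter <code>sigma</code> shared by all hidden neurons are simplifying assumptions:

```python
import numpy as np

class PNN:
    """Minimal four-layer PNN sketch: input scaling, hidden layer (one
    neuron per training case), pattern layer (per-class summation),
    output layer (largest vote wins)."""

    def __init__(self, X, y, sigma=0.5):
        X = np.asarray(X, dtype=float)
        # Input layer: standardize by subtracting the median and
        # dividing by the interquartile range.
        self.med = np.median(X, axis=0)
        q75, q25 = np.percentile(X, [75, 25], axis=0)
        self.iqr = np.where(q75 - q25 > 0, q75 - q25, 1.0)
        self.X = (X - self.med) / self.iqr      # hidden-neuron centers
        self.y = np.asarray(y)                  # stored target categories
        self.classes = np.unique(self.y)
        self.sigma = sigma

    def predict(self, x):
        x = (np.asarray(x, dtype=float) - self.med) / self.iqr
        # Hidden layer: RBF kernel of the Euclidean distance from the
        # test case to each training case.
        act = np.exp(-np.sum((self.X - x) ** 2, axis=1) / (2 * self.sigma ** 2))
        # Pattern layer: sum the activations belonging to each category.
        votes = np.array([act[self.y == c].sum() for c in self.classes])
        # Output layer: predict the category with the largest vote.
        return self.classes[np.argmax(votes)]
```

Note that "training" here is only storing the cases; all the work happens at prediction time, which is why PNNs train quickly but classify slowly and need memory proportional to the training set (see the advantages and disadvantages below).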
== Advantages ==
There are several advantages and disadvantages to using a PNN instead of a [[multilayer perceptron]].<ref>{{cite web |url=http://www.dtreg.com/pnn.htm |title=Probabilistic and General Regression Neural Networks |access-date=2012-03-22 |url-status=dead |archive-url=https://web.archive.org/web/20120302075157/http://www.dtreg.com/pnn.htm |archive-date=2012-03-02 }}</ref>
* PNNs are much faster to train than multilayer perceptron networks.
* PNNs can be more accurate than multilayer perceptron networks.
* PNNs are relatively insensitive to outliers.
* PNNs generate accurate predicted target probability scores.
* PNNs approach Bayes optimal classification.
== Disadvantages ==
* PNNs are slower than multilayer perceptron networks at classifying new cases.
* PNNs require more memory space to store the model.
==Applications based on PNN==
* Application of probabilistic neural networks in modelling structural deterioration of stormwater pipes.<ref>{{cite journal |last1=Tran |first1=D. H. |last2=Ng |first2=A. W. M. |last3=Perera |first3=B. J. C. |last4=Burn |first4=S. |last5=Davis |first5=P. |title=Application of probabilistic neural networks in modelling structural deterioration of stormwater pipes |journal=Urban Water Journal |date=September 2006 |volume=3 |issue=3 |pages=175–184 |doi=10.1080/15730620600961684 |bibcode=2006UrbWJ...3..175T |s2cid=15220500 |url=http://vuir.vu.edu.au/583/1/UrbanWater-Dung.pdf|archive-url=https://web.archive.org/web/20170808222146/http://vuir.vu.edu.au/583/1/UrbanWater-Dung.pdf|archive-date=8 August 2017 |access-date=27 February 2023}}</ref>
* Probabilistic neural network method for the diagnosis of gastric endoscope samples based on FTIR spectroscopy.
* Application of probabilistic neural networks to population pharmacokinetics.<ref>{{Cite book | doi=10.1109/IJCNN.2003.1223983| isbn=0-7803-7898-9| chapter=Application of probabilistic neural networks to population pharmacokineties| title=Proceedings of the International Joint Conference on Neural Networks, 2003| year=2003| last1=Berno| first1=E.| last2=Brambilla| first2=L.| last3=Canaparo| first3=R.| last4=Casale| first4=F.| last5=Costa| first5=M.| last6=Della Pepa| first6=C.| last7=Eandi| first7=M.| last8=Pasero| first8=E.| pages=2637–2642| s2cid=60477107}}</ref>
* Probabilistic Neural Networks to the Class Prediction of Leukemia and Embryonal Tumor of Central Nervous System.<ref>{{Cite journal|url=http://dl.acm.org/citation.cfm?id=1011984|doi = 10.1023/B:NEPL.0000035613.51734.48|title = Application of Probabilistic Neural Networks to the Class Prediction of Leukemia and Embryonal Tumor of Central Nervous System|year = 2004|last1 = Huang|first1 = Chenn-Jung|last2 = Liao|first2 = Wei-Chen|journal = Neural Processing Letters|volume = 19|issue = 3|pages = 211–226|s2cid = 5651402|url-access = subscription}}</ref>
* Ship identification using probabilistic neural networks.<ref>{{cite journal |last1=Araghi |first1=Leila Fallah |last2=Khaloozadeh |first2=Hamid |last3=Arvan |first3=Mohammad Reza |title=Ship Identification Using Probabilistic Neural Networks (PNN) |journal=Proceedings of the International MultiConference of Engineers and Computer Scientists |date=19 March 2009 |volume=2 |url=https://www.iaeng.org/publication/IMECS2009/IMECS2009_pp1291-1294.pdf |access-date=27 February 2023 |___location=[[Hong Kong]], China |language=en}}</ref>
* Probabilistic Neural Network-Based sensor configuration management in a wireless ''ad hoc'' network.<ref>{{Cite web |url=http://www.ll.mit.edu/asap/asap_04/DAY2/27_PA_STEVENS.PDF |title=Archived copy |access-date=2012-03-22 |archive-url=https://web.archive.org/web/20100614171621/http://www.ll.mit.edu/asap/asap_04/DAY2/27_PA_STEVENS.PDF |archive-date=2010-06-14 |url-status=dead }}</ref>
* Remote-sensing Image Classification.<ref>{{cite journal|last1=Zhang|first1=Y.|title=Remote-sensing Image Classification Based on an Improved Probabilistic Neural Network|journal=Sensors|date=2009|volume=9|issue=9|pages=7516–7539|doi=10.3390/s90907516|pmid=22400006|pmc=3290485|bibcode=2009Senso...9.7516Z|doi-access=free}}</ref>
== References ==
{{Reflist}}
[[Category:Neural network architectures]]