Probabilistic neural network

A '''probabilistic neural network''' ('''PNN''')<ref name="pnn-book">{{cite book |last1=Mohebali |first1=Behshad |last2=Tahmassebi |first2=Amirhessam |last3=Meyer-Baese |first3=Anke |last4=Gandomi |first4=Amir H. |title=Probabilistic neural networks: a brief overview of theory, implementation, and application |date=2020 |publisher=Elsevier |pages=347–367 |doi=10.1016/B978-0-12-816514-0.00014-X |s2cid=208119250 }}</ref> is a [[feedforward neural network]] that is widely used in classification and pattern recognition problems. In the PNN algorithm, the parent probability distribution function (PDF) of each class is approximated by a [[Kernel density estimation|Parzen window]] and a non-parametric function. Using the PDF of each class, the class probability of a new input is estimated, and Bayes’ rule is then employed to assign the input to the class with the highest posterior probability. This approach minimizes the probability of misclassification (a standard formulation of the density estimate and decision rule is given below the layer list).<ref>{{Cite journal |url=https://www.researchgate.net/publication/312519997 |title=Competitive probabilistic neural network |year=2017 |doi=10.3233/ICA-170540 |last1=Zeinali |first1=Yasha |last2=Story |first2=Brett A. |journal=Integrated Computer-Aided Engineering |volume=24 |issue=2 |pages=105–118}}</ref> This type of [[artificial neural network]] (ANN) was derived from the [[Bayesian network]]<ref>{{cite web |url=http://herselfsai.com/2007/03/probabilistic-neural-networks.html |title=Probabilistic Neural Networks |access-date=2012-03-22 |url-status=dead |archive-url=https://web.archive.org/web/20101218121158/http://herselfsai.com/2007/03/probabilistic-neural-networks.html |archive-date=2010-12-18 }}</ref> and a statistical algorithm called [[Kernel Fisher discriminant analysis]].<ref>{{cite web |url=http://www.psi.toronto.edu/~vincent/research/presentations/PNN.pdf |title=Archived copy |access-date=2012-03-22 |url-status=dead |archive-url=https://web.archive.org/web/20120131053940/http://www.psi.toronto.edu/~vincent/research/presentations/PNN.pdf |archive-date=2012-01-31 }}</ref> It was introduced by D. F. Specht in 1966.<ref>{{Cite journal |last=Specht |first=D. F. |date=1967-06-01 |title=Generation of Polynomial Discriminant Functions for Pattern Recognition |journal=IEEE Transactions on Electronic Computers |volume=EC-16 |issue=3 |pages=308–319 |doi=10.1109/PGEC.1967.264667 |issn=0367-7508}}</ref><ref name=Specht1990>{{Cite journal |last1=Specht |first1=D. F. |doi=10.1016/0893-6080(90)90049-Q |title=Probabilistic neural networks |journal=Neural Networks |volume=3 |pages=109–118 |year=1990 }}</ref> In a PNN, the operations are organized into a multilayered feedforward network with four layers:
* Input layer
* Pattern layer
* Summation layer
* Output layer
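
With the Gaussian Parzen kernel commonly used in PNNs and a single shared smoothing parameter <math>\sigma</math> (per-neuron or per-variable smoothing values are also used), the estimated PDF of class <math>k</math> from its <math>n_k</math> training samples <math>x_{k,1},\dots,x_{k,n_k}\in\mathbb{R}^d</math> is

:<math>\hat{f}_k(x)=\frac{1}{n_k\,(2\pi)^{d/2}\sigma^{d}}\sum_{i=1}^{n_k}\exp\!\left(-\frac{\lVert x-x_{k,i}\rVert^{2}}{2\sigma^{2}}\right),</math>

and a new input <math>x</math> is assigned to the class <math>\hat{k}=\arg\max_k \, \pi_k\,\hat{f}_k(x)</math>, where <math>\pi_k</math> is the prior probability of class <math>k</math>.
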
== Layers ==
 
PNN is often used in classification problems.<ref>{{cite web |url=http://www.mathworks.in/help/toolbox/nnet/ug/bss38ji-1.html |title=Probabilistic Neural Networks :: Radial Basis Networks (Neural Network Toolbox™) |website=www.mathworks.in |access-date=6 June 2022 |archive-url=https://archive.today/20120804150441/http://www.mathworks.in/help/toolbox/nnet/ug/bss38ji-1.html |archive-date=4 August 2012 |url-status=dead}}</ref> When an input is presented, the first layer computes the distance from the input vector to each of the training input vectors, producing a vector whose elements indicate how close the input is to each training input. The second layer sums these contributions for each class of inputs and produces its net output as a vector of probabilities. Finally, a competitive transfer function on the output of the second layer picks the maximum of these probabilities, producing a 1 (positive identification) for that class and a 0 (negative identification) for the non-targeted classes.
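
A minimal NumPy sketch of this classification flow is given below; the function and variable names are illustrative rather than part of any standard library, a Gaussian kernel with a single shared smoothing parameter <code>sigma</code> is assumed, and equal class priors are used.

<syntaxhighlight lang="python">
import numpy as np

def pnn_predict(x, train_X, train_y, sigma=1.0):
    """Classify one input vector x with a Gaussian-kernel PNN.

    train_X : (n, d) array of stored training patterns (one pattern neuron each)
    train_y : (n,)   array of class labels for the stored patterns
    sigma   : smoothing (spread) parameter of the kernel
    """
    # Pattern layer: squared Euclidean distance to every stored pattern,
    # passed through the Gaussian (RBF) kernel.
    sq_dist = np.sum((train_X - x) ** 2, axis=1)
    activations = np.exp(-sq_dist / (2.0 * sigma ** 2))

    # Summation layer: average the kernel responses of each class, giving an
    # (unnormalised) estimate of each class-conditional density
    # (equal class priors assumed).
    classes = np.unique(train_y)
    class_scores = np.array([activations[train_y == c].mean() for c in classes])

    # Output layer: competitive choice of the class with the largest score.
    probabilities = class_scores / class_scores.sum()
    return classes[np.argmax(class_scores)], probabilities
</syntaxhighlight>

Because the model consists essentially of the stored training patterns themselves, there is no iterative training of weights, but every stored pattern must be evaluated for each new case.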
 
=== Input layer ===
Each neuron in the input layer represents a predictor variable (input feature); the input layer passes the input values to every neuron in the pattern layer.
 
=== Pattern layer ===
This layer contains one neuron for each case in the training data set. It stores the values of the predictor variables for the case along with the target value. A hidden neuron computes the [[Euclidean distance]] of the test case from the neuron's center point and then applies the [[radial basis function]] kernel using the sigma values.
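
Assuming the Gaussian form of the kernel given above, the output of the pattern neuron that stores training case <math>x_i</math>, for a test vector <math>x</math>, is

:<math>\varphi_i(x)=\exp\!\left(-\frac{\lVert x-x_i\rVert^{2}}{2\sigma_i^{2}}\right),</math>

where <math>\sigma_i</math> is the sigma (smoothing) value used by that neuron; a single <math>\sigma</math> shared by all pattern neurons is a common simplification.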
 
=== Summation layer ===
For each class, the summation layer adds the outputs of the pattern neurons belonging to that class, producing a vector of values proportional to the estimated class probabilities.

=== Output layer ===
The output layer compares the summed values for the classes and assigns the input to the class with the largest value, i.e. the highest estimated posterior probability.
 
== Advantages ==
There are several advantages and disadvantages to using a PNN instead of a [[multilayer perceptron]].<ref>{{cite web |url=http://www.dtreg.com/pnn.htm |title=Probabilistic and General Regression Neural Networks |access-date=2012-03-22 |url-status=dead |archive-url=https://web.archive.org/web/20120302075157/http://www.dtreg.com/pnn.htm |archive-date=2012-03-02 }}</ref>
* PNNs are much faster than multilayer perceptron networks.
* PNNs can be more accurate than multilayer perceptron networks.
* PNNs generate accurate predicted target probability scores.
* PNNs approach Bayes optimal classification.
 
== Disadvantages ==
* PNNs are slower than multilayer perceptron networks at classifying new cases.
* PNNs require more memory space to store the model.
 
== Applications based on PNN ==
* Modelling structural deterioration of stormwater pipes.<ref>{{cite journal |last1=Tran |first1=D. H. |last2=Ng |first2=A. W. M. |last3=Perera |first3=B. J. C. |last4=Burn |first4=S. |last5=Davis |first5=P. |title=Application of probabilistic neural networks in modelling structural deterioration of stormwater pipes |journal=Urban Water Journal |date=September 2006 |volume=3 |issue=3 |pages=175–184 |doi=10.1080/15730620600961684 |bibcode=2006UrbWJ...3..175T |s2cid=15220500 |url=http://vuir.vu.edu.au/583/1/UrbanWater-Dung.pdf |archive-url=https://web.archive.org/web/20170808222146/http://vuir.vu.edu.au/583/1/UrbanWater-Dung.pdf |archive-date=8 August 2017 |access-date=27 February 2023}}</ref>
* Diagnosis of gastric endoscope samples based on FTIR spectroscopy.<ref>{{cite journal |pmid=19810529 |volume=29 |issue=6 |title=[Application of probabilistic neural networks method to gastric endoscope samples diagnosis based on FTIR spectroscopy] |year=2009 |journal=Guang Pu Xue Yu Guang Pu Fen Xi |pages=1553–7 |last1=Li |first1=Q. B. |last2=Li |first2=X. |last3=Zhang |first3=G. J. |last4=Xu |first4=Y. Z. |last5=Wu |first5=J. G. |last6=Sun |first6=X. J.}}</ref>
* Population pharmacokinetics.<ref>{{Cite book |doi=10.1109/IJCNN.2003.1223983 |isbn=0-7803-7898-9 |chapter=Application of probabilistic neural networks to population pharmacokinetics |title=Proceedings of the International Joint Conference on Neural Networks, 2003 |year=2003 |last1=Berno |first1=E. |last2=Brambilla |first2=L. |last3=Canaparo |first3=R. |last4=Casale |first4=F. |last5=Costa |first5=M. |last6=Della Pepa |first6=C. |last7=Eandi |first7=M. |last8=Pasero |first8=E. |pages=2637–2642 |s2cid=60477107}}</ref>
* Solving various pattern classification problems.<ref>http://www.idosi.org/wasj/wasj4(6)/3.pdf</ref>
* Class prediction of leukemia and embryonal tumours of the central nervous system.<ref>{{Cite journal |url=http://dl.acm.org/citation.cfm?id=1011984 |doi=10.1023/B:NEPL.0000035613.51734.48 |title=Application of Probabilistic Neural Networks to the Class Prediction of Leukemia and Embryonal Tumor of Central Nervous System |year=2004 |last1=Huang |first1=Chenn-Jung |last2=Liao |first2=Wei-Chen |journal=Neural Processing Letters |volume=19 |issue=3 |pages=211–226 |s2cid=5651402 |url-access=subscription}}</ref>
* Ship identification.<ref>{{cite journal |last1=Araghi |first1=Leila Fallah |last2=Khaloozadeh |first2=Hamid |last3=Arvan |first3=Mohammad Reza |title=Ship Identification Using Probabilistic Neural Networks (PNN) |journal=Proceedings of the International MultiConference of Engineers and Computer Scientists |date=19 March 2009 |volume=2 |url=https://www.iaeng.org/publication/IMECS2009/IMECS2009_pp1291-1294.pdf |access-date=27 February 2023 |___location=[[Hong Kong]], China |language=en}}</ref>
* Sensor configuration management in wireless ''ad hoc'' networks.<ref>{{Cite web |url=http://www.ll.mit.edu/asap/asap_04/DAY2/27_PA_STEVENS.PDF |title=Archived copy |access-date=2012-03-22 |archive-url=https://web.archive.org/web/20100614171621/http://www.ll.mit.edu/asap/asap_04/DAY2/27_PA_STEVENS.PDF |archive-date=2010-06-14 |url-status=dead }}</ref>
* Character recognition.
* Remote-sensing image classification.<ref>{{cite journal |last1=Zhang |first1=Y. |title=Remote-sensing Image Classification Based on an Improved Probabilistic Neural Network |journal=Sensors |date=2009 |volume=9 |issue=9 |pages=7516–7539 |doi=10.3390/s90907516 |pmid=22400006 |pmc=3290485 |bibcode=2009Senso...9.7516Z |doi-access=free}}</ref>
 
== References ==
{{Reflist}}
 
[[Category:Neural network architectures]]