Radial basis function network: Difference between revisions

|citeseerx=10.1.1.497.5646
}}</ref><ref>{{cite conference
|conference=Proceedings of the Second Joint 24th Annual Conference and the Annual Fall Meeting of the Biomedical Engineering Society
|conference-url=https://ieeexplore.ieee.org/servlet/opac?punumber=8844528
|publication-date=6 January 2003
|volume=3
|pages=2184–2185
|doi=10.1109/IEMBS.2002.1053230
|url=https://ieeexplore.ieee.org/document/1053230/?arnumber=1053230
|url-access=subscription
|access-date=25 May 2020
|title=Mahalanobis distance with radial basis function network on protein secondary structures
|journal=Engineering in Medicine and Biology Society, Proceedings of the Annual International Conference of the IEEE
|isbn=0-7803-7612-9
|issn=1094-687X
}}</ref>{{Editorializing|date=May 2020}}<!-- Was previously marked with a "citation needed" asking in what sense using Mahalanobis distance is better and why the Euclidean distance is still normally used, but I found sources to support the first part, so it's likely salvageable. -->) and the radial basis function is commonly taken to be [[Normal distribution|Gaussian]]
i.e. changing the parameters of one neuron has only a small effect on input values that are far from the center of that neuron.
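This locality can be checked numerically. The sketch below (an illustration, not from the article; the function name and parameter values are assumptions) evaluates a single Gaussian unit <math>\rho(x) = e^{-\beta (x - c)^2}</math> near and far from its center:

```python
import math

# Hypothetical single Gaussian RBF unit for scalar input x,
# with center c and width parameter beta (illustrative values).
def gaussian_rbf(x, c, beta):
    return math.exp(-beta * (x - c) ** 2)

# Near the center the unit responds strongly...
near = gaussian_rbf(1.1, 1.0, beta=4.0)

# ...but far from the center its output, and hence the effect of
# perturbing its parameters, is vanishingly small (order 1e-28 here).
far = gaussian_rbf(5.0, 1.0, beta=4.0)
```

Because `far` is effectively zero, adjusting this unit's weight or center leaves the network's output at `x = 5.0` essentially unchanged.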
 
Given certain mild conditions on the shape of the activation function, RBF networks are [[universal approximator]]s on a [[Compact space|compact]] subset of <math>\mathbb{R}^n</math>.<ref name="Park">{{cite journal|last=Park|first=J.|author2=I. W. Sandberg|s2cid=34868087|date=Summer 1991|title=Universal Approximation Using Radial-Basis-Function Networks|journal=Neural Computation|volume=3|issue=2|pages=246–257|doi=10.1162/neco.1991.3.2.246|pmid=31167308}}</ref> This means that an RBF network with enough hidden neurons can approximate any continuous function on a closed, bounded set with arbitrary precision.
 
The parameters <math> a_i </math>, <math> \mathbf{c}_i </math>, and <math> \beta_i </math> are determined in a manner that optimizes the fit between <math> \varphi </math> and the data.
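One common simplification, sketched below, is to fix the centers <math> \mathbf{c}_i </math> and widths <math> \beta_i </math> in advance; the output weights <math> a_i </math> then enter <math> \varphi </math> linearly and can be found by ordinary least squares. This is a minimal illustration under those assumptions (grid-placed centers, a shared width, and a sine target are all choices made here, not prescribed by the article):

```python
import numpy as np

# Training data: approximate a continuous target on a closed interval.
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x)

# Fixed centers c_i on a grid and a shared width beta (assumed values).
centers = np.linspace(0.0, 2.0 * np.pi, 12)
beta = 4.0

# Design matrix G[j, i] = exp(-beta * (x_j - c_i)^2), one column per unit.
G = np.exp(-beta * (x[:, None] - centers[None, :]) ** 2)

# With centers and widths fixed, the weights a_i minimizing
# ||G a - y||^2 are a linear least-squares solution.
a, *_ = np.linalg.lstsq(G, y, rcond=None)

phi = G @ a                             # network output on the training inputs
max_err = np.max(np.abs(phi - y))       # small residual on this set
```

In practice the centers and widths may also be trained, e.g. by clustering the inputs or by gradient descent on the full parameter set, at the cost of a nonlinear optimization problem.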