Hyper basis function network

In [[machine learning]], a '''hyper basis function network''', or '''HyperBF network''', is a generalization of the [[Radial basis function network|radial basis function (RBF) network]] concept, in which a [[Mahalanobis distance|Mahalanobis]]-like distance is used in place of the Euclidean distance measure. Hyper basis function networks were first introduced by Poggio and Girosi in their 1990 paper "Networks for Approximation and Learning".<ref name="PoggioGirosi1990">T. Poggio and F. Girosi (1990). "Networks for Approximation and Learning". ''Proceedings of the IEEE'' '''78''' (9): 1481–1497.</ref><ref name="Mahdi">R. N. Mahdi and E. C. Rouchka (2011). [http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=5733426 "Reduced HyperBF Networks: Regularization by Explicit Complexity Reduction and Scaled Rprop-Based Training"]. ''IEEE Transactions on Neural Networks'' '''2''': 673–686.</ref>
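
The generalization can be sketched for a single basis unit (the notation below is illustrative and may differ from that of the cited papers): an ordinary RBF unit is activated by the squared Euclidean distance <math>\|x - \mu_j\|^2</math> to its center <math>\mu_j</math>, whereas a HyperBF unit is activated by a weighted squared norm defined through a matrix <math>W_j</math>,
:<math>\|x - \mu_j\|_{W_j}^2 = (x - \mu_j)^\mathsf{T} W_j^\mathsf{T} W_j (x - \mu_j),</math>
so that the network output takes the form <math>f(x) = \sum_j a_j \, \varphi\!\left(\|x - \mu_j\|_{W_j}^2\right)</math>. When each <math>W_j</math> is the identity matrix, this reduces to the standard RBF network.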
 
==Network Architecture==
==References==
{{reflist}}
 
[[Category:Artificial neural networks]]
[[Category:Classification algorithms]]
[[Category:Machine learning algorithms]]