Hyper basis function network

In [[machine learning]], a '''hyper basis function network''', or '''HyperBF network''', is a generalization of the [[Radial basis function network|radial basis function (RBF) network]] concept, in which a [[Mahalanobis distance|Mahalanobis]]-like distance is used instead of the Euclidean distance measure. Hyper basis function networks were first introduced by Poggio and Girosi in their 1990 paper "Networks for Approximation and Learning".<ref name="PoggioGirosi1990">T. Poggio and F. Girosi (1990). "Networks for Approximation and Learning". ''Proceedings of the IEEE'' '''78''' (9): 1481–1497.</ref><ref name="Mahdi">R.N. Mahdi, E.C. Rouchka (2011). [https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=5733426 "Reduced HyperBF Networks: Regularization by Explicit Complexity Reduction and Scaled Rprop-Based Training"]. ''IEEE Transactions on Neural Networks'' '''22''' (5): 673–686.</ref>
 
==Network Architecture==
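The output of a HyperBF network is a weighted combination of basis function activations,

:<math>\phi^*(x) = \sum_{j=1}^{N} a_j \, \rho_j\!\left( \lVert x - \mu_j \rVert^2_{R_j} \right),</math>

where each <math>\rho_j</math> is typically a Gaussian, <math>\rho_j(z) = e^{-z}</math>, and

:<math>\lVert x - \mu_j \rVert^2_{R_j} = (x - \mu_j)^T R_j \,(x - \mu_j)</math>

is the Mahalanobis-like distance defined by a [[Positive-definite matrix|positive definite]] matrix <math>R_j</math>; choosing <math>R_j</math> as the identity matrix recovers an ordinary RBF network.

For illustration, a minimal [[NumPy]] sketch of a Gaussian HyperBF unit and the resulting network output follows; the function names here are illustrative, not part of any standard library:

<syntaxhighlight lang="python">
import numpy as np

def hyperbf_activation(x, mu, R):
    """Gaussian HyperBF unit: exp(-(x - mu)^T R (x - mu)).

    R is a positive definite matrix defining the Mahalanobis-like
    distance; R = np.eye(len(x)) recovers an ordinary Gaussian RBF unit.
    """
    d = x - mu
    return np.exp(-d @ R @ d)

def hyperbf_output(x, weights, centers, metrics):
    """Network output phi*(x): weighted sum of unit activations."""
    return sum(a * hyperbf_activation(x, mu, R)
               for a, mu, R in zip(weights, centers, metrics))
</syntaxhighlight>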
==Training==
In a HyperBF network, the weights <math>a_j</math>, centers <math>\mu_j</math> and metrics <math>R_j</math> are determined by minimizing a least-squares error <math>H = \textstyle\sum_{i=1}^{S} \left( y_i - \phi^*(x_i) \right)^2</math> over a training set of <math>S</math> examples via [[gradient descent]], with updates of the form

:<math>a_j(t+1) = a_j(t) - \omega \frac{\partial H}{\partial a_j},</math>

where <math>\omega</math> determines the rate of convergence.
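A minimal sketch of such an update, restricted to the output weights <math>a_j</math>, is shown below; it reuses the <code>hyperbf_activation</code> helper from the sketch above, with <code>omega</code> playing the role of <math>\omega</math> (the names are illustrative, not a standard API):

<syntaxhighlight lang="python">
import numpy as np

def train_weights(X, y, weights, centers, metrics, omega=0.01, steps=100):
    """Gradient descent on H = sum_i (y_i - phi*(x_i))^2 with rate omega.

    Only the output weights a_j are updated here; full HyperBF training
    updates the centers mu_j and metrics R_j by analogous gradient steps.
    """
    for _ in range(steps):
        # Unit activations for every sample: Phi[i, j] = rho_j(x_i).
        Phi = np.array([[hyperbf_activation(x, mu, R)
                         for mu, R in zip(centers, metrics)]
                        for x in X])
        residual = y - Phi @ weights      # y_i - phi*(x_i)
        grad = -2.0 * Phi.T @ residual    # partial H / partial a_j
        weights = weights - omega * grad  # a_j(t+1) = a_j(t) - omega * grad
    return weights
</syntaxhighlight>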
 
Overall, training HyperBF networks can be computationally challenging. Moreover, the large number of free parameters in a HyperBF network makes it prone to overfitting and poor generalization. However, HyperBF networks have the important advantage that a small number of neurons suffices to learn complex functions.<ref name="Mahdi" />
 
==References==
{{reflist}}
 
[[Category:Artificial neural networks]]
[[Category:Classification algorithms]]
[[Category:Machine learning algorithms]]