In [[machine learning]], a '''Hyper basis function network''', or '''HyperBF network''', is a generalization of the [[Radial basis function network|radial basis function (RBF) network]] concept, in which a [[Mahalanobis distance|Mahalanobis]]-like distance is used instead of the Euclidean distance measure.
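The distinction above can be illustrated with a minimal sketch: each HyperBF unit scores an input by a Mahalanobis-like distance <math>(x-\mu)^\mathsf{T} W (x-\mu)</math> with a learned positive-definite matrix <math>W</math>, rather than the plain Euclidean distance of an RBF unit. The function names and the Gaussian form used here are illustrative assumptions, not taken from this article.

```python
import math

def hyperbf_activation(x, center, W):
    """Activation of one HyperBF unit (illustrative sketch).

    Uses the Mahalanobis-like squared distance d2 = (x - c)^T W (x - c),
    where W is a positive-definite matrix learned during training.
    With W equal to the identity matrix, this reduces to an ordinary
    Gaussian RBF unit based on Euclidean distance.
    """
    d = [xi - ci for xi, ci in zip(x, center)]
    d2 = sum(d[i] * W[i][j] * d[j]
             for i in range(len(d)) for j in range(len(d)))
    return math.exp(-d2)

def hyperbf_network(x, centers, Ws, weights):
    # Network output: a weighted sum over the radial units.
    return sum(w * hyperbf_activation(x, c, W)
               for w, c, W in zip(weights, centers, Ws))
```

Setting a non-identity <math>W</math> lets each unit stretch or rotate its receptive field, which is the extra flexibility that distinguishes HyperBF from standard RBF networks.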
==Network architecture==
where <math>\omega</math> determines the rate of convergence.
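The update referred to above is a gradient-descent step. As a minimal, generic sketch (the article's exact update rule and loss are not reproduced here; the function name and squared-error assumption are illustrative), each parameter moves against its gradient, scaled by <math>\omega</math>:

```python
def gradient_step(params, grads, omega):
    """One gradient-descent update: p <- p - omega * dH/dp.

    omega is the learning rate, which controls the rate of
    convergence: too small and training is slow, too large and
    the iteration can overshoot and diverge.
    Illustrative sketch only, not the article's full derivation.
    """
    return [p - omega * g for p, g in zip(params, grads)]
```

In a HyperBF network, the same step is applied to the output weights, the centers, and the entries of each distance matrix <math>W</math>, which is part of what makes training computationally demanding.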
Overall, training HyperBF networks can be computationally demanding. Moreover, the large number of free parameters in a HyperBF network can lead to overfitting and poor generalization. However, HyperBF networks have the important advantage that a small number of neurons suffices for learning complex functions.<ref name="Mahdi" />
==References==
{{reflist}}
[[Category:Artificial neural networks]]
[[Category:Classification algorithms]]
[[Category:Machine learning algorithms]]