Multivariate kernel density estimation: Difference between revisions
: <math>\operatorname{MUAE} (\mathbf{H}) = \operatorname{E}\, \sup_{\mathbf{x}} |\hat{f}_\mathbf{H} (\mathbf{x}) - f(\mathbf{x})|.</math>
which has been investigated only briefly.<ref>{{cite journal | author1=Cao, R. | author2=Cuevas, A. | author3=Manteiga, W.G.| title=A comparative study of several smoothing methods in density estimation | journal=Computational Statistics and Data Analysis | year=1994 | volume=17 | issue=2 | pages=153–176 | doi=10.1016/0167-9473(92)00066-Z}}</ref> Likelihood error criteria include those based on the Mean [[Kullback–Leibler divergence]]
 
: <math>\operatorname{MKL} (\mathbf{H}) = \int f(\mathbf{x}) \, \log [f(\mathbf{x})] \, d\mathbf{x} - \operatorname{E} \int f(\mathbf{x}) \, \log [\hat{f}(\mathbf{x};\mathbf{H})] \, d\mathbf{x}</math>
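Both integrals in the MKL criterion are expectations under the true density ''f'', so when ''f'' is known they can be approximated by Monte Carlo. The sketch below (an illustration, not part of the article's derivation) uses a standard bivariate normal as ''f'' and SciPy's <code>gaussian_kde</code> as a stand-in for the kernel estimator; note that <code>gaussian_kde</code> chooses its own scalar bandwidth factor rather than taking an arbitrary bandwidth matrix '''H'''.

```python
# Monte Carlo sketch of the mean Kullback-Leibler criterion
#   MKL(H) = ∫ f log f dx − E ∫ f log f̂(·;H) dx,
# with a known true density f (standard bivariate normal) and
# scipy's gaussian_kde standing in for the estimator f̂.
import numpy as np
from scipy.stats import multivariate_normal, gaussian_kde

rng = np.random.default_rng(0)
d = 2
f = multivariate_normal(mean=np.zeros(d), cov=np.eye(d))

# Sample used to construct the density estimate f̂
data = rng.multivariate_normal(np.zeros(d), np.eye(d), size=500)
fhat = gaussian_kde(data.T)  # gaussian_kde expects shape (d, n)

# Monte Carlo points drawn from f approximate both integrals at once:
# mean of log f(X) minus mean of log f̂(X), X ~ f
x = rng.multivariate_normal(np.zeros(d), np.eye(d), size=20000)
kl = np.mean(f.logpdf(x)) - np.mean(np.log(fhat(x.T)))
print(f"estimated KL(f || fhat) = {kl:.4f}")
```

Because the Kullback–Leibler divergence is asymmetric, this estimates KL(''f''&nbsp;∥&nbsp;''f̂'') specifically; swapping the roles of the two densities would give a different number, which is why it is a divergence rather than a distance.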