Stochastic approximation

Suppose one has a function <math>M(x)</math> for which one wishes to find the value of <math>x</math>, <math>x_0</math>, satisfying <math>M(x_0)=\alpha</math>. However, what is observable is not <math>M(x)</math>, but rather a random variable <math>N(x)</math> such that <math>E(N(x)|x)=M(x)</math>. The algorithm is then to construct a sequence <math>x_1, x_2, \dots</math> which satisfies
::<math>x_{n+1}=x_n+a_n(\alpha-N(x_n))</math>.
Here, <math>a_1, a_2, \dots</math> is a sequence of positive step-sizes. [[Herbert Robbins|Robbins]] and Monro proved<ref name="rm" /><sup>, Theorem 2</sup> that <math>x_n</math> [[convergence of random variables|converges]] in <math>L^2</math> (and hence also in probability) to <math>x_0</math>, provided that:
* <math>N(x)</math> is uniformly bounded,
* <math>M(x)</math> is nondecreasing,
* <math>M'(x_0)</math> exists and is positive, and
* <math>a_n</math> satisfies a certain set of bounds, which require it to converge to zero, but slowly.
 
The last condition is fulfilled, for example, by taking <math>a_n=1/n</math>; other sequences are possible, but in order to average out the noise in <math>N(x)</math>, the <math>a_n</math> must converge to zero slowly.
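A minimal sketch of the recursion above, with an illustrative noisy measurement <math>N(x)</math> that is an assumption for this example (the article does not specify one): take <math>M(x)=2x</math> observed with additive Gaussian noise and target <math>\alpha=4</math>, so the root is <math>x_0=2</math>.

```python
import random

def robbins_monro(noisy_measure, alpha, x_init=0.0, n_steps=10000):
    """Approximate the root x_0 of M(x) = alpha, given only noisy
    observations N(x) with E[N(x)|x] = M(x), via the recursion
    x_{n+1} = x_n + a_n * (alpha - N(x_n)) with step sizes a_n = 1/n."""
    x = x_init
    for n in range(1, n_steps + 1):
        a_n = 1.0 / n  # satisfies the Robbins-Monro step-size conditions
        x = x + a_n * (alpha - noisy_measure(x))
    return x

# Hypothetical example: M(x) = 2x observed with unit Gaussian noise,
# so M(x_0) = 4 has the root x_0 = 2.
random.seed(0)

def N(x):
    return 2.0 * x + random.gauss(0.0, 1.0)

estimate = robbins_monro(N, alpha=4.0)
```

With 10,000 steps the iterate settles close to the true root 2; larger noise variance or faster-decaying step sizes would slow this down.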
 
==Kiefer-Wolfowitz algorithm==