: <math> y_i = y(t_i) = y(t_0 + i h), </math>
: <math> f_i = f(t_i, y_i), </math>
where ''h'' is the step size. A multistep method uses the values computed in the previous ''s'' steps to determine the next value; a linear multistep method uses a linear combination of the <math>y_i</math> and <math>f_i</math> to calculate the value of ''y'' for the desired current step.
A linear multistep method is a method of the form
: <math> \begin{align}
& y_{n+s} + a_{s-1} y_{n+s-1} + a_{s-2} y_{n+s-2} + \cdots + a_0 y_n \\
& \qquad {} = h \bigl( b_s f(t_{n+s},y_{n+s}) + b_{s-1} f(t_{n+s-1},y_{n+s-1}) + \cdots + b_0 f(t_n,y_n) \bigr),
\end{align} </math>
where ''h'' denotes the step size and ''f'' the right-hand side of the differential equation. The coefficients <math> a_0, \ldots, a_{s-1} </math> and <math> b_0, \ldots, b_s </math> determine the method. The designer of the method chooses the coefficients; often, many coefficients are zero. Typically, the designer chooses the coefficients so they will exactly interpolate <math>y(t)</math> when it is a polynomial of sufficiently low degree.
If <math> b_s = 0 </math>, the method is [[explicit]]: the formula computes <math> y_{n+s} </math> directly from known values. If <math> b_s \ne 0 </math>, then <math>y_{n+s}</math> also appears on the right-hand side through <math> f(t_{n+s}, y_{n+s}) </math>, so the method is [[implicit]] and the equation for <math> y_{n+s} </math> must be solved. [[Iterative methods]] such as [[Newton's method]] are often used to solve the implicit formula.
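To illustrate the difference, the sketch below (an illustrative example, not taken from the references) applies the simplest member of each kind with ''s'' = 1: the explicit Euler method <math>y_{n+1} = y_n + h f(t_n, y_n)</math> and the implicit backward Euler method <math>y_{n+1} = y_n + h f(t_{n+1}, y_{n+1})</math>, whose implicit equation is solved here by fixed-point iteration.

```python
import math

def explicit_euler_step(f, t, y, h):
    # s = 1, b_1 = 0: the formula gives y_{n+1} directly.
    return y + h * f(t, y)

def backward_euler_step(f, t, y, h, iterations=50):
    # s = 1, b_1 = 1: y_{n+1} appears on both sides, so the implicit
    # equation y_{n+1} = y_n + h*f(t+h, y_{n+1}) must be solved.
    # Fixed-point iteration is used here; Newton's method is common too.
    y_next = y + h * f(t, y)      # predictor: start from an explicit step
    for _ in range(iterations):
        y_next = y + h * f(t + h, y_next)
    return y_next

# Test problem y' = -y, y(0) = 1, exact solution exp(-t).
f = lambda t, y: -y
h, y_exp, y_imp = 0.1, 1.0, 1.0
for n in range(10):
    y_exp = explicit_euler_step(f, n * h, y_exp, h)
    y_imp = backward_euler_step(f, n * h, y_imp, h)
print(y_exp, y_imp, math.exp(-1.0))
```

For this linear test problem the fixed-point iteration contracts whenever <math>|h\lambda| < 1</math>; for stiff problems Newton's method is usually preferred.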
== Multistep method families ==
Three families of linear multistep methods are commonly used: Adams–Bashforth methods, Adams–Moulton methods, and the [[backward differentiation formula]]s (BDFs).
=== Adams–Bashforth methods ===
The Adams–Bashforth methods are explicit methods. The coefficients are <math>a_{s-1} = -1</math> and <math>a_{s-2} = \cdots = a_0 = 0</math>, while the <math>b_j</math> are chosen such that the method has order ''s'' (this determines the method uniquely).
The Adams–Bashforth methods with ''s'' = 1, 2, 3, 4, 5 are ({{harvnb|Hairer|Nørsett|Wanner|1993|loc=§III.1}}; {{harvnb|Butcher|2003|p=103}}):
* <math> y_{n+1} = y_n + hf(t_n, y_n) \, </math> (this is the [[Euler method]]);
* <math> y_{n+2} = y_{n+1} + h\left( \tfrac32 f(t_{n+1}, y_{n+1}) - \tfrac12 f(t_n, y_n)\right); </math>
* <math> y_{n+3} = y_{n+2} + h\left( \tfrac{23}{12} f(t_{n+2}, y_{n+2}) - \tfrac43 f(t_{n+1}, y_{n+1}) + \tfrac{5}{12}f(t_n, y_n)\right); </math>
* <math> y_{n+4} = y_{n+3} + h\left( \tfrac{55}{24} f(t_{n+3}, y_{n+3}) - \tfrac{59}{24} f(t_{n+2}, y_{n+2}) + \tfrac{37}{24} f(t_{n+1}, y_{n+1}) - \tfrac38 f(t_n, y_n)\right); </math>
* <math> y_{n+5} = y_{n+4} + h\left( \tfrac{1901}{720} f(t_{n+4}, y_{n+4}) - \tfrac{1387}{360} f(t_{n+3}, y_{n+3}) + \tfrac{109}{30} f(t_{n+2}, y_{n+2}) - \tfrac{637}{360} f(t_{n+1}, y_{n+1}) + \tfrac{251}{720} f(t_n, y_n) \right). </math>
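The coefficients <math>b_j</math> can be derived by requiring the formula to be exact whenever <math>f(t, y(t))</math> is a polynomial of degree less than ''s'', which gives the linear conditions <math>\textstyle\sum_j b_j j^k = \bigl(s^{k+1} - (s-1)^{k+1}\bigr)/(k+1)</math> for <math>k = 0, \ldots, s-1</math>. The following sketch (the function name is ours, not standard) solves this system in exact rational arithmetic and reproduces the coefficients listed above.

```python
from fractions import Fraction

def adams_bashforth_coeffs(s):
    """Solve the exactness conditions sum_j b_j * j**k =
    (s**(k+1) - (s-1)**(k+1)) / (k+1) for k = 0..s-1, which force the
    formula to integrate polynomials of degree < s exactly."""
    # Build the Vandermonde-type system A b = rhs in exact arithmetic.
    A = [[Fraction(j) ** k for j in range(s)] for k in range(s)]
    rhs = [Fraction(s ** (k + 1) - (s - 1) ** (k + 1), k + 1) for k in range(s)]
    # Plain Gaussian elimination with partial pivot search.
    for col in range(s):
        pivot = next(r for r in range(col, s) if A[r][col] != 0)
        A[col], A[pivot] = A[pivot], A[col]
        rhs[col], rhs[pivot] = rhs[pivot], rhs[col]
        for r in range(col + 1, s):
            factor = A[r][col] / A[col][col]
            A[r] = [a - factor * p for a, p in zip(A[r], A[col])]
            rhs[r] -= factor * rhs[col]
    # Back-substitution on the upper-triangular system.
    b = [Fraction(0)] * s
    for r in reversed(range(s)):
        b[r] = (rhs[r] - sum(A[r][c] * b[c] for c in range(r + 1, s))) / A[r][r]
    return b  # b[0], ..., b[s-1]

print(adams_bashforth_coeffs(3))  # coefficients of the three-step method
```

For ''s'' = 3 this yields <math>b_0 = \tfrac{5}{12}</math>, <math>b_1 = -\tfrac43</math>, <math>b_2 = \tfrac{23}{12}</math>, matching the formula above.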
=== Adams–Moulton methods ===
The Adams–Moulton methods are implicit: as with the Adams–Bashforth methods, <math>a_{s-1} = -1</math> and <math>a_{s-2} = \cdots = a_0 = 0</math>, but now <math>b_s \ne 0</math>, which allows an ''s''-step method to reach order ''s'' + 1. The methods with ''s'' = 2, 3, 4 are:
* <math> y_{n+2} = y_{n+1} + h \big( \tfrac5{12} f(t_{n+2},y_{n+2}) + \tfrac23 f(t_{n+1},y_{n+1}) - \tfrac1{12} f(t_n,y_n) \big); </math>
* <math> y_{n+3} = y_{n+2} + h \big( \tfrac38 f(t_{n+3},y_{n+3}) + \tfrac{19}{24} f(t_{n+2},y_{n+2}) - \tfrac5{24} f(t_{n+1},y_{n+1}) + \tfrac1{24} f(t_n,y_n) \big); </math>
* <math> \begin{align} y_{n+4} = y_{n+3} + h \big( &\tfrac{251}{720} f(t_{n+4},y_{n+4}) + \tfrac{646}{720} f(t_{n+3},y_{n+3}) \\
&- \tfrac{264}{720} f(t_{n+2},y_{n+2}) + \tfrac{106}{720} f(t_{n+1},y_{n+1}) - \tfrac{19}{720} f(t_n,y_n) \big). \end{align} </math>
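Because <math>b_s \ne 0</math>, each step requires solving an equation for the newest value. When the right-hand side is linear, say <math>f(t,y) = \lambda y</math>, this can be done in closed form rather than iteratively. A sketch for the two-step method above (names are illustrative; starting values are taken from the exact solution for simplicity):

```python
import math

def am2_linear_step(lam, h, y0, y1):
    # Two-step Adams–Moulton applied to y' = lam*y.  Because f is linear,
    # the implicit equation for y_{n+2} can be solved algebraically:
    # y_{n+2} * (1 - 5hl/12) = y_{n+1} * (1 + 2hl/3) - y_n * hl/12.
    z = h * lam
    return (y1 * (1 + 2 * z / 3) - y0 * z / 12) / (1 - 5 * z / 12)

# Integrate y' = -y, y(0) = 1 up to t = 1 with step h = 0.1.
lam, h = -1.0, 0.1
ys = [1.0, math.exp(-h)]          # exact starting values
while len(ys) < 11:
    ys.append(am2_linear_step(lam, h, ys[-2], ys[-1]))
print(ys[-1], math.exp(-1.0))
```

The method has order 3, so even with this moderate step size the error at ''t'' = 1 is small.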
== Analysis ==
The first question is whether the method is consistent: is the difference equation
:<math> \begin{align}
& y_{n+s} + a_{s-1} y_{n+s-1} + a_{s-2} y_{n+s-2} + \cdots + a_0 y_n \\
& \qquad {} = h \bigl( b_s f(t_{n+s},y_{n+s}) + b_{s-1} f(t_{n+s-1},y_{n+s-1}) + \cdots + b_0 f(t_n,y_n) \bigr),
\end{align} </math>
a good approximation of the differential equation <math> y' = f(t,y) </math>? More precisely, a multistep method is ''consistent'' if the local error goes to zero faster than the step size ''h'' as ''h'' goes to zero, where the ''local error'' is defined to be the difference between the result <math>y_{n+s}</math> of the method, assuming that all the previous values <math>y_{n+s-1}, \ldots, y_n</math> are exact, and the exact solution of the equation at time <math>t_{n+s}</math>. A computation using [[Taylor series]] shows that a linear multistep method is consistent if and only if
:<math> \sum_{k=0}^{s-1} a_k = -1 \quad\text{and}\quad \sum_{k=0}^{s} b_k = s + \sum_{k=0}^{s-1} k a_k. </math>
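This behaviour can be checked numerically. For the two-step Adams–Bashforth method, which has order 2, the local error should scale like <math>h^3</math>, so halving the step size should divide it by roughly 8. A sketch (illustrative, using the test equation <math>y' = y</math>):

```python
import math

def ab2_local_error(h):
    # One step of the two-step Adams–Bashforth method for y' = y,
    # started from the exact values y(0) = 1 and y(h) = e^h.
    f = lambda t, y: y
    y_method = math.exp(h) + h * (1.5 * f(h, math.exp(h)) - 0.5 * f(0.0, 1.0))
    # Local error: difference from the exact solution at t = 2h.
    return abs(math.exp(2 * h) - y_method)

# Halving h should divide the local error by about 2**3 = 8,
# consistent with a local error of O(h^{p+1}) for an order-2 method.
for h in (0.1, 0.05, 0.025):
    print(h, ab2_local_error(h), ab2_local_error(h) / ab2_local_error(h / 2))
```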
The conditions for consistency and order are conveniently expressed in terms of the ''characteristic polynomials''
:<math> \rho(z) = z^s + \sum_{k=0}^{s-1} a_k z^k \qquad \text{and} \qquad \sigma(z) = \sum_{k=0}^{s} b_k z^k. </math>
In terms of these polynomials, the above condition for the method to have order ''p'' becomes
:<math> \rho(\mathrm{e}^h) - h\sigma(\mathrm{e}^h) = O(h^{p+1}) \quad \text{as } h\to 0. </math>
In particular, the method is consistent if and only if it has order at least one, which is the case if <math>\rho(1)=0</math> and <math>\rho'(1)=\sigma(1)</math>.
If the roots of the characteristic polynomial <math>\rho</math> all have modulus less than or equal to 1 and the roots of modulus 1 are of multiplicity 1, we say that the [[root condition]] is satisfied.
The method is convergent [[if and only if]] it is consistent and the root condition is satisfied. A linear multistep method is called [[zero stability|zero-stable]] precisely when the root condition is satisfied, so a consistent method is convergent if and only if it is zero-stable.
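Both conditions are easy to verify in exact arithmetic. For the two-step Adams–Bashforth method one has <math>\rho(z) = z^2 - z</math> and <math>\sigma(z) = \tfrac32 z - \tfrac12</math>; the sketch below (helper names are ours) confirms consistency, and since <math>\rho(z) = z(z-1)</math> has only the simple roots 0 and 1, the root condition holds as well.

```python
from fractions import Fraction

# Characteristic polynomials of the two-step Adams–Bashforth method,
# stored as coefficient lists in increasing powers of z:
# rho(z) = z^2 - z, sigma(z) = (3z - 1)/2.
rho = [Fraction(0), Fraction(-1), Fraction(1)]
sigma = [Fraction(-1, 2), Fraction(3, 2)]

def poly_eval(p, z):
    return sum(c * z ** k for k, c in enumerate(p))

def poly_deriv(p):
    return [k * c for k, c in enumerate(p)][1:]

# Consistency: rho(1) = 0 and rho'(1) = sigma(1).
print(poly_eval(rho, 1), poly_eval(poly_deriv(rho), 1), poly_eval(sigma, 1))

# Root condition: rho(z) = z(z - 1) has roots 0 and 1; both lie in the
# closed unit disc and the root of modulus 1 is simple, so the method
# is zero-stable and, being consistent, convergent.
```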
=== Example ===
Consider the Adams–Bashforth three-step method
:<math>y_{n+1} = y_n + h\left( {23\over 12} f(t_n, y_n) - {16 \over 12} f(t_{n-1}, y_{n-1}) + {5\over 12}f(t_{n-2}, y_{n-2})\right).</math>
The characteristic equation is thus
:<math> \rho(z) = z^3 - z^2 = z^2(z-1), \, </math>
which has roots <math>z = 0, 1</math>. All roots lie in the closed unit disc and the only root of modulus 1 is simple, so the root condition is satisfied and the method is zero-stable.

== First and second Dahlquist barriers ==
The following two results, proved by [[Germund Dahlquist]], represent important bounds on the order of convergence and on the [[Stiff_equation#A-stability|A-stability]] of a linear multistep method.
A [[zero stability|zero-stable]] and linear ''q''-step multistep method cannot attain an order of convergence greater than ''q'' + 1 if ''q'' is odd and greater than ''q'' + 2 if ''q'' is even. If the method is also explicit, then it cannot attain an order greater than ''q'' {{harv|Hairer|Nørsett|Wanner|1993|loc=Thm III.3.5}}.
There are no explicit [[Stiff_equation#A-stability|A-stable]] linear multistep methods. The implicit ones have order of convergence at most 2 {{harv|Hairer|Wanner|1996|loc=Thm V.1.4}}.
== See also ==
*[[Digital energy gain]]
== References ==
* {{MathWorld | urlname= AdamsMethod| title= Adams Method }}
* [http://math.fullerton.edu/mathews/n2003/AdamsBashforthMod.html Adams-Bashforth-Moulton Method]
*[http://www.dotnumerics.com/NumericalLibraries/DifferentialEquations/ DotNumerics: Ordinary Differential Equations for C# and VB.NET] Initial-value problem for nonstiff and stiff ordinary differential equations (explicit Runge-Kutta, implicit Runge-Kutta, Gear’s BDF and Adams-Moulton).
{{Numerical integrators}}