In control theory, a control-Lyapunov function (CLF) is a generalization of the notion of Lyapunov function used in stability analysis. The ordinary Lyapunov function is used to test whether a dynamical system is stable, that is, whether the system started in a state $x \ne 0$ will eventually return to $x = 0$. The control-Lyapunov function is used to test whether a system is feedback stabilizable, that is, whether for any state $x$ there exists a control $u$ such that the system can be brought to the zero state by applying the control $u$.
More formally, suppose we are given an autonomous dynamical system

$$\dot{x} = f(x, u),$$

where $x \in \mathbb{R}^n$ is the state vector and $u \in \mathbb{R}^m$ is the control vector.
Definition. A control-Lyapunov function is a function $V : \mathbb{R}^n \to \mathbb{R}$ that is continuously differentiable, positive-definite (that is, $V(x)$ is positive except at $x = 0$, where it is zero), proper (that is, $V(x) \to \infty$ as $\|x\| \to \infty$), and such that

$$\forall x \ne 0, \; \exists u \quad \dot{V}(x, u) = \nabla V(x) \cdot f(x, u) < 0$$
(this is the key condition; in words it says that for each state x we can find a control u that will reduce the "energy" V). Intuitively, if in each state we can always find a way to reduce the energy, we should eventually be able to bring the energy to zero, that is to bring the system to a stop. This is made rigorous by the following result:
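As a concrete illustration of this decrease condition, the following sketch checks it for a toy scalar system and candidate energy function (both are assumptions for the example, not from the article):

```python
# Toy illustration of the CLF decrease condition (illustrative system,
# not from the article): x' = x + u with candidate V(x) = x^2 / 2.

def f(x, u):
    return x + u          # scalar dynamics, unstable when u = 0

def V(x):
    return 0.5 * x * x    # candidate "energy" function

def Vdot(x, u):
    return x * f(x, u)    # dV/dt along trajectories; grad V(x) = x here

# For every x != 0 the control u = -2x makes Vdot = -x^2 < 0,
# so the energy can always be decreased:
for x in [-3.0, -0.5, 0.1, 2.0]:
    assert Vdot(x, -2.0 * x) < 0
```

Here the choice $u = -2x$ is just one control that decreases $V$; the definition only requires that some such $u$ exist for each nonzero state.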
Artstein's theorem. The dynamical system has a differentiable control-Lyapunov function if and only if there exists a regular stabilizing feedback u(x).
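For single-input control-affine systems $\dot{x} = f(x) + g(x)u$, one explicit feedback that witnesses this equivalence is Sontag's universal formula, $u(x) = -(a + \sqrt{a^2 + b^4})/b$ when $b \ne 0$ (and $u = 0$ otherwise), where $a = \nabla V(x) \cdot f(x)$ and $b = \nabla V(x) \cdot g(x)$. A minimal sketch, using an illustrative scalar system that is an assumption of this example:

```python
# Sontag's universal formula for a single-input control-affine system.
# The toy system x' = x + u and CLF V(x) = x^2/2 are illustrative assumptions.
import math

def sontag_feedback(a, b):
    # a = grad V(x) . f(x), b = grad V(x) . g(x)
    if b == 0.0:
        return 0.0
    return -(a + math.sqrt(a * a + b ** 4)) / b

# Toy system x' = x + u: f(x) = x, g(x) = 1, V(x) = x^2/2,
# so a = x * x and b = x. Simulate the closed loop by forward Euler.
x, dt = 1.0, 0.01
for _ in range(1000):
    u = sontag_feedback(x * x, x)
    x += dt * (x + u)
# For this system the formula gives u = -(1 + sqrt(2)) x, so the closed
# loop is x' = -sqrt(2) x and the state decays exponentially to zero.
```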
It may not be easy to find a control-Lyapunov function for a given system, but if one can be found, thanks to some ingenuity and luck, then the feedback stabilization problem simplifies considerably: it reduces to solving the static non-linear programming problem

$$u^*(x) = \underset{u}{\operatorname{arg\,min}} \; \nabla V(x) \cdot f(x, u)$$

for each state $x$.
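This pointwise minimization can be sketched numerically. In the hedged example below, the toy system, the bounded control set, and the grid search standing in for a real NLP solver are all assumptions made for illustration:

```python
# Pointwise minimization u*(x) = argmin_u grad V(x) . f(x, u), sketched by
# grid search over a bounded control set (illustrative; not a real NLP solver).

def f(x, u):
    return x + u            # toy scalar dynamics, unstable when u = 0

def Vdot(x, u):
    return x * f(x, u)      # with V(x) = x^2/2, grad V(x) = x

def u_star(x, u_bound=2.0, n=401):
    # Solve min_{|u| <= u_bound} Vdot(x, u) on a uniform grid.
    grid = [-u_bound + 2.0 * u_bound * i / (n - 1) for i in range(n)]
    return min(grid, key=lambda u: Vdot(x, u))

# Closed loop via forward Euler: applying u*(x) at each step drives the
# state into a small neighbourhood of zero.
x, dt = 1.5, 0.01
for _ in range(2000):
    x += dt * f(x, u_star(x))
```

Because the control set here is bounded and the grid is finite, the closed loop chatters in a small band around the origin rather than converging exactly; that is an artifact of this sketch, not of the theory.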
The theory and application of control-Lyapunov functions were developed by Z. Artstein and E. D. Sontag in the 1980s and 1990s.