Definition of Lyapunov functions
A Lyapunov function is a scalar function that lets you determine whether a system is stable without actually solving the differential equations describing it. The core idea is borrowed from physics: if you can show that some "energy-like" quantity is always decreasing along system trajectories, the system must be settling down toward equilibrium.
Named after the Russian mathematician Aleksandr Lyapunov (who introduced the concept in the late 19th century), these functions generalize the notion of energy in physical systems. The real power here is that they work for nonlinear systems, where eigenvalue-based methods from linear analysis don't directly apply.
Properties of Lyapunov functions
A valid Lyapunov function must satisfy several properties. Missing even one of these can invalidate your stability conclusions, so pay close attention.
Positive definiteness
A function $V(x)$ is positive definite if:
- $V(0) = 0$ (it's zero at the equilibrium)
- $V(x) > 0$ for all $x \neq 0$ (it's strictly positive everywhere else)
This is the most fundamental requirement. Think of it as saying the "energy" is zero only at equilibrium and positive anywhere else. Simple examples: $V(x) = x^2$ in one dimension, or $V(x) = x_1^2 + x_2^2$ in two dimensions.
Continuous differentiability
Lyapunov functions must be continuously differentiable ($C^1$). This means $V$ has partial derivatives that are themselves continuous. You need this because the entire method hinges on computing $\dot{V}(x)$, the time derivative of $V$ along system trajectories. Without continuous differentiability, that derivative may not exist or may behave erratically.
Radial unboundedness
A function $V(x)$ is radially unbounded if $V(x) \to \infty$ as $\|x\| \to \infty$. This property ensures the "energy" grows without bound as you move farther from the origin.
Radial unboundedness is specifically needed for global stability conclusions. Without it, you can only claim stability in some local region. For example, $V(x) = x^2$ is radially unbounded, but $V(x) = 1 - e^{-x^2}$ is not (it saturates at 1).
Note: $V(x) = |x|$ is radially unbounded but is not differentiable at the origin, so it wouldn't qualify as a valid Lyapunov function despite satisfying this particular property.
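To make the contrast concrete, here's a small numeric sketch (using $1 - e^{-x^2}$ as the saturating example):

```python
import numpy as np

def V_quadratic(x):
    # V(x) = x**2: radially unbounded, grows without limit as |x| -> infinity
    return x**2

def V_saturating(x):
    # V(x) = 1 - exp(-x**2): positive definite, but it saturates at 1,
    # so it is NOT radially unbounded
    return 1.0 - np.exp(-x**2)

for x in [1.0, 10.0, 100.0]:
    print(f"x={x:>6}: quadratic={V_quadratic(x):>10.1f}, "
          f"saturating={V_saturating(x):.6f}")
```

The quadratic values blow up while the saturating ones never pass 1, which is why only the quadratic candidate can support global stability conclusions.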
Lyapunov stability theory
Lyapunov stability vs. asymptotic stability
These two terms sound similar but mean different things:
- Lyapunov stable: Trajectories that start near the equilibrium stay near it. The system doesn't blow up, but it also doesn't necessarily converge. Think of a ball rolling in a perfectly flat-bottomed bowl with no friction.
- Asymptotically stable: Trajectories not only stay near the equilibrium but actually converge to it as $t \to \infty$. Now the bowl has friction, and the ball eventually comes to rest.
Asymptotic stability is strictly stronger. Every asymptotically stable system is Lyapunov stable, but not the other way around.
Lyapunov stability theorems
Lyapunov's direct method (also called the second method) gives sufficient conditions for stability based on the properties of $V(x)$ and its time derivative $\dot{V}(x)$:
- Lyapunov stability: If $V(x)$ is positive definite and $\dot{V}(x) \le 0$ (negative semi-definite), then the equilibrium is Lyapunov stable.
- Asymptotic stability: If $V(x)$ is positive definite and $\dot{V}(x) < 0$ for all $x \neq 0$ (negative definite), then the equilibrium is asymptotically stable.
- Global asymptotic stability: If, in addition to condition 2, $V(x)$ is also radially unbounded, then the equilibrium is globally asymptotically stable.
The time derivative is computed along system trajectories using the chain rule:
$$\dot{V}(x) = \nabla V(x) \cdot \dot{x} = \nabla V(x) \cdot f(x)$$
where $f(x)$ is the system dynamics from $\dot{x} = f(x)$. You never need to solve for $x(t)$ explicitly.
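As a sketch, here's the chain-rule computation for a hypothetical system $\dot{x}_1 = x_2$, $\dot{x}_2 = -x_1 - x_2$ with candidate $V(x) = x_1^2 + x_2^2$ (both chosen purely for illustration):

```python
import numpy as np

def f(x):
    # hypothetical dynamics: x1' = x2, x2' = -x1 - x2
    x1, x2 = x
    return np.array([x2, -x1 - x2])

def grad_V(x):
    # gradient of the candidate V(x) = x1^2 + x2^2
    x1, x2 = x
    return np.array([2.0 * x1, 2.0 * x2])

def V_dot(x):
    # chain rule: Vdot = grad V . f(x); no trajectory solving required
    return grad_V(x) @ f(x)

# Analytically Vdot = -2*x2^2 <= 0, and the numbers agree:
x = np.array([1.0, 2.0])
print(V_dot(x))  # -8.0, i.e. -2 * 2.0**2
```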
Lyapunov instability theorem
The logic also works in reverse for proving instability. If you can find a continuously differentiable function $V(x)$ such that:
- $V(0) = 0$ and $V(x_0) > 0$ for some $x_0$ arbitrarily close to the origin
- $\dot{V}(x) > 0$ for all $x \neq 0$ in a neighborhood of the origin
then the equilibrium is unstable. The "energy" is increasing, so trajectories are being pushed away from equilibrium.
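A minimal sketch, using the hypothetical scalar system $\dot{x} = x$ with candidate $V(x) = x^2$:

```python
def V(x):
    # V(0) = 0, and V(x) > 0 arbitrarily close to the origin
    return x * x

def V_dot(x):
    # Vdot = dV/dx * f(x) = 2x * x = 2x^2 > 0 for every x != 0
    return 2.0 * x * x

for x0 in [0.1, -0.01, 1e-6]:
    assert V(x0) > 0 and V_dot(x0) > 0
print("Vdot > 0 near the origin: the equilibrium of x' = x is unstable")
```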
Construction of Lyapunov functions
Finding a Lyapunov function is often the hardest part of the analysis. There's no universal recipe, but several standard approaches exist depending on the system type.
Quadratic Lyapunov functions
The most common form is:
$$V(x) = x^T P x$$
where $P$ is a symmetric positive definite matrix. This function is automatically positive definite and radially unbounded.
For a linear system $\dot{x} = Ax$, you can systematically find $P$ by solving the Lyapunov equation:
$$A^T P + P A = -Q$$
where $Q$ is any positive definite matrix you choose (often $Q = I$, the identity matrix). If $A$ is stable (all eigenvalues have negative real parts), this equation has a unique positive definite solution $P$, and the resulting $V(x) = x^T P x$ proves asymptotic stability.
Steps to construct a quadratic Lyapunov function for a linear system:
- Choose a positive definite matrix $Q$ (e.g., $Q = I$).
- Solve $A^T P + P A = -Q$ for $P$.
- Verify that $P$ is positive definite (check that all eigenvalues of $P$ are positive).
- If $P$ is positive definite, then $V(x) = x^T P x$ is a valid Lyapunov function and the system is asymptotically stable.
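The steps above can be sketched with SciPy's `solve_continuous_lyapunov` (the matrix $A$ below is a made-up stable example with eigenvalues $-1$ and $-2$):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stable linear system x' = A x
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
Q = np.eye(2)  # step 1: choose Q positive definite (here Q = I)

# step 2: SciPy solves M X + X M^H = C, so pass M = A^T and C = -Q
# to obtain A^T P + P A = -Q
P = solve_continuous_lyapunov(A.T, -Q)

# step 3: verify P is positive definite via its eigenvalues
eigs = np.linalg.eigvalsh(P)
print("eigenvalues of P:", eigs)
print("all positive:", bool(np.all(eigs > 0)))  # True -> asymptotically stable
```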
Non-quadratic Lyapunov functions
For systems with complex nonlinearities, quadratic forms may not work. Alternatives include:
- Polynomial Lyapunov functions: Higher-degree polynomials, sometimes found using sum-of-squares (SOS) optimization techniques
- Logarithmic Lyapunov functions: Useful in population dynamics and certain chemical systems
- Energy-based functions: Directly use the physical energy (kinetic + potential) of the system
Constructing these typically requires insight into the specific system's structure. There's no one-size-fits-all method.
Lyapunov functions for linear vs. nonlinear systems
For linear systems $\dot{x} = Ax$, the problem is essentially solved: if all eigenvalues of $A$ have negative real parts, the Lyapunov equation gives you a valid quadratic function. The process is algorithmic.
For nonlinear systems, things get harder. Common strategies include:
- Linearization: Use the Jacobian at the equilibrium to get a local quadratic Lyapunov function. This only proves local stability.
- Energy-based methods: If the system has a physical interpretation, try the total energy as a candidate.
- Variable gradient method: Assume a form for $\nabla V$ and solve for $V$.
- SOS programming: Use computational tools to search for polynomial Lyapunov functions.
The choice depends heavily on the specific system. Trial and error is common, and there's no guarantee you'll find one even if the system is stable.
Applications of Lyapunov functions
Stability analysis of equilibrium points
The most direct application: given a nonlinear system, propose a candidate Lyapunov function, compute $\dot{V}$, and check its sign. Classic examples include the damped pendulum (where total mechanical energy serves as a natural Lyapunov function) and the Van der Pol oscillator.
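For the damped pendulum, a short sketch in normalized units (assumed form $\dot{x}_1 = x_2$, $\dot{x}_2 = -\sin x_1 - c\,x_2$, with an illustrative damping constant $c$) confirms that mechanical energy works as a Lyapunov function:

```python
import numpy as np

c = 0.5  # assumed damping coefficient (normalized units)

def f(x):
    # normalized damped pendulum: x1 = angle, x2 = angular velocity
    x1, x2 = x
    return np.array([x2, -np.sin(x1) - c * x2])

def V_dot(x):
    # V = 0.5*x2^2 + (1 - cos x1): kinetic + potential energy,
    # zero at the downward equilibrium; grad V = (sin x1, x2)
    x1, x2 = x
    return np.array([np.sin(x1), x2]) @ f(x)

# Analytically Vdot = -c * x2^2 <= 0: damping only dissipates energy.
x = np.array([0.8, -1.3])
print(np.isclose(V_dot(x), -c * x[1] ** 2))  # True
```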
Stability analysis of periodic orbits
Lyapunov functions can also assess stability of periodic orbits (limit cycles), not just equilibrium points. You construct a Lyapunov-like function in a neighborhood of the orbit and show it decreases toward the orbit. This appears in biological oscillator models like the FitzHugh-Nagumo model and in chaotic systems like the Lorenz system.
Controller design
Lyapunov functions are not just analysis tools; they're also design tools. The idea is to choose a control input $u$ that makes $\dot{V}$ negative definite for a chosen $V$. This approach underpins several major control design methods:
- Feedback linearization: Cancel nonlinearities and impose linear closed-loop dynamics
- Backstepping: Build up a Lyapunov function and controller recursively for systems in strict-feedback form
Adaptive control
In adaptive control, system parameters are unknown and must be estimated online. Lyapunov functions guide the design of both the control law and the parameter adaptation law simultaneously. You augment the Lyapunov function to include parameter estimation errors, then derive update laws that keep $\dot{V} \le 0$. This is the foundation of model reference adaptive control (MRAC) and adaptive observer design.
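A minimal sketch of this idea, for the hypothetical scalar plant $\dot{x} = \theta x + u$ with unknown $\theta$: the control law $u = -\hat{\theta} x - x$ and adaptation law $\dot{\hat{\theta}} = x^2$ are what make the augmented function $V = \tfrac{1}{2}x^2 + \tfrac{1}{2}(\hat{\theta} - \theta)^2$ satisfy $\dot{V} = -x^2 \le 0$ (all names and constants here are illustrative):

```python
# Scalar adaptive control sketch, integrated with forward Euler.
theta = 2.0              # true parameter, unknown to the controller
x, theta_hat = 1.0, 0.0  # initial state and initial parameter estimate
dt = 1e-3
for _ in range(20000):   # simulate 20 time units
    u = -theta_hat * x - x        # control law from the Lyapunov design
    x += dt * (theta * x + u)     # plant: x' = theta*x + u
    theta_hat += dt * x**2        # adaptation law: theta_hat' = x^2
print(abs(x))  # the state has converged close to zero
```

Note that $\dot{V} = -x^2$ is only negative semi-definite (it says nothing about the parameter error), so $x \to 0$ follows from Barbalat's lemma rather than the basic theorem, and $\hat{\theta}$ need not converge to $\theta$.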
Limitations of Lyapunov functions
Conservativeness
Lyapunov's theorems give sufficient conditions, not necessary ones. A system can be perfectly stable, yet you might fail to prove it because you picked the wrong candidate function. This conservativeness can also lead to overly cautious controller designs that sacrifice performance for guaranteed stability margins.
Difficulty in construction
For complex nonlinear systems, finding a suitable Lyapunov function remains an open challenge. There is no systematic procedure that works for all systems. The search often relies on physical intuition, educated guessing, and computational tools. This is the single biggest practical limitation of Lyapunov-based methods.
Extensions of Lyapunov stability theory
Barbalat's lemma
Barbalat's lemma helps you conclude asymptotic convergence in cases where $\dot{V}$ is only negative semi-definite. It states: if $\dot{V}(t)$ is uniformly continuous and $\lim_{t \to \infty} V(t)$ exists and is finite, then $\dot{V}(t) \to 0$ as $t \to \infty$.
This is particularly useful in adaptive control, where $\dot{V} \le 0$ can be shown but $\dot{V} < 0$ cannot. Barbalat's lemma lets you bridge that gap and still prove that certain signals converge to zero.
LaSalle's invariance principle
LaSalle's invariance principle is another way to extract asymptotic stability conclusions from a Lyapunov function with only $\dot{V} \le 0$. It states:
If $V(x)$ is positive definite with $\dot{V}(x) \le 0$, then all bounded trajectories converge to the largest invariant set contained in $E = \{x : \dot{V}(x) = 0\}$.
If the only invariant set within $E$ is the origin itself, you can conclude asymptotic stability even though $\dot{V}$ is only negative semi-definite. This principle is extremely useful in practice because finding a strictly negative definite $\dot{V}$ is often much harder than finding a negative semi-definite one.
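As a sketch, take the hypothetical system $\dot{x}_1 = x_2$, $\dot{x}_2 = -x_1 - x_2$ with $V = x_1^2 + x_2^2$, so $\dot{V} = -2x_2^2 \le 0$ vanishes on the whole line $\{x_2 = 0\}$. On that line the dynamics give $\dot{x}_2 = -x_1$, so the only invariant subset is the origin, and LaSalle predicts convergence even from a starting point where $\dot{V} = 0$:

```python
import numpy as np

def step(x, dt=1e-3):
    # forward-Euler step of x1' = x2, x2' = -x1 - x2
    x1, x2 = x
    return np.array([x1 + dt * x2, x2 + dt * (-x1 - x2)])

x = np.array([1.0, 0.0])  # starts exactly where Vdot = -2*x2^2 vanishes
for _ in range(20000):    # integrate 20 time units
    x = step(x)
print(np.linalg.norm(x))  # far below the initial norm of 1.0
```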
Lyapunov-like functions
These are generalizations that relax one or more classical Lyapunov function requirements. Examples include:
- Semi-definite Lyapunov functions: $V(x) \ge 0$ but not necessarily $V(x) > 0$ for all $x \neq 0$
- Vector Lyapunov functions: Use a vector of scalar functions instead of a single scalar, useful for large-scale interconnected systems
- Integral Lyapunov functions: Incorporate integral terms, useful for systems with time-varying or non-smooth dynamics
These extensions broaden the class of systems that can be analyzed using Lyapunov-based reasoning, though they come with their own technical conditions and subtleties.