Marginal stability

In the theory of dynamical systems and control theory, a linear time-invariant system is marginally stable if it is neither asymptotically stable nor unstable. Roughly speaking, a system is stable if it always returns to and stays near a particular state (called the steady state), and unstable if it moves farther and farther away from every state, without bound. A marginal system, sometimes referred to as having neutral stability,[1] lies between these two types: when displaced, it neither returns to a common steady state nor moves away from its starting point without limit.
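
As a minimal illustration (a standard textbook example, not taken from the cited reference), consider the undamped harmonic oscillator

\[
\dot{x} = A x, \qquad A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}.
\]

The eigenvalues of A are \( \lambda = \pm i \), lying exactly on the imaginary axis. Every solution is a bounded sinusoid that neither decays to the origin nor grows without bound, so the system is marginally stable; eigenvalues with strictly negative real parts would instead give asymptotic stability, and any eigenvalue with positive real part would give instability.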

Marginal stability, like instability, is a feature that control theory seeks to avoid: the goal is that, when perturbed by some external force, the system returns to a desired steady state. Achieving this requires appropriately designed control algorithms.

In econometrics, the presence of a unit root in an observed time series makes it marginally stable and can lead to invalid regression results about the effects of independent variables on a dependent variable, unless appropriate techniques are used to convert the system to a stable one.
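
A brief sketch of a standard example (not drawn from the cited reference): the first-order autoregression \( y_t = a\, y_{t-1} + \varepsilon_t \) is stable for \( |a| < 1 \), but at \( a = 1 \) it becomes the random walk

\[
y_t = y_{t-1} + \varepsilon_t,
\]

which has a unit root: each shock \( \varepsilon_t \) has a permanent effect and the variance of \( y_t \) grows with \( t \), so the series is marginally stable rather than stationary. Differencing, \( \Delta y_t = y_t - y_{t-1} = \varepsilon_t \), yields a stationary series on which standard regression inference is valid.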

  1. Gene F. Franklin; J. David Powell; Abbas Emami-Naeini (2006). Feedback Control of Dynamic Systems (5th ed.). Pearson Education. ISBN 0-13-149930-0.