What exactly is a steady-state solution?

Solution 1:

In different areas, steady state has slightly different meanings, so please be aware of that.

We want a theory to study the qualitative properties of solutions of differential equations, without solving the equations explicitly.

Moreover, we often want to know whether a certain property of these solutions remains unchanged if the system is subjected to various changes (often called perturbations).

It is very important to be able to study how sensitive a particular model is to small perturbations or changes in initial conditions and in various parameters.

This leads us to an area of differential equations called stability analysis, which uses phase-space methods; we consider it for both autonomous and nonautonomous systems under the umbrella of the term equilibrium.

Autonomous

Definition: The equilibrium solution $y_0$ of an autonomous system $y' = f(y)$ is said to be stable if for each number $\varepsilon > 0$ we can find a number $\delta > 0$ (depending on $\varepsilon$) such that if $\psi(t)$ is any solution of $y' = f(y)$ with $\Vert \psi(t_0) - y_0 \Vert < \delta$, then the solution $\psi(t)$ exists for all $t \geq t_0$ and $\Vert \psi(t) - y_0 \Vert < \varepsilon$ for all $t \geq t_0$ (where, for convenience, the norm is the Euclidean distance, which makes neighborhoods spherical).

Definition: The equilibrium solution $y_0$ is said to be asymptotically stable if it is stable and if there exists a number $\delta_0 > 0$ such that if $\psi(t)$ is any solution of $y' = f(y)$ with $\Vert \psi(t_0) - y_0 \Vert < \delta_0$, then $\lim_{t\rightarrow+\infty} \psi(t) = y_0$.

The equilibrium solution ${y_0}$ is said to be unstable if it is not stable.
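To see these definitions in action, here is a minimal numerical sketch (my own illustration, assuming Python/NumPy; the helper `integrate` is hypothetical) for the simplest example $y' = -y$, whose equilibrium $y_0 = 0$ is asymptotically stable: solutions that start within $\delta$ of $0$ stay close to $0$ and tend to $0$ as $t \to \infty$.

```python
# Illustration of (asymptotic) stability for the scalar autonomous system
# y' = -y, which has an asymptotically stable equilibrium at y_0 = 0.
import numpy as np

def f(y):
    return -y  # right-hand side of y' = f(y); equilibrium at y_0 = 0

def integrate(y_start, t_end=10.0, dt=1e-3):
    """Forward-Euler integration of y' = f(y) from t = 0 to t = t_end."""
    y = y_start
    for _ in range(int(t_end / dt)):
        y = y + dt * f(y)
    return y

delta = 0.1                      # start this close to the equilibrium
for y_start in (delta, -delta):
    y_final = integrate(y_start)
    print(f"start {y_start:+.3f} -> y(10) = {y_final:+.6f}")
    # Both trajectories stay within their starting distance of 0 and tend
    # to 0, consistent with asymptotic stability of y_0 = 0.
```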

Equivalent definitions can be written for the nonautonomous system $y' = f(t, y)$.

Now we can add notions of globally asymptotically stable equilibria, regions of asymptotic stability, and so forth.

From all of these definitions, we can prove nice theorems about linear and almost-linear systems by looking at eigenvalues, and we can add notions of conditional stability.
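As a hedged sketch of that eigenvalue test (my own example, not from any particular textbook; the function name and sample matrices are illustrative), for a linear system $y' = Ay$ the origin is asymptotically stable when every eigenvalue of $A$ has negative real part, and unstable if any eigenvalue has positive real part:

```python
# Eigenvalue test for the origin of the linear system y' = A y.
import numpy as np

def classify_origin(A):
    real_parts = np.real(np.linalg.eigvals(A))
    if np.all(real_parts < 0):
        return "asymptotically stable"
    if np.any(real_parts > 0):
        return "unstable"
    # Some eigenvalue has zero real part: the test is inconclusive for
    # almost-linear systems and needs a finer analysis for linear ones.
    return "borderline"

print(classify_origin(np.array([[-1.0,  2.0], [0.0, -3.0]])))  # stable node
print(classify_origin(np.array([[ 0.0,  1.0], [1.0,  0.0]])))  # saddle -> unstable
print(classify_origin(np.array([[ 0.0,  1.0], [-1.0, 0.0]])))  # center -> borderline
```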

Update

You might also want to peruse the web for notes that deal with the above. For example DEQ.

Regards

Solution 2:

Steady state means that some properties of the system are unchanging with respect to time. It is usually reached some time after the process is initiated. The corresponding solutions are the steady-state solutions.
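A small sketch of "unchanging with respect to time" (my own illustration, assuming Python/NumPy and illustrative parameter values): for an autonomous equation $y' = f(y)$, steady states are found by setting $y' = 0$ and solving $f(y) = 0$. For logistic growth $y' = r\,y\,(1 - y/K)$ these are $y = 0$ and $y = K$, and a solution started near $K$ settles onto $y = K$ after the initial transient.

```python
# Steady state of logistic growth: after the transient, y stops changing
# with time and sits at the carrying capacity K.
r, K = 1.5, 10.0

def f(y):
    return r * y * (1.0 - y / K)

y, dt = 2.0, 1e-3
for _ in range(int(20.0 / dt)):   # crude forward-Euler integration to t = 20
    y += dt * f(y)
print(f"y(20) = {y:.4f}   (steady state K = {K})")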

Solution 3:

A steady-state solution is a solution of a differential equation whose value either approaches zero or remains bounded as $t$ approaches infinity. It feels somewhat like a convergent series: either it converges to a value (like $f(t)$ approaching zero as $t$ approaches infinity), or it has something like a radius of convergence (like $f(t)$ being bounded above and below as $t$ goes to infinity). So if the solution of your differential equation has an upper and lower bound, or asymptotically approaches some constant value as $t$ approaches infinity, then it is a steady-state solution. A solution that is unbounded, or has no constant asymptotic behavior, is not steady-state, and feels like a divergent series.
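To make the bounded-versus-unbounded contrast concrete (my own worked example, not part of the original answer), compare

$$y' + y = 1 \;\Longrightarrow\; y(t) = 1 + \bigl(y(0) - 1\bigr)e^{-t} \;\longrightarrow\; 1 \quad (t \to \infty),$$

$$y' = y \;\Longrightarrow\; y(t) = y(0)\,e^{t}, \quad \text{unbounded for } y(0) \neq 0.$$

The first solution approaches the constant $1$, so it is a steady-state solution; the second grows without bound, so it is not.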

DISCLAIMER: I'm using the concept of series to give you a feel for how this works; I'm not implying that series are relevant to steady-state solutions of differential equations. They very well might be, but I have no idea about that myself.

Source: I'm a math & stats professor 🤓

Solution 4:

Example from dynamics: picture a cantilever beam loaded by a force at its tip, say $F(t) = \sin(t)$. At $t=0$ the force is applied and you first get the transient state; after some time the transient dies out and the system settles into the steady state, in which the response repeats from cycle to cycle and no longer changes in character. You can extend this thinking to other differential equations as well. Hope that helps.
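Here is a hedged numerical sketch of that transient-then-steady-state behavior (my own illustration, assuming Python/NumPy; the damping and stiffness values are made up) for a damped oscillator driven by $F(t) = \sin(t)$, i.e. $x'' + c\,x' + k\,x = \sin(t)$. The free (transient) part decays because of damping; what remains is the steady-state sinusoidal response at the driving frequency.

```python
# Transient vs. steady state for x'' + c x' + k x = sin(t), starting from rest.
import numpy as np

c, k = 0.5, 4.0          # illustrative damping and stiffness

def rhs(t, state):
    x, v = state
    return np.array([v, np.sin(t) - c * v - k * x])

# Simple RK4 integration from rest initial conditions up to t = 60.
state, t, dt = np.array([0.0, 0.0]), 0.0, 1e-3
samples = []
while t < 60.0:
    k1 = rhs(t, state)
    k2 = rhs(t + dt / 2, state + dt / 2 * k1)
    k3 = rhs(t + dt / 2, state + dt / 2 * k2)
    k4 = rhs(t + dt, state + dt * k3)
    state = state + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    t += dt
    samples.append(state[0])

# Once the transient has died out, the amplitude settles to the constant
# steady-state value 1 / sqrt((k - 1)^2 + c^2), about 0.33 for these parameters.
print("late-time amplitude =", max(abs(x) for x in samples[-7000:]))
```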