equilibrium distribution, steady-state distribution, stationary distribution and limiting distribution

I was wondering whether equilibrium distribution, steady-state distribution, stationary distribution, and limiting distribution all mean the same thing, or whether there are differences between them.

As far as I know, these terms come up in the context of discrete-time Markov chains. Do they also appear in other areas of stochastic processes and probability?

The Wikipedia page for Markov chains does not make it clear to me how these concepts are defined and used.

Thanks!


(1) You forgot one! In the index to Gregory Lawler's book Introduction to Stochastic Processes (2nd edition) we find

  • equilibrium distribution, see invariant distribution
  • stationary distribution, see invariant distribution
  • steady state distribution, see invariant distribution

All this terminology refers to one concept: a probability distribution that satisfies $\pi=\pi P$. In other words, if you choose the initial state of the Markov chain according to $\pi$, then the process is stationary: if $X_0$ has distribution $\pi$, then $X_n$ has distribution $\pi$ for all $n\geq 0$. Such a $\pi$ exists if and only if the chain has a positive recurrent state. An invariant distribution need not be unique. For example, if the Markov chain has $n<\infty$ states, the collection $\{\pi: \pi=\pi P\}$ is a non-empty simplex in $\mathbb{R}^n$ whose extreme points (corners) correspond to the recurrent classes.
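
To make the non-uniqueness concrete, here is a small numerical sketch in Python with NumPy. The 4-state transition matrix below is a hypothetical example of my own choosing with two recurrent classes, $\{0,1\}$ and $\{2,3\}$; every convex combination of the two class-wise invariant distributions again satisfies $\pi = \pi P$.

```python
import numpy as np

# Hypothetical 4-state chain with two recurrent classes {0, 1} and {2, 3}.
P = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.2, 0.8, 0.0, 0.0],
    [0.0, 0.0, 0.7, 0.3],
    [0.0, 0.0, 0.4, 0.6],
])

# Invariant distributions concentrated on each recurrent class (solved by hand):
pi_A = np.array([2/7, 5/7, 0.0, 0.0])   # invariant for the class {0, 1}
pi_B = np.array([0.0, 0.0, 4/7, 3/7])   # invariant for the class {2, 3}

# Every convex combination t*pi_A + (1-t)*pi_B also satisfies pi = pi P,
# which illustrates the simplex of invariant distributions.
for t in (0.0, 0.5, 1.0):
    pi = t * pi_A + (1 - t) * pi_B
    print(pi, "max |pi P - pi| =", np.abs(pi @ P - pi).max())
```

The extreme points of the simplex are exactly `pi_A` and `pi_B`, one per recurrent class.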

(2) The concept of a limiting distribution is related, but not exactly the same. Suppose that $\pi_j:=\lim_n P_{ij}^n$ exists and does not depend on $i$. These are called limiting probabilities, and the vector $\pi:=(\pi_1,\dots,\pi_n)$ will satisfy $\pi=\pi P$. So a limiting distribution (if it exists) is always invariant. Limiting probabilities exist when the chain is irreducible, positive recurrent, and aperiodic.
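
You can see this convergence numerically. The sketch below uses a hypothetical irreducible, aperiodic 3-state chain (again my own example): the rows of $P^n$ all converge to the same vector $\pi$, which is the limiting distribution and is invariant.

```python
import numpy as np

# Hypothetical irreducible, aperiodic 3-state chain: the rows of P^n converge
# to a common vector pi, the limiting (and hence invariant) distribution.
P = np.array([
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
    [0.5, 0.2, 0.3],
])

Pn = np.linalg.matrix_power(P, 50)
print("P^50 (all rows approximately equal):")
print(np.round(Pn, 6))

pi = Pn[0]                                   # any row approximates the limiting distribution
print("check pi P - pi:", np.round(pi @ P - pi, 10))
```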

A typical case where the limiting distribution fails to exist is when the chain is periodic. For instance, for the two-state chain with transition matrix $P=\begin{pmatrix}0&1\\ 1&0\end{pmatrix}$, the unique invariant distribution is $\pi=(1/2,1/2)$, but $P_{ij}^n$ alternates between $0$ and $1$, so it fails to converge.
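
A quick sketch of that alternation, continuing with NumPy:

```python
import numpy as np

# The periodic two-state chain from above: P^n alternates between P and the
# identity, so lim_n P^n does not exist, even though pi = (1/2, 1/2) is invariant.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

for n in range(1, 6):
    print(f"P^{n} =\n{np.linalg.matrix_power(P, n)}")

pi = np.array([0.5, 0.5])
print("pi P =", pi @ P)   # invariant: pi P = pi
```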

I'm not sure that all authors use these terms in the same way, so you will want to be careful when reading other books.