Equilibrium distributions of Markov Chains

I often get confused about

  1. when a Markov chain has an equilibrium distribution;
  2. when this equilibrium distribution is unique;
  3. which starting states converge to the equilibrium distribution; and
  4. how finite and countably infinite Markov chains differ with respect to the above.

(Google isn't quite clearing up my confusion.) Is the following correct/am I missing anything?

An irreducible Markov chain (finite or countably infinite) has a unique equilibrium distribution if and only if all states are positive recurrent. (What about reducible Markov chains? A reducible Markov chain has a non-unique equilibrium distribution iff all states are positive recurrent?) However, not all starting states necessarily converge to the unique equilibrium, unless the Markov chain is also aperiodic; that is, an irreducible Markov chain converges to its unique equilibrium regardless of initial state, if and only if all states are positive recurrent and aperiodic.


Solution 1:

For a Markov chain with $N<\infty$ states, the set $I$ of invariant probability vectors is a non-empty simplex in ${\mathbb R}^N$ whose extreme points correspond to the recurrent classes of the chain. Thus, the vector is unique iff there is exactly one recurrent class; the transient states (if any) play absolutely no role (as in Jens's example). The set $I$ is a point, line segment, triangle, etc. exactly when there are one, two, three, etc. recurrent classes.
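To make the geometry concrete, here is a small numerical sketch (the 4-state matrix is a hypothetical example chosen for illustration): a reducible chain with two recurrent classes, whose invariant simplex $I$ is the line segment between the two class-wise stationary vectors.

```python
import numpy as np

# Hypothetical reducible 4-state chain with two recurrent classes, {0,1} and {2,3}.
P = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.3, 0.7, 0.0, 0.0],
    [0.0, 0.0, 0.2, 0.8],
    [0.0, 0.0, 0.6, 0.4],
])

# Invariant vector of each recurrent class: the extreme points of the simplex I.
pi_A = np.array([3/8, 5/8, 0.0, 0.0])   # stationary for the {0,1} block
pi_B = np.array([0.0, 0.0, 3/7, 4/7])   # stationary for the {2,3} block
assert np.allclose(pi_A @ P, pi_A)
assert np.allclose(pi_B @ P, pi_B)

# Any convex combination is again invariant: I is the segment [pi_A, pi_B],
# so the invariant vector is non-unique exactly because there are two classes.
c = 0.3
pi = c * pi_A + (1 - c) * pi_B
assert np.allclose(pi @ P, pi)
```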

If the invariant vector $\pi$ is unique, then there is only one recurrent class and the chain will eventually end up there. The vector $\pi$ necessarily puts zero mass on all transient states. Letting $\phi_n$ be the law of $X_n$, as you say, we have $\phi_n\to \pi$ only if the recurrent class is aperiodic. However, in general we have Cesàro convergence: $${1\over n}\sum_{j=1}^n \phi_j\to\pi.$$
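The distinction between convergence of $\phi_n$ and Cesàro convergence can be seen on the standard period-2 flip chain (a minimal sketch, not tied to any particular example above):

```python
import numpy as np

# Period-2 chain on {0,1}: it deterministically flips state, so phi_n
# alternates between delta_0 and delta_1 and never converges to pi.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
pi = np.array([0.5, 0.5])          # the unique invariant vector

phi = np.array([1.0, 0.0])         # start in state 0
cesaro = np.zeros(2)
n = 1000
for _ in range(n):
    phi = phi @ P
    cesaro += phi
cesaro /= n

# phi_n itself oscillates, but the running (Cesàro) average converges to pi.
assert not np.allclose(phi, pi)
assert np.allclose(cesaro, pi)
```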

An infinite state space Markov chain need not have any recurrent states, and may have the zero measure as the only invariant measure, finite or infinite. Consider the chain on the positive integers which jumps to the right at every time step.
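One way to see the mass escaping to infinity numerically (a sketch; the truncation to the first $N$ states is my own device for inspecting the infinite chain):

```python
import numpy as np

# Deterministic right shift on {1, 2, 3, ...}: from state k, jump to k+1.
# Truncate to the first N states; probability that passes state N is simply
# dropped, which mirrors mass escaping to infinity in the full chain.
N = 50
phi = np.zeros(N)
phi[0] = 1.0                       # start in state 1

for _ in range(N):
    phi = np.roll(phi, 1)          # shift all mass one state to the right
    phi[0] = 0.0                   # nothing ever moves back left

# After N steps all probability has left every fixed finite set of states,
# so no invariant *probability* vector can exist (only the zero measure).
assert phi.sum() == 0.0
```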

Generally, a Markov chain with countable state space has invariant probabilities iff there are positive recurrent classes. If so, every invariant probability vector $\nu$ is a convex combination of the unique invariant vector $m_j$ corresponding to each positive recurrent class $j\in J$, i.e., $$\nu=\sum_{j\in J} c_j m_j,\qquad c_j\geq 0,\quad \sum_{j\in J}c_j=1.$$

This result is Corollary 3.23 in Wolfgang Woess's Denumerable Markov Chains.

Solution 2:

The answers you have given are true at least for finite Markov chains. (My courses did not cover any others, I am afraid. And all references I have are German, so of limited use to you =) ).

The part

  A reducible Markov chain has a non-unique equilibrium distribution iff all states are positive recurrent?

is not true. Consider the Markov chain on the states 0 and 1 that goes from 0 to 1 with probability 1 and then stays there. It has a unique equilibrium distribution ($\delta_1$), even though state 0 is not positive recurrent (it is transient).
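This two-state counterexample is small enough to check directly (a minimal sketch of the chain just described):

```python
import numpy as np

# The counterexample: from state 0 go to state 1 with probability 1, then stay.
P = np.array([[0.0, 1.0],
              [0.0, 1.0]])

# delta_1 is invariant...
pi = np.array([0.0, 1.0])
assert np.allclose(pi @ P, pi)

# ...and it is the only invariant probability vector: solve pi P = pi
# together with the normalization sum(pi) = 1 as a linear system.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
sol, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(sol, pi)

# State 0 is transient (the chain leaves it immediately and never returns),
# so uniqueness holds without every state being positive recurrent.
```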