Start with a Markov matrix $\mathbf{M}$, whose elements satisfy $0 \le \mathbf{M}_{ij} \le 1$ and whose rows each sum to one. There is a natural connection between this matrix and the rate matrix $\mathbf{W}$ in the Master Equation

$$ \mathbf{M} = \exp( t \mathbf{W} ) $$

Here, given $\mathbf{W}$ and $t$, the calculation of $\mathbf{M}$ is unambiguous, since the matrix exponential is unique and its Taylor series always converges. What about the other direction?

$$ t \mathbf{W} = \log( \mathbf{M} ) $$

Do the properties of the Markov matrix guarantee that this logarithm is unique and that the alternating Taylor series for the log converges?
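
For a quick numerical illustration (a minimal sketch, assuming NumPy/SciPy; the particular generator below is arbitrary), the forward direction can be computed with `scipy.linalg.expm` and the reverse attempted with `scipy.linalg.logm`, which returns a principal matrix logarithm:

```python
import numpy as np
from scipy.linalg import expm, logm

# An arbitrary 3-state generator: off-diagonal rates >= 0, rows sum to zero.
W = np.array([[-0.5,  0.3,  0.2],
              [ 0.1, -0.4,  0.3],
              [ 0.2,  0.2, -0.4]])
t = 1.0

# Forward direction: M = exp(tW) is a Markov matrix (rows sum to one).
M = expm(t * W)
print(M.sum(axis=1))          # ~[1, 1, 1]

# Reverse direction: the principal matrix logarithm divided by t.
W_rec = logm(M) / t
print(np.allclose(W_rec, W))  # True for this particular W
```

Whether the recovered matrix is a genuine generator (zero row sums, nonnegative off-diagonal entries), and whether the principal branch is the right one, is exactly what the question asks; the round trip happens to work for this particular $\mathbf{W}$, but that is not guaranteed in general.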

Please provide a reference where this is discussed if possible!

Motivation (by request)

I've been studying the dynamics associated with an Ising-model-type system under single-spin-flip Glauber dynamics. Glauber dynamics essentially gives the $\mathbf{W}$ matrix. If one were to observe the system over a finite time, an approximation to $\mathbf{M}$ could be made (a small sketch of this setup is given after the quote below). I was interested in when it was permissible to convert between the two. In the reference provided by one of the answers, the question boils down to:

In probabilistic terms a Markov matrix A is embeddable if it is obtained by taking a snapshot at a particular time of an autonomous finite state Markov process that develops continuously in time. On the other hand a Markov matrix might not be embeddable if it describes the annual changes in a population that has a strongly seasonal breeding pattern; in such cases one might construct a more elaborate model that incorporates the seasonal variations. Embeddability may also fail because the matrix entries are not accurate; in such cases a regularization technique might yield a very similar Markov matrix that is embeddable;
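
To make the Glauber setup concrete, here is a minimal sketch (assuming NumPy/SciPy; the three-spin periodic chain and the values of $J$, $\beta$, the attempt rate $\alpha$ and the observation time $t$ are illustrative choices only): enumerate the $2^N$ spin configurations, fill in the single-spin-flip Glauber rates as the off-diagonal entries of $\mathbf{W}$, and take $\mathbf{M} = \exp(t\mathbf{W})$ as the finite-time snapshot.

```python
import numpy as np
from itertools import product
from scipy.linalg import expm

N, J, beta, alpha, t = 3, 1.0, 0.5, 1.0, 0.2   # illustrative parameters
states = [np.array(s) for s in product([-1, 1], repeat=N)]

# Glauber single-spin-flip rate: w_k(sigma) = (alpha/2) * (1 - sigma_k * tanh(beta * h_k)),
# with the local field h_k coming from the two neighbours on a periodic chain.
def flip_rate(sigma, k):
    h = J * (sigma[(k - 1) % N] + sigma[(k + 1) % N])
    return 0.5 * alpha * (1.0 - sigma[k] * np.tanh(beta * h))

n = len(states)
index = {tuple(s): i for i, s in enumerate(states)}
W = np.zeros((n, n))
for i, sigma in enumerate(states):
    for k in range(N):
        tau = sigma.copy()
        tau[k] *= -1                        # flip spin k
        W[i, index[tuple(tau)]] = flip_rate(sigma, k)
    W[i, i] = -W[i].sum()                   # rows of a generator sum to zero

M = expm(t * W)                             # finite-time "snapshot" of the dynamics
print(np.allclose(M.sum(axis=1), 1.0))      # True: M is a Markov matrix
```

Feeding an estimated $\mathbf{M}$ of this kind back through a matrix logarithm is exactly the embeddability question addressed in the answers below.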


Solution 1:

This is an old subject which goes back at least to Elfving in the '30s. Kingman got interested in it in the '60s and more recently Israel, Rosenthal and Wu studied it with an eye to the practical side of the problem.

Markov matrices $M$ which can be written as $M=\exp(W)$ for a generator matrix $W$ are called embeddable because then there exists a whole semigroup $(M_t)$ of Markov matrices such that $M=M_1$ (simply define $M_t=\exp(tW)$ for every nonnegative real number $t$). Recall that $W$ is a generator if each of its rows sums to zero and every off-diagonal element is nonnegative.

There are some obvious obstructions to a given Markov matrix $M$ being embeddable. For example, the set of pairs of states $(x,y)$ such that $(M^n)_{xy}=0$ cannot depend on $n\ge1$. It may also happen that the matrix $V=\log(M)$, defined by applying the usual series expansion of $x\mapsto\log(1+x)$ at $x=0$ to $x=M-I$, is not a generator even though $M=\exp(V)$ (I seem to remember this can happen even when $M$ is embeddable).
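
As a rough numerical check of this second obstruction (a sketch assuming SciPy, with `scipy.linalg.logm` used in place of the series, with which it coincides when the series converges; the Markov matrix below is an arbitrary illustration):

```python
import numpy as np
from scipy.linalg import logm

def is_generator(V, tol=1e-10):
    """Check the generator conditions: rows sum to zero, off-diagonal entries nonnegative."""
    off_diag = V - np.diag(np.diag(V))
    rows_sum_to_zero = np.allclose(V.sum(axis=1), 0.0, atol=tol)
    off_diag_nonneg = (off_diag >= -tol).all()
    return rows_sum_to_zero and off_diag_nonneg

M = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.9, 0.1],
              [0.1, 0.0, 0.9]])          # an arbitrary Markov matrix, for illustration
V = np.real_if_close(logm(M))            # principal matrix logarithm
# Prints False here: the principal log of this M has a small negative off-diagonal entry.
print(is_generator(V))
```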

Except in dimension $2$ (and possibly dimension $3$, but I do not remember), no complete characterization of embeddability is available, but much is known. I would suggest beginning with the recent paper Embeddable Markov Matrices by E. B. Davies. It is available on the arXiv and provides useful references to some previous works, including the ones I mentioned in this post.

A related problem is to determine when the equation $M=\exp(W)$ has several admissible solutions $W$; on this, see the paper by Speakman in the list of references of Davies's paper.

Solution 2:

In general there is no $W$ with real coefficients. Here is a random example from here: $M=\pmatrix{1/2&1/4&1/4\\ 1/6&1/6&2/3 \\ 1/3&1/3&1/3}$ has eigenvalues $1$ and $\pm1/\sqrt{24}$. If $M=\exp(W)$, then, modulo $2\pi i\mathbb{Z}$, the eigenvalues of $W$ must equal $0$, $-\tfrac{1}{2}\log(24)$ and $-\tfrac{1}{2}\log(24)+\pi i$. If $W$ is real, its eigenvalues are either real or come in complex-conjugate pairs; here the third eigenvalue is necessarily non-real (its imaginary part is an odd multiple of $\pi$), yet its complex conjugate cannot be one of the other two eigenvalues, which lie in the classes of $0$ and $-\tfrac{1}{2}\log(24)$, so no real $W$ exists.
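
The eigenvalue claim is easy to verify numerically (a small sketch, assuming NumPy):

```python
import numpy as np

M = np.array([[1/2, 1/4, 1/4],
              [1/6, 1/6, 2/3],
              [1/3, 1/3, 1/3]])
print(np.sort(np.linalg.eigvals(M)))   # approx [-0.2041, 0.2041, 1.0], i.e. 1 and +/- 1/sqrt(24)
print(1 / np.sqrt(24))                 # 0.2041...
```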