How to compute a time-ordered exponential?
Solution 1:
One example of where such ordered exponentials arise is in computing the solution to ODEs with time-dependent coefficients: $$ \frac{d\mathbf{x}(t)}{dt} = A(t) \mathbf{x}(t), $$ where $\mathbf{x}(t)=(x_1(t), \ldots, x_n(t))$ and $A(t)$ is an $n\times n$ matrix. The general solution is $$ \begin{equation} \mathbf{x}(t) = \mathcal{T}\{ e^{\int_0^{t} A(t') dt'}\} \mathbf{x}(0), \end{equation} $$ where $\mathcal{T}$ denotes time-ordering,
$\mathcal{T}\{ e^{\int_0^{t} A(t') dt'}\} \equiv \sum\limits_{n=0}^{\infty} \frac{1}{n!}\int_0^t dt_1' \ldots \int_0^t dt_n'\, \mathcal{T}\{A(t_1') \ldots A(t_{n}') \} = \sum\limits_{n=0}^{\infty} \int_0^t dt_1' \int_0^{t_1'} dt_2' \ldots \int_0^{t_{n-1}'} dt_n'\, A(t_1') \ldots A(t_{n}')$.
Only when the matrices at different times commute, i.e. $[A(t_1), A(t_2)]=0$ for all $t_1,t_2$, does the time-ordered expression simplify to the ordinary exponential $e^{\int_0^{t} A(t') dt'}$.
Evaluating such exponentials is an arduous task: even ordinary, time-independent matrix exponentials are hard enough to warrant articles such as the one mentioned in René's answer. A quick search of the literature revealed a few approaches. Perturbatively, the solution is given by a Magnus series, and there are various papers describing methods to approximate it or, in special cases, evaluate it exactly:
- The Magnus expansion and some of its applications
- Decomposition of Time-Ordered Products and Path-Ordered Exponentials
- An Exact Formulation of the Time-Ordered Exponential using Path-Sums
The methods typically involve heavy mathematics, and with my limited understanding it would not be appropriate for me to describe them here.
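Numerically, though, the simplest approach follows directly from the definition above: discretise $[0,t]$ and multiply short-time exponentials, with later times acting on the left. A minimal Python sketch (using NumPy/SciPy; the function name and step count are my own choices, not from any of the papers listed):

```python
import numpy as np
from scipy.linalg import expm

def time_ordered_exp(A, t, steps=1000):
    """Approximate T{exp(int_0^t A(t') dt')} by a product of
    short-time exponentials, evaluating A at each step's midpoint."""
    dt = t / steps
    E = np.eye(A(0.0).shape[0])
    for k in range(steps):
        # Later times multiply on the left: this enforces the time ordering.
        E = expm(dt * A((k + 0.5) * dt)) @ E
    return E
```

When the matrices at different times commute this reproduces $e^{\int_0^t A(t')\,dt'}$ exactly (up to quadrature error); in the noncommuting case it converges to the time-ordered exponential as `steps` grows.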
Solution 2:
You have probably seen this before, but in any case the following reference explores some of the ways to calculate the matrix exponential: "Nineteen Dubious Ways to Compute the Exponential of a Matrix".
Hope it helps!
Solution 3:
There seems to be some confusion here between two different problems. The first is the computation of a time-ordered (also called path-ordered) exponential, a matrix $E(t,t')$ solving $\frac{d}{dt}E(t,0) = M(t)E(t,0)$ with $E(0,0)=I$; the second is the computation of the matrix exponential of a time-dependent matrix, $\exp(A(t))$.
Recall that the first and second problems coincide whenever $M$ commutes with itself at different times. Indeed, if $M(t')M(t)-M(t)M(t')=0$ for all $t,t'$, then $E(t)=\exp(\int_0^t M(\tau)d\tau)$, and so the time-ordered exponential is an ordinary matrix exponential. It is then straightforward to compute with the usual expm (Matlab) or MatrixExp (Mathematica) functions and the like.
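For instance (a hypothetical example of my own), $M(t)=\cos(t)\,M_0$ with constant $M_0$ commutes with itself at all times, so the time-ordered exponential reduces to an ordinary `expm` of the integral:

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad

# M(t) = cos(t) * M0 satisfies M(t)M(t') = M(t')M(t) for all t, t'.
M0 = np.array([[0.0, 1.0],
               [-1.0, 0.0]])

def E(t):
    # E(t) = expm(int_0^t M(tau) dtau) = expm(sin(t) * M0)
    integral, _ = quad(np.cos, 0.0, t)
    return expm(integral * M0)
```

Since this $M_0$ generates rotations, $E(t)$ is simply the rotation matrix through the angle $\sin(t)$.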
If the matrix $M$ does not commute with itself at different times, then $E(t)$ is not an ordinary matrix exponential. As noted in Yiteng's response, the problem is then much more difficult and admits few analytical answers. As far as I can tell, since Magnus series are impossible (barring rare exceptions?) to compute exactly to all orders, the only analytical approach is the path-sum formulation, which looks quite difficult. If a numerical answer is sufficient for you, then you can always Magnus the hell out of it.
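To see the difference concretely, here is a small numerical experiment (my own hypothetical example): for $M(t)=X+tY$ with $[X,Y]\neq 0$, a stepwise first-order Magnus scheme (exponentiate $M$ at each step's midpoint and multiply) visibly disagrees with the naive $\exp(\int_0^1 M(\tau)\,d\tau)$:

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0.0, 1.0], [0.0, 0.0]])
Y = np.array([[0.0, 0.0], [1.0, 0.0]])
M = lambda t: X + t * Y          # [M(t), M(t')] = (t' - t) [X, Y] != 0

def ordered_exp(t, steps=2000):
    """Stepwise first-order Magnus: product of midpoint exponentials."""
    dt = t / steps
    E = np.eye(2)
    for k in range(steps):
        E = expm(dt * M((k + 0.5) * dt)) @ E
    return E

naive = expm(X + 0.5 * Y)        # expm of int_0^1 M(tau) dtau
exact = ordered_exp(1.0)         # differs from naive at order [X, Y]
```

The discrepancy is exactly the second Magnus term, $\Omega_2 = \tfrac12\int_0^1\!\!\int_0^{t_1}[M(t_1),M(t_2)]\,dt_2\,dt_1 = -\tfrac{1}{12}[X,Y]$, which vanishes only in the commuting case.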
Solution 4:
I note with astonishment that, once again, the current answers have only a vague relation to the asked question.
What follows is an example that answers the question "how to calculate $\exp(A(t))$?" An exact calculation is possible only if one knows explicitly the eigenvalues and eigenvectors as functions of $t$, which is generally hopeless; it is even more infeasible if $A(t)$ has multiple eigenvalues for some values of $t$ (because of the instability of the calculation of Jordan forms). A numerical approximation can be obtained as follows; it is not the most efficient method, but it is the simplest one.
Let $A(t)=\begin{pmatrix}\cos(t)&1+t^2-t&\sin(t)\\t^2-2t&\tan(t)&-t^3+1\\2t^2-1&\log(1+t)&t^4+1\end{pmatrix}$. We seek $\exp(A(t))$ for $t\in[0,1]$.
Step 1. For every $i=0,\cdots,10$, we calculate $\exp(A(i/10))$.
Let $U=A(i/10)$; we calculate $\exp(U)$ as follows. Let $V=U/1024$ (note that $1024=2^{10}$);
then $\exp(V)\approx R=I+V+\cdots+\frac{1}{10!}V^{10}$, and $\exp(U)=(\exp(V))^{1024}$ is obtained by squaring $R$ ten times:
for k from 1 to 10 do
R:=R^2:
od:
For example $\exp(A(0.6))[1,1]\approx 1.2717213247490905765$.
Step 2. For each $j,k$, we calculate a polynomial interpolation of $t\mapsto\exp(A(t))[j,k]$ through the sampled values $\exp(A(i/10))[j,k]$.
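The two steps above can be sketched in Python (NumPy only; the $1024$ scaling factor and degree-10 Taylor truncation follow the description, while the function names are my own):

```python
import numpy as np

def my_expm(U, s=10, order=10):
    """exp(U) by scaling and squaring: V = U / 2**s, Taylor-sum
    exp(V) up to the given order, then square the result s times."""
    V = U / 2.0**s                     # 2**10 = 1024
    R = np.eye(U.shape[0])
    term = np.eye(U.shape[0])
    for k in range(1, order + 1):
        term = term @ V / k            # running term V**k / k!
        R = R + term
    for _ in range(s):                 # exp(U) = exp(V)**(2**s)
        R = R @ R
    return R

def A(t):
    return np.array([
        [np.cos(t),    1 + t**2 - t,   np.sin(t)],
        [t**2 - 2*t,   np.tan(t),      -t**3 + 1],
        [2*t**2 - 1,   np.log(1 + t),  t**4 + 1],
    ])

# Step 2 (sketch): sample exp(A(t)) at the nodes t = i/10 and
# interpolate each entry by a polynomial through the 11 samples.
ts = np.linspace(0.0, 1.0, 11)
samples = np.array([my_expm(A(t)) for t in ts])      # shape (11, 3, 3)
coeffs = np.polyfit(ts, samples[:, 0, 0], deg=10)    # entry [1,1]
```

Evaluating the fitted polynomial with `np.polyval(coeffs, t)` then gives a cheap approximation of that entry for any $t\in[0,1]$; a degree-10 fit through 11 equispaced nodes is ill-conditioned, so in practice one would prefer Chebyshev nodes or a spline.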