Deducing the exact solution of an ODE

On page 53 of Arieh Iserles's A First Course in the Numerical Analysis of Differential Equations, he presents the following ODE:

$(\vec{y})'=\Gamma\cdot\vec{y}$,

$\vec{y}(0)=\vec{y_0}$

Using the fact that the spectral factorization of $\Gamma$ gives us $\Gamma=VDV^{-1}$, where $D$ is the diagonal matrix containing the eigenvalues and $V$ is the corresponding eigenvector matrix, he deduces (without showing the working) that the exact solution of the ODE is: $\vec{y}(t)=e^{t\Gamma}\vec{y_0} = Ve^{tD}V^{-1}\vec{y_0}$

  1. In order to get $\vec{y}(t)=e^{t\Gamma}$ it looks like he did: $\displaystyle \int \frac{(\vec{y})'}{\vec{y}} dt = \int \Gamma dt$. But how can he do this when $\vec{y}$ is a vector?

  2. Using the spectral factorization, $e^{t\Gamma} = e^{tVDV^{-1}}$. How did he get $Ve^{tD}V^{-1}\vec{y_0}$ from this?


Solution 1:

It looks like you are confused by the formal procedure. Don't write $\int \frac{(\vec{y})'}{\vec{y}}dt=\int \Gamma dt$; it doesn't make much sense in this form unless you specify exactly what each of these operations means for vectors.

It's much simpler than that. Substitute $\vec{w}=V^{-1}\vec{y}$. Then $(\vec{w})'=V^{-1}(\vec{y})'=V^{-1}\Gamma\vec{y}=V^{-1}VDV^{-1}\vec{y}=D\vec{w}$, so your set of differential equations reads as

$$(\vec{w})'=D\vec{w}$$

Because $D$ is diagonal, the components are not coupled, and the above is a set of $n$ independent scalar differential equations, each of which can be integrated separately: $w_i(t)=e^{t\lambda_i}w_i(0)$, i.e. $\vec{w}(t)=e^{tD}\vec{w}(0)$. Transforming back gives $\vec{y}(t)=V\vec{w}(t)=Ve^{tD}V^{-1}\vec{y_0}$.

Additionally, it's perfectly fine to solve by Ansatz without integration, in which case you avoid the ambiguous expression altogether.
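As a sanity check, the substitution above can be carried out numerically. This is a minimal sketch with a hypothetical $2\times 2$ matrix $\Gamma$ (not from the book), using numpy and scipy: solving the decoupled system in the $\vec{w}$ variables and transforming back agrees with the direct matrix exponential.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical example: Gamma with distinct real eigenvalues -1 and -2.
Gamma = np.array([[0.0, 1.0],
                  [-2.0, -3.0]])
evals, V = np.linalg.eig(Gamma)
y0 = np.array([1.0, 0.0])
t = 0.5

# Substitute w = V^{-1} y: each component of w evolves independently,
# w_i(t) = exp(t * lambda_i) * w_i(0).
w0 = np.linalg.solve(V, y0)
w_t = np.exp(t * evals) * w0
y_t = V @ w_t

# Compare with the direct matrix exponential e^{t Gamma} y0.
y_direct = expm(t * Gamma) @ y0
print(np.allclose(y_t, y_direct))  # True
```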

Solution 2:

The answer to the first question I present below was copied from this answer of mine, and it works whether or not the matrix is diagonalizable.

Given $n\in \mathbb N$, $A\in \mathbb R^{n\times n}$, a nontrivial interval $I$, $t_0\in I$, $y_0\in \mathbb R^n$, and a continuous function $b\colon I\to \mathbb R^n$, consider the initial value problem $$y'+Ay=b, \qquad y(t_0)=y_0.$$

Let $f\colon I\to\mathbb R^n$ be a differentiable function.

Fact: For all $t\in \mathbb R$, $e^{At}$ is invertible and $\left(e^{At}\right)^{-1}=e^{-At}$.

There exists $C\in \mathbb R^n$ such that for all $t\in I$ the following holds: $$\begin{align} f'(t)+Af(t)=b(t)&\iff e^{At}(f'(t)+Af(t))=e^{At}b(t)\\ &\iff e^{At}f'(t)+e^{At}Af(t)=e^{At}b(t)\\ &\iff e^{At}f'(t)+Ae^{At}f(t)=e^{At}b(t)\\ &\iff \int \limits _{t_0}^te^{As}f'(s)+Ae^{As}f(s)\mathrm ds=\int \limits_{t_0}^te^{As}b(s)\mathrm ds+C\\ &\iff e^{At}f(t)=\int \limits_{t_0}^te^{As}b(s)\mathrm ds+C\\ &\iff f(t)=e^{-At}\int \limits_{t_0}^te^{As}b(s)\mathrm ds+e^{-At}C. \end{align}$$

Taking into account $f(t_0)=y_0$, some simple calculations (evaluate the second-to-last line at $t=t_0$, where the integral vanishes) give $C=e^{At_0}y_0$.
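The closed form can be spot-checked numerically. The sketch below uses a hypothetical $A$, $b$, and $y_0$ (none from the original); for a constant $b$ and invertible $A$, the integral has the closed form $\int_{t_0}^t e^{As}b\,\mathrm ds = A^{-1}(e^{At}-e^{At_0})b$, so both the initial condition and the ODE $f'+Af=b$ can be verified directly.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical data (not from the answer above).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
b = np.array([1.0, -1.0])
y0 = np.array([0.5, 2.0])
t0 = 0.0

C = expm(A * t0) @ y0

def f(t):
    # f(t) = e^{-At} ( integral_{t0}^t e^{As} b ds + C ), with the
    # integral in closed form since b is constant and A is invertible.
    integral = np.linalg.solve(A, (expm(A * t) - expm(A * t0)) @ b)
    return expm(-A * t) @ (integral + C)

# The initial condition f(t0) = y0 is satisfied ...
print(np.allclose(f(t0), y0))  # True

# ... and f' + A f = b, with f' approximated by a central difference.
t, h = 0.7, 1e-6
fprime = (f(t + h) - f(t - h)) / (2 * h)
print(np.allclose(fprime + A @ f(t), b, atol=1e-5))  # True
```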


As for the second question, just use the definition of matrix exponential together with what was done above: $$\begin{align} e^{t\Gamma }&=\sum \limits_{n=0}^\infty\left(\dfrac {t^n} {n!}\Gamma^n\right)\\ &=\sum \limits_{n=0}^\infty\left(\dfrac {t^n} {n!}\left(VDV^{-1}\right)^n\right) \\ &=\sum \limits_{n=0}^\infty\left(\dfrac {t^n} {n!}VD^nV^{-1}\right)\\ &=V\sum \limits_{n=0}^\infty\left(\dfrac {t^n} {n!}D^n\right)V^{-1}\\ &=Ve^{tD}V^{-1}. \end{align}$$
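The series identity $e^{t\Gamma}=Ve^{tD}V^{-1}$ is also easy to confirm numerically. A minimal sketch, assuming a hypothetical diagonalizable $\Gamma$ (a symmetric matrix is used here so diagonalizability is guaranteed):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical symmetric (hence diagonalizable) Gamma.
Gamma = np.array([[1.0, 2.0],
                  [2.0, 1.0]])
evals, V = np.linalg.eig(Gamma)
t = 0.3

# Left side: matrix exponential computed directly by scipy.
lhs = expm(t * Gamma)
# Right side: V e^{tD} V^{-1}, where e^{tD} is diagonal with entries e^{t*lambda_i}.
rhs = V @ np.diag(np.exp(t * evals)) @ np.linalg.inv(V)

print(np.allclose(lhs, rhs))  # True
```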