How to prove $e^{A \oplus B} = e^A \otimes e^B$ where $A$ and $B$ are matrices? (Kronecker operations)
How to prove that $e^{A \oplus B} = e^A \otimes e^B$? Here $A$ and $B$ are $n\times n$ and $m \times m$ matrices, $\otimes$ is the Kronecker product and $\oplus$ is the Kronecker sum: $$ A \oplus B = A\otimes I_m + I_n\otimes B, $$ where $I_m$ and $I_n$ are the identity matrices of size $m\times m$ and $n\times n$, respectively.
EDIT: Actually, the page http://mathworld.wolfram.com/KroneckerSum.html states that this property is true.
http://digitalcommons.unf.edu/cgi/viewcontent.cgi?article=1025&context=etd
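For a quick empirical sanity check before working through the proofs, here is a small numerical sketch, assuming NumPy and SciPy are available (the sizes and random seed are arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n, m = 3, 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((m, m))

# Kronecker sum: A (x) I_m + I_n (x) B
kron_sum = np.kron(A, np.eye(m)) + np.kron(np.eye(n), B)

lhs = expm(kron_sum)             # e^{A (+) B}
rhs = np.kron(expm(A), expm(B))  # e^A (x) e^B
print(np.allclose(lhs, rhs))     # True, up to floating-point error
```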
Solution 1:
What is to be proved is the following: $$ e^{A \otimes I_b + I_a \otimes B} = e^A \otimes e^B~, $$ where $I_a, A \in M_n$ and $I_b, B \in M_m$.
This is true because $$ A \otimes I_b~~~~\text{and}~~~~ I_a \otimes B$$ commute, which can be shown using the so-called mixed-product property of the Kronecker product: $$ (A \otimes B)\cdot (C \otimes D) = (A\cdot C) \otimes (B\cdot D)~, $$ where $\cdot$ denotes the ordinary matrix product.
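Spelling this out, the mixed-product property gives the commutation in one line: $$ (A \otimes I_b)\cdot(I_a \otimes B) = (A\cdot I_a)\otimes(I_b\cdot B) = A \otimes B = (I_a\cdot A)\otimes(B\cdot I_b) = (I_a \otimes B)\cdot(A \otimes I_b)~. $$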
One can also show that for any function $f$ defined by a power series, $$f(A\otimes I_b) = f(A)\otimes I_b~~~~\text{and}~~~ f(I_a \otimes B) = I_a \otimes f(B)~.$$ Together with the commutativity mentioned above, this proves the result.
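Assembling these two facts (a sketch of the final step): $$ e^{A\otimes I_b + I_a\otimes B} = e^{A\otimes I_b}\, e^{I_a\otimes B} = \left(e^A \otimes I_b\right)\left(I_a \otimes e^B\right) = e^A \otimes e^B~, $$ where the first equality uses $e^{X+Y}=e^X e^Y$ for commuting $X,Y$ (proved in Solution 2), the second uses the function property with $f=\exp$, and the third is the mixed-product property once more.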
Solution 2:
If $A$ and $B$ are commuting $n\times n$ matrices, then by the Taylor expansion we have:
$$e^A=\sum_{k=0}^{\infty}\frac{A^k}{k!}$$
Therefore:
$$e^Ae^B=\sum_{k_1=0}^{\infty}\frac{A^{k_1}}{k_1!}\sum_{k_2=0}^{\infty}\frac{B^{k_2}}{k_2!}$$
$$\Rightarrow e^Ae^B=\sum_{k_1=0}^{\infty}\sum_{k_2=0}^{\infty}\frac{A^{k_1}}{k_1!}\frac{B^{k_2}}{k_2!}$$
$$\Rightarrow e^Ae^B=\sum_{k_1=0}^{\infty}\sum_{k_2=0}^{\infty}\frac{(k_1+k_2)!}{(k_1+k_2)!}\frac{A^{k_1}}{k_1!}\frac{B^{k_2}}{k_2!}$$
$$\Rightarrow e^Ae^B=\sum_{k_1=0}^{\infty}\sum_{k_2=0}^{\infty}\frac{1}{(k_1+k_2)!}\binom{k_1+k_2}{k_2}A^{k_1}B^{k_2}$$
Set $k=k_1+k_2$ and collect the terms with a fixed $k$; since $A$ and $B$ commute, the binomial theorem applies:
$$\Rightarrow e^Ae^B=\sum_{k=0}^{\infty}\frac{1}{k!}\sum_{k_2=0}^{k}\binom{k}{k_2}A^{k-k_2}B^{k_2}=\sum_{k=0}^{\infty}\frac{(A+B)^{k}}{k!}=e^{A+B}$$
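To get the Kronecker identity, apply this with the commuting matrices $A\otimes I_m$ and $I_n\otimes B$ (Solution 1), which gives $e^{A\oplus B}=e^{A\otimes I_m}\,e^{I_n\otimes B}$; the function property from Solution 1 then finishes the proof. A small numerical sketch of the commuting-case identity, assuming NumPy and SciPy (the size, seed, and choice $B=A^2$ are arbitrary):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = A @ A  # any polynomial in A commutes with A

# Commuting case: the identity holds.
print(np.allclose(expm(A + B), expm(A) @ expm(B)))   # True

# Non-commuting case: it generically fails.
C = rng.standard_normal((4, 4))
print(np.allclose(expm(A + C), expm(A) @ expm(C)))   # False
```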
Solution 3:
First and foremost, the identity $e^{A+B}=e^Ae^B$ is not true for arbitrary matrices: it holds only if $A$ and $B$ commute, which is a very restrictive condition. (The Kronecker-sum identity in the question does hold, precisely because $A\otimes I_m$ and $I_n\otimes B$ always commute.)
To handle the commutative case, one can first consider the formal power series case. In the ring $\Bbb Q[[X,Y]]$ of formal power series with rational coefficients in commuting indeterminates $X,Y$, one defines $\exp(X)$, $\exp(Y)$, and $\exp(X+Y)$ by the usual power series, and the identity $\exp(X)\exp(Y)=\exp(X+Y)$ is easily checked by comparing coefficients of an arbitrary monomial in $X,Y$: both series are equal to $\sum_{k,l\geq0}\binom{k+l}k\frac{X^kY^l}{(k+l)!}$.
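For readers who want to see the coefficient comparison mechanically, here is a small symbolic sketch, assuming SymPy (the truncation degree $N$ is an arbitrary choice):

```python
import sympy as sp

X, Y = sp.symbols('X Y')  # commutative symbols, matching Q[[X, Y]]
N = 6  # compare all monomials of total degree <= N

# Truncated exponential series.
exp_X = sum(X**k / sp.factorial(k) for k in range(N + 1))
exp_Y = sum(Y**k / sp.factorial(k) for k in range(N + 1))
exp_XY = sum((X + Y)**k / sp.factorial(k) for k in range(N + 1))

lhs = sp.Poly(sp.expand(exp_X * exp_Y), X, Y)
rhs = sp.Poly(sp.expand(exp_XY), X, Y)

# The truncations only agree up to total degree N.
agree = all(lhs.coeff_monomial(X**k * Y**l) == rhs.coeff_monomial(X**k * Y**l)
            for k in range(N + 1) for l in range(N + 1) if k + l <= N)
print(agree)  # True
```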
Now if one restricts to formal power series with more than exponentially decreasing coefficients, substitution of a concrete value (for instance a matrix) for an indeterminate will give an absolutely convergent power series, whose limit assigns a well-defined value to the substitution. If $M$ is your ring of matrices (which is also a topological $K$-vector space for $K=\Bbb R$ or $K=\Bbb C$), and $A,B\in M$ commute, then the substitutions $X:=A,Y:=B$ define, for the appropriate subring $R\subset\Bbb Q[[X,Y]]$, a continuous ring homomorphism $f:R\to M$, whose image lies in the commutative subring $K[A,B]$ of $M$ generated by $A,B$. This homomorphism then satisfies $f(\exp(S))=\exp(f(S))$ (by the definition of matrix exponentiation), so that applying $f$ to $\exp(X)\exp(Y)=\exp(X+Y)$ gives $\exp(A)\exp(B)=\exp(A+B)$.