When is the sum of independent Markov chains a Markov chain?
In general, the sum of two independent Markov chains is not a Markov chain.
Let $X$ be a random variable such that $\mathbb{P}(X=0) = \mathbb{P}(X=1) = \frac{1}{2}$ and set $X_n := X$ for all $n \in \mathbb{N}$. Obviously, $(X_n)_{n \in \mathbb{N}}$ is a Markov chain. Moreover, let $(Y_n)_{n \in \mathbb{N}_0}$, $Y_0 := 0$, be a Markov chain independent of $X$ with state space $\{-1,0,1\}$ and transition matrix
$$P := \begin{pmatrix} \frac{1}{4} & \frac{3}{4} & 0 \\ \frac{1}{4} & \frac{1}{2} & \frac{1}{4} \\ 0 & \frac{3}{4} & \frac{1}{4} \end{pmatrix}.$$
Now set $Z_n := X_n+Y_n$. Then, by the independence of $X$ and $(Y_n)_{n \in \mathbb{N}_0}$, we have
$$\begin{align*} \mathbb{P}(Z_2 = 0 \mid Z_1 = 1, Z_0 = 1) &= \frac{\mathbb{P}(Z_2 = 0, Z_1 = 1, Z_0 = 1)}{\mathbb{P}(Z_1 =1, Z_0=1)} \\ &=\frac{\mathbb{P}(Y_2 = -1, Y_1 = 0, X= 1)}{\mathbb{P}(Y_1 =0, X=1)} \\ &= \frac{\mathbb{P}(Y_2 = -1, Y_1 = 0)}{\mathbb{P}(Y_1 = 0)} \frac{\mathbb{P}(X=1)}{\mathbb{P}(X=1)} \\ &= \mathbb{P}(Y_2 = -1 \mid Y_1 = 0) = \frac{1}{4}. \end{align*}$$
On the other hand, a very similar calculation shows that
$$\begin{align*} \mathbb{P}(Z_2 = 0 \mid Z_1 = 1) &= \frac{\mathbb{P}(Y_2 = -1, Y_1=0) \mathbb{P}(X=1) + \mathbb{P}(Y_2 = 0, Y_1=1) \mathbb{P}(X=0)}{\mathbb{P}(Y_1 = 0) \mathbb{P}(X=1) + \mathbb{P}(Y_1=1) \mathbb{P}(X=0)} \\ &= \frac{5}{12} \neq \frac{1}{4}. \end{align*}$$
This means that $(Z_n)_{n \in \mathbb{N}_0}$ is not a Markov chain.
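As a sanity check, the two conditional probabilities above can be computed by brute-force enumeration. The following Python sketch (the names `P`, `joint`, and `pr` are my own, not from the answer) enumerates $X$ and the path $(Y_1, Y_2)$ with exact rational arithmetic:

```python
from fractions import Fraction as F
from itertools import product

# Transition matrix of (Y_n); states -1, 0, 1; Y_0 = 0.
P = {
    -1: {-1: F(1, 4), 0: F(3, 4), 1: F(0)},
     0: {-1: F(1, 4), 0: F(1, 2), 1: F(1, 4)},
     1: {-1: F(0),    0: F(3, 4), 1: F(1, 4)},
}
S = [-1, 0, 1]

# Joint law of (Z_0, Z_1, Z_2): enumerate X and the path (Y_1, Y_2).
joint = {}
for x in (0, 1):                       # P(X = x) = 1/2
    for y1, y2 in product(S, S):
        p = F(1, 2) * P[0][y1] * P[y1][y2]
        z = (x + 0, x + y1, x + y2)    # Y_0 = 0, so Z_0 = X
        joint[z] = joint.get(z, F(0)) + p

def pr(pred):
    """Total probability of the tuples (z0, z1, z2) satisfying pred."""
    return sum(p for z, p in joint.items() if pred(z))

# P(Z_2 = 0 | Z_1 = 1, Z_0 = 1)
lhs = pr(lambda z: z == (1, 1, 0)) / pr(lambda z: z[0] == 1 and z[1] == 1)
# P(Z_2 = 0 | Z_1 = 1)
rhs = pr(lambda z: z[1] == 1 and z[2] == 0) / pr(lambda z: z[1] == 1)
print(lhs, rhs)  # → 1/4 5/12
```

This reproduces $\frac{1}{4} \neq \frac{5}{12}$ exactly, confirming that conditioning on the extra past value $Z_0 = 1$ changes the conditional distribution of $Z_2$.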
I know this is an old post, but it's the top result on google when one searches for "sum of Markov chains" -- at least it is for me at the time of writing. I just wanted to suggest a much easier and more intuitive solution.
Let $U$ and $V$ be independent Bernoulli$(\tfrac12)$. Let $X_n = U$ for all $n$, and let $Y_n = -(-1)^n V$ for all $n$. Let $$Z_n = U - (-1)^n V.$$ In particular, $U-V = Z_0 = Z_2 = Z_4 = \cdots$ and also $U+V = Z_1 = Z_3 = Z_5 = \cdots$.
Suppose that the 'present' is time $n = 2$, and we wish to consider the distribution of $Z_3$. Note that $Z_2 = 0$ implies $U = V$, and so $Z_1 = Z_3 = 2U$. Knowing $Z_1$ then determines $Z_3$ exactly. For example, $$ P(Z_3 = 2 \mid Z_2 = 0, Z_1 = 2 ) = 1 \neq P(Z_3 = 2 \mid Z_2 = 0, Z_1 = 0). $$ (In particular, if one sees consecutive $0$s, then one knows that $U = V = 0$.)
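Since everything here is a deterministic function of the pair $(U, V)$, the two conditional probabilities can be checked by enumerating the four equally likely outcomes. A short Python sketch (the helper `cond` is my own naming, not from the answer):

```python
from fractions import Fraction as F

# The four equally likely outcomes of (U, V), each with probability 1/4.
outcomes = [(u, v) for u in (0, 1) for v in (0, 1)]

def Z(n, u, v):
    """Z_n = U - (-1)^n V, as defined above."""
    return u - (-1) ** n * v

def cond(n_target, value, conds):
    """P(Z_{n_target} = value | Z_n = z for every (n, z) in conds)."""
    sel = [(u, v) for u, v in outcomes if all(Z(n, u, v) == z for n, z in conds)]
    hit = [(u, v) for u, v in sel if Z(n_target, u, v) == value]
    return F(len(hit), len(sel))

p1 = cond(3, 2, [(2, 0), (1, 2)])  # conditioning on Z_2 = 0, Z_1 = 2
p2 = cond(3, 2, [(2, 0), (1, 0)])  # conditioning on Z_2 = 0, Z_1 = 0
print(p1, p2)  # → 1 0
```

Conditioning on $Z_2 = 0$ alone leaves both $(0,0)$ and $(1,1)$ possible, so adding the value of $Z_1$ genuinely changes the prediction for $Z_3$, which is exactly the failure of the Markov property.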