Relation between independent increments and Markov property
Independent increments and the Markov property do not imply each other. I was wondering:
- whether being one makes a process closer to being the other;
- whether there are cases where one implies the other.
Thanks and regards!
Solution 1:
Independent increments do imply the Markov property.
To see this, assume that $(X_n)_{n\ge0}$ has independent increments, that is, $X_0=0$ and $X_n=Y_1+\cdots+Y_n$ for every $n\ge1$, where $(Y_n)_{n\ge1}$ is a sequence of independent random variables. The filtration of $(X_n)_{n\ge0}$ is $(\mathcal{F}^X_n)_{n\ge0}$ with $\mathcal{F}^X_n=\sigma(X_k;0\le k\le n)$.

Note that $$ \mathcal{F}^X_n=\sigma(Y_k;1\le k\le n), $$ hence $X_{n+1}=X_n+Y_{n+1}$, where $X_n$ is $\mathcal{F}^X_n$-measurable and $Y_{n+1}$ is independent of $\mathcal{F}^X_n$. This shows that the conditional distribution of $X_{n+1}$ given $\mathcal{F}^X_n$ is $$ \mathbb{P}(X_{n+1}\in\mathrm{d}y\mid\mathcal{F}^X_n)=Q_n(X_n,\mathrm{d}y), \quad \mbox{where}\quad Q_n(x,\mathrm{d}y)=\mathbb{P}(x+Y_{n+1}\in\mathrm{d}y). $$ Hence $(X_n)_{n\ge0}$ is a Markov chain with transition kernels $(Q_n)_{n\ge0}$.
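A small empirical sketch of this direction (the random walk with $\pm1$ increments and the sample sizes are my own choices for illustration): for a walk with i.i.d. increments, the conditional distribution of $X_3$ given $X_2=0$ should not depend on $X_1$, matching the kernel $Q_2(0,\mathrm{d}y)$.

```python
import random
from collections import Counter

random.seed(0)

# Simulate length-3 paths of a simple random walk X_n = Y_1 + ... + Y_n
# with i.i.d. increments Y_k uniform on {-1, +1}.
def walk():
    y = [random.choice((-1, 1)) for _ in range(3)]
    return y[0], y[0] + y[1], y[0] + y[1] + y[2]  # X_1, X_2, X_3

# Markov check: on the event {X_2 = 0}, the conditional distribution of
# X_3 should not depend on X_1 (which is -1 or +1 on that event).
counts = Counter()
for _ in range(200_000):
    x1, x2, x3 = walk()
    if x2 == 0:
        counts[(x1, x3)] += 1

for x1 in (-1, 1):
    total = counts[(x1, -1)] + counts[(x1, 1)]
    p_up = counts[(x1, 1)] / total  # estimate of P(X_3 = 1 | X_2 = 0, X_1 = x1)
    print(f"X_1 = {x1:+d}: P(X_3 = 1 | X_2 = 0) ~ {p_up:.3f}")

# Both estimates approximate Q_2(0, {1}) = 1/2, regardless of X_1.
```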
On the other hand, the Markov property does not imply independent increments. For instance, take the Markov chain on $\{0,1\}$ that jumps to the other state with probability $p\in(0,1)$ at each step: the increment $X_{n+1}-X_n$ can equal $-1$ only when $X_n=1$, so its distribution depends on the past and the increments are not independent.
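A minimal sketch of this direction (the two-state chain and flip probability are my own choices for illustration): a Markov chain on $\{0,1\}$ whose consecutive increments are negatively correlated, hence not independent.

```python
import random

random.seed(0)

# A Markov chain on {0, 1} that flips state with probability 1/2.
# It is Markov, but its increments Y_n = X_n - X_{n-1} are not
# independent: after a +1 increment, the next increment cannot be +1.
p_flip = 0.5
x = 0
increments = []
for _ in range(200_000):
    x_next = 1 - x if random.random() < p_flip else x
    increments.append(x_next - x)
    x = x_next

# Empirical covariance of consecutive increments. Two consecutive
# nonzero increments always have opposite signs, so
# E[Y_n * Y_{n+1}] = -p_flip**2 = -0.25, ruling out independence.
mean_y = sum(increments) / len(increments)
pairs = list(zip(increments, increments[1:]))
cov = sum(a * b for a, b in pairs) / len(pairs) - mean_y ** 2
print(f"Cov(Y_n, Y_n+1) ~ {cov:.3f}")  # close to -0.25
```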