Can I apply the Girsanov theorem to an Ornstein-Uhlenbeck process?

Solution 1:

@Nate: I think your argument is fine. I didn't know Fernique's theorem; my argument for that part was the following.

Fix $T>0$ (as you did, eventually $T+1$). Applying Jensen's inequality we get: $$\exp\left(\frac{1}{2}\int_S^{S+\epsilon} X_t^2\,dt\right)=\exp\left(\frac{1}{\epsilon}\int_S^{S+\epsilon} \frac{\epsilon}{2}X_t^2\,dt\right)\leq \frac{1}{\epsilon}\int_S^{S+\epsilon} \exp\left(\frac{\epsilon}{2}X_t^2\right)\,dt \qquad \text{a.s.}$$

Now taking the expectation and using Tonelli's theorem we have to study: $$\frac{1}{\epsilon}\int_S^{S+\epsilon} E\left[\exp\left(\frac{\epsilon}{2}X_t^2\right)\right]\,dt.$$

$X_t \sim N(\mu_t=X_0e^{-t}\,,\,\,\sigma_t^2=\frac{1-e^{-2t}}{2})$, so $$E\left[\exp\left(\frac{\epsilon}{2}X_t^2\right)\right]=E\left[\exp\left(\frac{\epsilon}{2}(\mu_t+\sigma_tZ)^2\right)\right]=e^{\frac{\epsilon}{2}\mu_t^2}\int_{\mathbb{R}} \frac{1}{\sqrt{2 \pi}}\, \exp\left(-\frac{x^2}{2}(1-\sigma_t^2 \epsilon)+\epsilon \mu_t \sigma_t x\right)dx. $$ Setting $\lambda_t=1-\epsilon \sigma_t^2=\frac{1}{2}[2-\epsilon(1-e^{-2t})]$, if $\lambda_t>0$ (for example when $\epsilon < 1$) the last integral converges and the expectation equals $$\exp\left(\frac {\epsilon}{2}\mu_t^2\right)\,\exp\left(\frac{\epsilon^2}{2 \lambda_t}\mu_t^2 \sigma_t^2\right)\frac{1}{\sqrt{\lambda_t}}.$$

Finally, all the functions involved are continuous in $t$ and, since $\epsilon < 1$, $\lambda_t$ is bounded away from $0$; hence the moment generating function is integrable over $[S,S+\epsilon]$.
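As a sanity check on the closed form above, a quick Monte Carlo comparison in Python (the values of $X_0$, $t$ and $\epsilon$ here are illustrative choices, not from the argument):

```python
import numpy as np

rng = np.random.default_rng(0)

# Parameters of the OU marginal X_t ~ N(mu_t, sigma_t^2) at a fixed t,
# with X_0 = 1 as an illustrative choice.
X0, t, eps = 1.0, 0.5, 0.5
mu = X0 * np.exp(-t)
sigma2 = (1 - np.exp(-2 * t)) / 2

# Closed form from the answer; requires lambda_t = 1 - eps*sigma_t^2 > 0.
lam = 1 - eps * sigma2
closed_form = (np.exp(eps * mu**2 / 2)
               * np.exp(eps**2 * mu**2 * sigma2 / (2 * lam))
               / np.sqrt(lam))

# Monte Carlo estimate of E[exp(eps/2 * X^2)] with X ~ N(mu, sigma^2).
X = mu + np.sqrt(sigma2) * rng.standard_normal(1_000_000)
mc = np.exp(eps / 2 * X**2).mean()

print(closed_form, mc)  # the two values should agree to a few decimal places
```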

My first idea was to use $T$ instead of $\epsilon$, but the moment generating function of a chi-squared random variable is not defined on the whole line, only in a neighbourhood of $0$.

Solution 2:

I think I got it worked out.

Lemma. There exists $\epsilon > 0$ such that for any $S \in [0,T]$, we have $$E\left[\exp\left(\frac{1}{2}\int_S^{S+\epsilon} X_t^2\,dt\right)\right] < \infty.$$

Proof. Set $M = \sup_{[0, T+1]} |X_t|$. Then $$E\left[\exp\left(\frac{1}{2}\int_S^{S+\epsilon} X_t^2\,dt\right)\right] \le E\left[\exp\left(\frac{\epsilon}{2} M^2\right)\right].$$ By Fernique's theorem, there exists $\epsilon$ small enough that this is finite. ($X_t$, being a continuous Gaussian process, induces a Gaussian measure on the Banach space $C([0,T+1])$, and $M$ is the norm on this space.) Perhaps there is also a more direct way to get this.
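For what it's worth, the size of $E\left[\exp\left(\frac{\epsilon}{2}M^2\right)\right]$ can be gauged numerically. This only illustrates the magnitude for a small $\epsilon$; finiteness itself is exactly what Fernique's theorem guarantees. A sketch, with illustrative step size, path count and $\epsilon$ (a discrete-time supremum slightly underestimates the true one):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate the OU process dX = -X dt + dW on [0, T+1] with its exact
# one-step Gaussian transition, track M = sup |X_t| along each path,
# and estimate E[exp(eps/2 * M^2)] by Monte Carlo.
T, n_steps, n_paths, eps = 1.0, 500, 100_000, 0.25
dt = (T + 1) / n_steps
a = np.exp(-dt)                          # exact one-step decay factor
s = np.sqrt((1 - np.exp(-2 * dt)) / 2)   # exact one-step noise std

X = np.zeros(n_paths)                    # X_0 = 0
M = np.abs(X)
for _ in range(n_steps):
    X = a * X + s * rng.standard_normal(n_paths)
    M = np.maximum(M, np.abs(X))

estimate = np.exp(eps / 2 * M**2).mean()
print(estimate)  # a modest finite number for eps this small
```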

Now we use Corollary 3.5.14 from Karatzas and Shreve, Brownian Motion and Stochastic Calculus, whose proof I'll paraphrase. Let $$Z_t = \exp\left(\int_0^t X_s\,dW_s - \frac{1}{2} \int_0^t X_s^2\,ds\right)$$ be the process we have to show is a martingale. $Z_t$ is a nonnegative local martingale (by Itô's formula), hence it is a supermartingale (using Fatou's lemma with a sequence of stopping times), so it is enough to show $E[Z_t] = 1$ for all $t \in [0,T]$. We proceed by induction. Suppose we have shown $E[Z_t] = 1$ for $t \in [0,S]$ (the base case is $S=0$, which is trivial). Let $t \in [S, S+\epsilon]$. Set $$Z_t^S = \exp\left(\int_S^t X_s\,dW_s - \frac{1}{2} \int_S^t X_s^2\,ds\right).$$ By Novikov's condition, $Z_t^S$ is a martingale for $t \in [S, S+\epsilon]$. Now we have $$E[Z_t] = E[Z_S Z_t^S] = E[E[Z_S Z_t^S \mid \mathcal{F}_S]] = E[Z_S E[Z_t^S \mid \mathcal{F}_S]] = E[Z_S Z_S^S].$$ But $Z_S^S = 1$ by definition and $E[Z_S]=1$ by assumption. So we have $E[Z_t] = 1$, and this holds for any $t \in [0, S+\epsilon]$. Now we just repeat the induction $\lceil T/\epsilon \rceil$ times.
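The conclusion $E[Z_t] = 1$ is easy to check by simulation. A minimal sketch, assuming the OU dynamics $dX_t = -X_t\,dt + dW_t$ driven by the same Brownian motion $W$ that appears in $Z_t$ (discretization parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Monte Carlo check that E[Z_T] = 1 for the stochastic exponential
# Z_t = exp( int_0^t X dW - 1/2 int_0^t X^2 dt ),
# with X the OU process dX = -X dt + dW driven by the same W.
T, n_steps, n_paths = 1.0, 400, 200_000
dt = T / n_steps

X = np.zeros(n_paths)      # X_0 = 0
logZ = np.zeros(n_paths)
for _ in range(n_steps):
    dW = np.sqrt(dt) * rng.standard_normal(n_paths)
    logZ += X * dW - 0.5 * X**2 * dt   # Ito integral: left-endpoint rule
    X += -X * dt + dW                  # Euler-Maruyama step for the OU SDE
Z_mean = np.exp(logZ).mean()
print(Z_mean)  # should be close to 1
```

Note that with the left-endpoint rule, each increment $\exp(X_k\,\Delta W_k - \frac{1}{2}X_k^2\,\Delta t)$ has conditional mean exactly $1$, so the discretized $Z$ is exactly a mean-one martingale and only Monte Carlo error remains.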

It appears this would work for any continuous Gaussian process $X_t$, which is nice to know.

Solution 3:

$X^2$ follows a CIR-type diffusion. The expressions that appear in Novikov's condition can be computed explicitly and seen to be finite, at least for small $t$. If $Z_t$ is a martingale for small $t$, I'm sure that using some combination of positivity and the Markov property you can extend it. The result is stated as Problem 8.3.14 in Revuz & Yor, though without a solution or substantial hints.
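To spell out the first claim: assuming the normalization $dX_t = -X_t\,dt + dW_t$ (the one matching the marginals used in Solution 1), Itô's formula applied to $Y_t = X_t^2$ gives $$dY_t = 2X_t\,dX_t + d\langle X\rangle_t = (1 - 2Y_t)\,dt + 2X_t\,dW_t = (1 - 2Y_t)\,dt + 2\sqrt{Y_t}\,d\tilde W_t,$$ where $d\tilde W_t = \operatorname{sgn}(X_t)\,dW_t$ defines a Brownian motion by Lévy's characterization. This is a square-root (CIR-type) diffusion, for which exponential functionals of $\int Y_t\,dt$ have known explicit Laplace transforms.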