Moment generating function / characteristic function of $(X,Y)$ factoring implies $X,Y$ independent

This is solely a reference request. I have heard a few versions of the following theorem:

If the joint moment generating function factors, i.e. $\mathbb{E}[e^{uX+vY}] = \mathbb{E}[e^{uX}]\,\mathbb{E}[e^{vY}]$ for all $u,v$ for which the expectations are finite, then $X,Y$ are independent.

There is a similar version for characteristic functions. Could anyone provide a serious reference that proves one or both of these theorems?
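
For concreteness, here is a minimal Monte Carlo sanity check of the MGF statement, assuming numpy; the distributions, sample size, and test points $(u,v)$ are arbitrary illustrative choices, not part of the question.

```python
# Quick Monte Carlo illustration of the MGF factorization for an
# independent pair (all choices below are illustrative).
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.normal(size=n)                  # X ~ N(0,1)
y = rng.uniform(-1, 1, size=n)          # Y ~ U(-1,1), independent of X

for u, v in [(0.5, 1.0), (-0.3, 0.8)]:
    joint = np.mean(np.exp(u * x + v * y))                  # E[e^{uX+vY}]
    prod = np.mean(np.exp(u * x)) * np.mean(np.exp(v * y))  # E[e^{uX}] E[e^{vY}]
    print(f"u={u}, v={v}: joint={joint:.4f}, product={prod:.4f}")
```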


Solution 1:

Theorem (Kac's theorem). Let $X,Y$ be $\mathbb{R}^d$-valued random variables. Then the following statements are equivalent:

  1. $X,Y$ are independent
  2. $\forall \eta,\xi \in \mathbb{R}^d: \mathbb{E}e^{\imath \, (X,Y) \cdot (\xi,\eta)} = \mathbb{E}e^{\imath \, X \cdot \xi} \cdot \mathbb{E}e^{\imath \, Y \cdot \eta}$

Proof:

  • $(1) \Rightarrow (2)$: Straightforward: use $\mathbb{E}(f(X) \cdot g(Y)) = \mathbb{E}(f(X)) \cdot \mathbb{E}(g(Y))$ for bounded measurable $f,g$, applied to $f(x) = e^{\imath \, x \cdot \xi}$ and $g(y) = e^{\imath \, y \cdot \eta}$.
  • $(2) \Rightarrow (1)$: Let $(\tilde{X},\tilde{Y})$ be such that $\tilde{X}$, $\tilde{Y}$ are independent, $\tilde{X} \sim X$, $\tilde{Y} \sim Y$. Then $$\mathbb{E}e^{\imath \, (X,Y) \cdot (\xi,\eta)} \stackrel{(2)}{=} \mathbb{E}e^{\imath \, X \cdot \xi} \cdot \mathbb{E}e^{\imath \, Y \cdot \eta} = \mathbb{E}e^{\imath \tilde{X} \cdot \xi} \cdot \mathbb{E}e^{\imath \tilde{Y} \cdot \eta} = \mathbb{E}e^{\imath (\tilde{X},\tilde{Y}) \cdot (\xi,\eta)}$$ i.e. the characteristic functions of $(X,Y)$ and $(\tilde{X},\tilde{Y})$ coincide. From the uniqueness of the Fourier transform we conclude $(X,Y) \sim (\tilde{X},\tilde{Y})$. Consequently, $X$ and $Y$ are independent.

Remark: It is not important that $X$ and $Y$ are vectors of the same dimension. The same reasoning works if, say, $X$ is an $\mathbb{R}^k$-valued random variable and $Y$ an $\mathbb{R}^d$-valued random variable.

Reference (not for the given proof, but for the result): David Applebaum, B. V. Rajarama Bhat, Johan Kustermans, J. Martin Lindsay, Michael Schuermann, Uwe Franz, Quantum Independent Increment Processes I: From Classical Probability to Quantum Stochastic Calculus, Theorem 2.1.
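
As a numerical sanity check of the equivalence in Kac's theorem, the sketch below compares the empirical joint characteristic function with the product of the marginal ones, for an independent and a dependent pair; this assumes numpy, and the distributions, sample size, and test points $(\xi,\eta)$ are illustrative choices only.

```python
# Minimal sanity check of Kac's theorem via empirical characteristic
# functions; everything below is illustrative, not from the reference.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def ecf(samples, t):
    """Empirical characteristic function E[exp(i * t * X)]."""
    return np.mean(np.exp(1j * t * samples))

x = rng.normal(size=n)                  # X ~ N(0,1)
y = rng.exponential(size=n)             # Y ~ Exp(1), independent of X
y_dep = x + 0.1 * rng.normal(size=n)    # strongly dependent on X

for xi, eta in [(0.7, -1.3), (2.0, 0.5)]:
    joint_ind = np.mean(np.exp(1j * (xi * x + eta * y)))
    joint_dep = np.mean(np.exp(1j * (xi * x + eta * y_dep)))
    print(f"xi={xi}, eta={eta}")
    print("  independent pair:", abs(joint_ind - ecf(x, xi) * ecf(y, eta)))
    print("  dependent pair:  ", abs(joint_dep - ecf(x, xi) * ecf(y_dep, eta)))
```

For the independent pair the discrepancy is of Monte Carlo order $n^{-1/2}$, while for the dependent pair it stays far from zero at suitable $(\xi,\eta)$.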

Solution 2:

Building on the answer by saz: if $X$ and $Y$ have a joint density, here is another proof of $(2) \Rightarrow (1)$. By the inverse Fourier transform, $$f_\mathbf{X}(\mathbf{x})=\frac{1}{(2\pi)^n}\int_{\mathbb{R}^n} e^{-j\mathbf{v}'\mathbf{x}}\,\phi_\mathbf{X}(\mathbf{v})\,d\mathbf{v},$$

where $\mathbf{x}$ and $\mathbf{v}$ are column vectors; in this case $\mathbf{x} = [x\ y]'$ and $\mathbf{v} = [v_1\ v_2]'$.

Therefore, using (2) in the form $\phi_{XY}(v_1,v_2)=\phi_X(v_1)\,\phi_Y(v_2)$, $$f_{XY}(x,y)=\frac{1}{(2\pi)^2}\iint e^{-j(v_1x+v_2y)}\,\phi_{XY}(v_1,v_2)\,dv_1\,dv_2\\=\frac{1}{2\pi}\int e^{-jv_1x}\,\phi_X(v_1)\,dv_1\cdot\frac{1}{2\pi}\int e^{-jv_2y}\,\phi_Y(v_2)\,dv_2\\=f_X(x)\,f_Y(y).$$

The joint probability density function (pdf) equaling the product of the marginal pdfs is precisely the definition of independence for continuous random variables. The same method should work for discrete random variables as well.
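
As a concrete check of this inversion argument, the sketch below numerically inverts the product $\phi_X(v_1)\,\phi_Y(v_2)$ for two independent standard normals and compares the result with the product of the marginal inversions and with the true joint density; this is a minimal illustration assuming numpy, with arbitrary grid bounds and resolution.

```python
# Numerical check of the double inverse Fourier transform above for two
# independent N(0,1) variables (grid and test point are arbitrary).
import numpy as np

v = np.linspace(-10, 10, 801)           # integration grid for v1 and v2
dv = v[1] - v[0]
phi = np.exp(-v**2 / 2)                 # characteristic function of N(0,1)

x0, y0 = 0.3, -1.1                      # point at which to evaluate the pdf

# Double inversion of phi_XY(v1, v2) = phi(v1) * phi(v2):
V1, V2 = np.meshgrid(v, v, indexing="ij")
integrand = np.exp(-1j * (V1 * x0 + V2 * y0)) * phi[:, None] * phi[None, :]
f_joint = integrand.sum().real * dv**2 / (2 * np.pi) ** 2

# Product of the two one-dimensional inversions:
f_x = (np.exp(-1j * v * x0) * phi).sum().real * dv / (2 * np.pi)
f_y = (np.exp(-1j * v * y0) * phi).sum().real * dv / (2 * np.pi)

true_pdf = np.exp(-(x0**2 + y0**2) / 2) / (2 * np.pi)
print(f_joint, f_x * f_y, true_pdf)     # all three should nearly agree
```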