Characteristic Function and Random Variable Transformation

Solution 1:

If $Y=g(X)$ (let me use $g$, reserving $f$ for the densities):

Then, by the law of the unconscious statistician and Fourier inversion, $$ \phi_Y(t)=E\left(e^{itY}\right)=\int_{-\infty}^{\infty} f_Y(y)\, e^{ity}\, dy= \int_{-\infty}^{\infty} f_X(x)\, e^{itg(x)}\, dx \\ =\frac{1}{2 \pi} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \phi_X(s)\, e^{-i s x } \,ds \; e^{i g(x) t}\, dx $$

Of course, this is rather trivial ($\phi_X(t) \to f_X(x) \to f_Y(y) \to \phi_Y(t)$) and formal: the formula is not practical, and it does not give a nice or simple relationship between $\phi_Y(t)$ and $\phi_X(t)$, but I don't see why you'd expect that it can be simplified.
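As a sanity check on the identity $\phi_Y(t)=\int f_X(x)\,e^{itg(x)}\,dx$, here is a small numerical sketch (my own illustration, not part of the answer) using $X \sim N(0,1)$ and $g(x)=x^2$, so that $Y=X^2$ is chi-square with one degree of freedom and has the known closed-form characteristic function $(1-2it)^{-1/2}$:

```python
import numpy as np
from scipy import integrate

# Check phi_Y(t) = E[e^{i t g(X)}] = ∫ f_X(x) e^{i t g(x)} dx
# for X ~ N(0,1), g(x) = x^2, so Y = X^2 ~ chi-square(1),
# whose characteristic function is (1 - 2it)^{-1/2} (principal branch).

def f_X(x):
    # standard normal density
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def phi_Y_numeric(t):
    # integrate real and imaginary parts of f_X(x) * exp(i t x^2) separately
    re, _ = integrate.quad(lambda x: f_X(x) * np.cos(t * x**2), -np.inf, np.inf)
    im, _ = integrate.quad(lambda x: f_X(x) * np.sin(t * x**2), -np.inf, np.inf)
    return re + 1j * im

def phi_Y_closed(t):
    return (1 - 2j * t) ** (-0.5)

for t in [0.0, 0.5, 1.0, 2.0]:
    assert abs(phi_Y_numeric(t) - phi_Y_closed(t)) < 1e-6
```

The agreement confirms the change-of-variables step, even though no simple general formula linking $\phi_Y$ to $\phi_X$ falls out of it.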

Solution 2:

Here is what I think would be a counterexample: if $X \sim R[0,1]$ (uniform on $[0,1]$) and $Y \mid X \sim N(X, \sigma^2)$, then the MGF of $X$ is $\frac{e^t-1}{t}$, while the conditional MGF of $Y$ given $X=x$ is $e^{xt+\frac{\sigma^2 t^2}{2}}$, and I doubt there is a way to express the latter via the former.
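Both MGF formulas quoted above can be checked by Monte Carlo; a minimal sketch (the sample size, seed, and parameter values $\sigma = 0.7$, $t = 0.9$, $x = 0.4$ are my own choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, t = 0.7, 0.9

# MGF of X ~ Uniform[0,1] should be (e^t - 1)/t
X = rng.uniform(0.0, 1.0, 1_000_000)
mgf_X = np.mean(np.exp(t * X))
assert abs(mgf_X - (np.exp(t) - 1) / t) < 1e-2

# Conditional MGF of Y given X = x, with Y ~ N(x, sigma^2),
# should be e^{xt + sigma^2 t^2 / 2}
x = 0.4
Y = rng.normal(x, sigma, 1_000_000)
mgf_Y = np.mean(np.exp(t * Y))
assert abs(mgf_Y - np.exp(x * t + sigma**2 * t**2 / 2)) < 1e-2
```

This only verifies the two formulas themselves; it does not, of course, settle whether one can be written in terms of the other.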

Solution 3:

This is more of a comment than an answer, and it is meant for someone stumbling onto this post.

For linear transformations of the form $Y=aX+b$, we can say that $\phi_Y(t)=e^{itb}\phi_X(at)$. If we are dealing with random vectors, then for the transformation $Y=AX+B$, we have $\phi_Y(t)=e^{it^\top B}\phi_X(A^\top t)$.
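The scalar rule is easy to verify numerically; a sketch with $X \sim N(0,1)$, so $\phi_X(t)=e^{-t^2/2}$, and $Y=aX+b \sim N(b, a^2)$ with $\phi_Y(t)=e^{itb-a^2t^2/2}$ (the values $a=2$, $b=3$ are arbitrary choices of mine):

```python
import numpy as np

a, b = 2.0, 3.0

def phi_X(t):
    # characteristic function of N(0,1)
    return np.exp(-t**2 / 2)

def phi_Y_direct(t):
    # characteristic function of Y = aX + b ~ N(b, a^2)
    return np.exp(1j * t * b - a**2 * t**2 / 2)

# check phi_Y(t) = e^{itb} * phi_X(at)
for t in [0.0, 0.5, 1.3]:
    assert abs(np.exp(1j * t * b) * phi_X(a * t) - phi_Y_direct(t)) < 1e-12
```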

Source: https://www.statlect.com/fundamentals-of-probability/joint-characteristic-function and references therein.