Determining if something is a characteristic function

Suppose $X$ is a continuous random variable with pdf $f_X(x)$. We can compute its characteristic function as $\varphi_X(t)=\mathbb{E}[e^{itX}]=\int_{\mathbb{R}}e^{itx}f_X(x)\,dx.$

Question: Given a function, say $\psi(t)$, how does one show that it is a characteristic function?



There is a theoretical part and a practical part to checking whether a function is a characteristic function or not.

Bochner's theorem certainly gives a very nice necessary and sufficient condition for a function $f(t)$ to be a characteristic function.

But it is hardly of simple use when it comes to deciding, for a concrete function such as

$$f(t)=\max\{1-|t|,0\} \quad\text{or}\quad f(t)=\frac{1}{1-t^2},$$

whether it is a characteristic function or not. The difficult part is checking whether the function is positive definite.

Here is a somewhat methodical way to check whether a given function is a characteristic function.

  1. First, always check the basic conditions and properties of a characteristic function: https://en.wikipedia.org/wiki/Characteristic_function_(probability_theory)#Properties
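For example, $f(t)=\dfrac{1}{1-t^2}$ from above already fails this first step: every characteristic function satisfies $|\varphi(t)|\le\varphi(0)=1$ for all $t$ and is uniformly continuous on $\mathbb{R}$, whereas
$$\frac{1}{1-t^2}>1\quad\text{for }0<|t|<1,$$
and the function even blows up as $|t|\to 1$. Hence it cannot be a characteristic function.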

If the function passes this first step, then check step 2.

  2. Pólya’s theorem: https://en.wikipedia.org/wiki/Characteristic_function_(probability_theory)#Criteria_for_characteristic_functions

[Examples where it applies: $f(t)=\max\{1-|t|,0\}$, or the question “Is $\phi(t)=\max\{t^2+1-2|t|,1/2\}$ a characteristic function?”]
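A worked application of Pólya’s criterion to the first example: $f(t)=\max\{1-|t|,0\}$ is real-valued and even, $f(0)=1$, it is non-increasing and convex on $[0,\infty)$, and $f(t)\to 0$ as $t\to\infty$. Pólya’s theorem therefore guarantees that $f$ is a characteristic function; in fact it is the characteristic function of the distribution with density
$$f_X(x)=\frac{1-\cos x}{\pi x^2},\qquad x\in\mathbb{R}.$$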

Every time a step is inconclusive, move on to the next one.

  3. Using known properties and examples of characteristic functions, try to construct from $f(t)$ another function that would have to be a characteristic function if $f(t)$ were one, and then show that this new function is not one.

For example: if $f(t)$ is a characteristic function, then so is $|f(t)|^2$, so check whether $|f(t)|^2$ is a characteristic function.

Every characteristic function also satisfies the inequality $1-\operatorname{Re}f(2t)\le 4\bigl(1-\operatorname{Re}f(t)\bigr)$.
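A short justification of both facts: if $X$ has characteristic function $f$ and $X'$ is an independent copy of $X$, then
$$\mathbb{E}\bigl[e^{it(X-X')}\bigr]=\mathbb{E}\bigl[e^{itX}\bigr]\,\mathbb{E}\bigl[e^{-itX'}\bigr]=f(t)\,\overline{f(t)}=|f(t)|^2,$$
so $|f(t)|^2$ is the characteristic function of $X-X'$. For the inequality, use $1-\cos 2u=2(1-\cos u)(1+\cos u)\le 4(1-\cos u)$ and take expectations with $u=tX$:
$$1-\operatorname{Re}f(2t)=\mathbb{E}\bigl[1-\cos(2tX)\bigr]\le 4\,\mathbb{E}\bigl[1-\cos(tX)\bigr]=4\bigl(1-\operatorname{Re}f(t)\bigr).$$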

  4. Check whether the moments implied by the given $f(t)$ are consistent.

[For example, the even-order moments implied by $f$ may come out negative, which is impossible, or zero, which forces a degenerate distribution.]

Exercise: show that $\exp(-t^4)$ is not a characteristic function.
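Sketch for this exercise: suppose $\varphi(t)=e^{-t^4}$ were the characteristic function of some random variable $X$. Since $\varphi$ is twice differentiable with $\varphi''(0)=0$, $X$ would have a finite second moment with
$$\mathbb{E}[X^2]=-\varphi''(0)=0,$$
so $X=0$ almost surely and $\varphi(t)\equiv 1$, contradicting $\varphi(1)=e^{-1}\neq 1$. (Equivalently, the expansion $e^{-t^4}=1-t^4+\cdots$ would force $\mathbb{E}[X^2]=0$ and $\mathbb{E}[X^4]=-4!<0$.)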

The above method (checking moments) is very useful in practice.

  5. Next, try the other criteria listed in the Wikipedia article; they require a good knowledge of the structure of various characteristic functions, together with some experience and a bit of luck.

  6. Finally, you can invert $f(t)$ into the corresponding distribution function and density, and then check whether the result actually has the properties of a distribution function or density:

https://en.wikipedia.org/wiki/Characteristic_function_(probability_theory)#Inversion_formulae
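As a complement, here is a minimal numerical sketch of this last step, assuming the candidate $f$ is absolutely integrable so that the density version of the inversion formula $f_X(x)=\frac{1}{2\pi}\int_{\mathbb{R}}e^{-itx}f(t)\,dt$ applies; the helper name `inverted_density` and the truncation parameter `T` are only illustrative. It recovers the would-be density on a grid and checks whether it is nonnegative and integrates to roughly 1.

```python
import numpy as np
from scipy.integrate import quad

def inverted_density(phi, x, T=6.0):
    """Numerically apply the density inversion formula
    f_X(x) = (1/(2*pi)) * integral exp(-i*t*x) * phi(t) dt,
    truncating the integral to [-T, T] (both candidates below are
    negligible outside this range)."""
    integrand = lambda t: np.real(np.exp(-1j * t * x) * phi(t))
    val, _ = quad(integrand, -T, T, limit=200)
    return val / (2 * np.pi)

# Two candidates from the discussion above:
tri = lambda t: max(1.0 - abs(t), 0.0)   # max{1-|t|,0}: a genuine c.f.
quart = lambda t: np.exp(-t ** 4)        # exp(-t^4): not a c.f.

xs = np.linspace(-10.0, 10.0, 201)
dx = xs[1] - xs[0]
for name, phi in [("max{1-|t|,0}", tri), ("exp(-t^4)", quart)]:
    dens = np.array([inverted_density(phi, x) for x in xs])
    print(f"{name}: min of recovered 'density' = {dens.min():.4f}, "
          f"integral over [-10,10] ~ {dens.sum() * dx:.4f}")
# A genuine, absolutely integrable c.f. yields a nonnegative density that
# integrates to about 1; if the recovered 'density' dips clearly below zero,
# the candidate cannot be a characteristic function.
```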

I will add more if I find anything new.

Hope it helps.


Let me just state the theorem I linked to in my comment, so that this question does not go unanswered.

Bochner's theorem

If $\varphi:\mathbb{R}^d\to \mathbb C$ is a complex-valued function with $\varphi(0)=1$, continuous at $0$ and nonnegative-definite in the sense that for $n\geq 1$ we have that $$ \sum_{j=1}^n\sum_{k=1}^n\varphi(z_j-z_k)\,\xi_j\bar{\xi}_k\geq 0,\quad \text{for }\;z_1,\ldots,z_n\in\mathbb{R}^d,\;\xi_1,\ldots,\xi_n\in\mathbb C, $$ then $\varphi$ is the characteristic function of a distribution (random variable) on $\mathbb{R}^d$.
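In practice, Bochner's condition can at least be used as a numerical necessary-condition check. Below is a minimal sketch in $d=1$ (the helper `min_gram_eigenvalue` and its parameters are only illustrative): draw random points, form the Hermitian matrix $\bigl(\varphi(z_j-z_k)\bigr)_{j,k}$, and look at its smallest eigenvalue. A clearly negative eigenvalue shows that $\varphi$ cannot be a characteristic function; a nonnegative one proves nothing, since the condition must hold for every choice of points.

```python
import numpy as np

def min_gram_eigenvalue(phi, n=60, scale=5.0, seed=0):
    """Spot-check Bochner's nonnegative-definiteness condition in d = 1:
    draw random points z_1, ..., z_n, form the Hermitian matrix
    A[j, k] = phi(z_j - z_k), and return its smallest eigenvalue.
    A clearly negative value certifies that phi is NOT a characteristic
    function; a nonnegative value on its own proves nothing."""
    rng = np.random.default_rng(seed)
    z = rng.uniform(-scale, scale, size=n)
    A = np.array([[phi(zj - zk) for zk in z] for zj in z], dtype=complex)
    return float(np.linalg.eigvalsh(A).min())  # eigvalsh: A is Hermitian

tri = lambda t: max(1.0 - abs(t), 0.0)     # max{1-|t|,0}: a genuine c.f.
rational = lambda t: 1.0 / (1.0 - t ** 2)  # 1/(1-t^2): not a c.f.

print("max{1-|t|,0}:", min_gram_eigenvalue(tri))       # ~ 0 (up to rounding)
print("1/(1-t^2):  ", min_gram_eigenvalue(rational))   # clearly negative
```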