How can I get distribution function from characteristic function?
Suppose $F=F(x)$ is the distribution function of a r.v. $X$ and its characteristic function is $\varphi_X(t)=\int_{-\infty}^\infty e^{itx}dF(x)$. Then for any $a<b$ such that $F$ is continuous at both $a$ and $b$, we have $$F(b)-F(a)=\lim_{c\to\infty}\frac1{2\pi}\int_{-c}^c\frac{e^{-ita}-e^{-itb}}{it}\varphi_X(t)\;{\rm d}t$$ I wonder if we can get the distribution function $F(x)=\mathbb P(X\leqslant x)$ directly by taking $a\to-\infty$ in the equation above, i.e. $$F(x)=\lim_{a\to-\infty}\lim_{c\to\infty}\frac1{2\pi}\int_{-c}^c\frac{e^{-ita}-e^{-itx}}{it}\varphi_X(t)\;{\rm d}t$$ If not, how can I calculate $F(x)$ from its characteristic function $\varphi_X$?
Actually, I read in a textbook that $$F(x)=\frac1{2\pi}\lim_{\sigma\to0^+}\int_{(-\infty,x]}{\rm d}y\int_{\mathbb R} {\rm e}^{-ity-\sigma^2t^2/2}\varphi_X(t)\;{\rm d}t$$ but I have no clue why there is a factor $e^{-\sigma^2t^2/2}$.
Yes, you can find $F(x)$ by taking the limit as $a \to -\infty$ (through continuity points of $F$).
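As a sanity check, here is a short numerical sketch (grid and cutoff values are ad-hoc choices, not from the question) that evaluates the truncated inversion integral for a standard normal $X$, whose characteristic function is $\varphi_X(t)=e^{-t^2/2}$, and compares it with the exact value of $F(1)-F(-1)$:

```python
import numpy as np
from math import erf, sqrt, pi

# Numerical check of the inversion formula for a standard normal X,
# whose characteristic function is phi(t) = exp(-t^2/2).
# Grid and cutoff values below are illustrative, ad-hoc choices.
phi = lambda t: np.exp(-t**2 / 2)

a, b, c = -1.0, 1.0, 8.0           # interval [a, b], truncation level c
dt = 1e-4
t = np.arange(-c + dt / 2, c, dt)  # midpoint grid, avoids t = 0

integrand = (np.exp(-1j * t * a) - np.exp(-1j * t * b)) / (1j * t) * phi(t)
approx = integrand.real.sum() * dt / (2 * pi)   # midpoint Riemann sum

exact = erf(1 / sqrt(2))           # F(1) - F(-1) for N(0, 1)
print(approx, exact)               # both ≈ 0.6827
```

The imaginary part of the integrand is odd in $t$, so it cancels over the symmetric grid; taking the real part just discards roundoff.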
$\frac1{2\pi}\int_{\mathbb R} {\rm e}^{-ity-\sigma^2t^2/2}\varphi_X(t)\;{\rm d}t$ is the density function of $X+Y_{\sigma}$ at $y$, where $Y_{\sigma} \sim N(0,\sigma^2)$, with $X$ and $Y_{\sigma}$ independent. This follows from the fact that $e^{-\sigma^2t^2/2}\varphi_X(t)$ is the characteristic function of $X+Y_{\sigma}$, together with the Proposition below. Hence $\frac1{2\pi}\int_{(-\infty,x]}{\rm d}y\int_{\mathbb R} {\rm e}^{-ity-\sigma^2t^2/2}\varphi_X(t)\;{\rm d}t$ is the distribution function of $X+Y_{\sigma}$ at $x$, which tends to $F(x)$ as $\sigma\to0^+$ at every continuity point $x$ of $F$, since $X+Y_{\sigma}\to X$ in distribution.
Proposition
If the characteristic function $\phi_Z$ of $Z$ is integrable, then $Z$ has a density given by $f_Z(y)=\frac1{2\pi}\int_{\mathbb R} e^{-ity} \phi_Z(t)\,{\rm d}t$.
[This proposition is one form of the inversion theorem].
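To see why the Gaussian factor is needed, here is a numerical sketch (grids and cutoffs are ad-hoc choices) for a coin flip $X=\pm1$ with probability $1/2$ each, so $\varphi_X(t)=\cos t$. This $\varphi_X$ is not integrable, so the Proposition does not apply to $X$ directly, but $e^{-\sigma^2t^2/2}\cos t$ is integrable and inverts to the density of $X+Y_\sigma$, whose distribution function approximates $F$:

```python
import numpy as np
from math import pi

# Sketch (ad-hoc grids/cutoffs): X is a coin flip, P(X = 1) = P(X = -1) = 1/2,
# so phi_X(t) = cos(t), which is NOT integrable.  Multiplying by
# exp(-sigma^2 t^2 / 2) makes it integrable; by the Proposition it then
# inverts to the density of X + Y_sigma with Y_sigma ~ N(0, sigma^2).
sigma = 0.05
dt = 0.1
t = np.arange(-120 + dt / 2, 120, dt)   # cutoff where the Gaussian factor vanishes
dy = 0.002
y = np.arange(-1.3, 1.5, dy)            # mass of X + Y_sigma below -1.3 is negligible

# density of X + Y_sigma at each y, via the inversion integral
f = (np.exp(-1j * np.outer(y, t) - sigma**2 * t**2 / 2)
     * np.cos(t)).real.sum(axis=1) * dt / (2 * pi)

F = np.cumsum(f) * dy                   # F_sigma(y) = P(X + Y_sigma <= y)

# For small sigma, F_sigma is close to F, which equals 0, 1/2, 1 on the
# three intervals separated by the atoms at -1 and 1:
print(F[np.searchsorted(y, 0.5)])       # ≈ 0.5
print(F[np.searchsorted(y, 1.4)])       # ≈ 1.0
```

Shrinking $\sigma$ sharpens the two smoothed steps at $\pm1$ toward the jumps of $F$, which is exactly the $\sigma\to0^+$ limit in the textbook formula.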