An example of a convolution of a singular distribution and a Gaussian distribution that has a 'simple' pdf

I am looking for a nontrivial example of a singular distribution that when convolved with a Gaussian distribution has a pdf of a 'simple' form.

I leave 'simple' open to your own interpretation.

Singular distributions are an important class of distributions that are often 'swept under the carpet.' I would like to see a nice illustrative example of how to work with such distributions.

One way to do this is to give a characteristic function $\phi(t)$ that when multiplied by $e^{-t^2/2}$ has a simple Fourier inverse.

However, I don't have a good choice of the characteristic function $\phi(t)$ that would lead to a meaningful result.

For example, for the Cantor distribution, the characteristic function is given by \begin{align} \phi(t)=e^{\frac{it}{2}} \prod_{k=1}^\infty \cos \left( \frac{t}{3^k} \right). \end{align} However, it is not easy to work with the product \begin{align} \phi(t) e^{-\frac{t^2}{2}}. \end{align} In particular, it is difficult to find its Fourier inverse.
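As a numerical sanity check, one can truncate the infinite product and invert $\phi(t) e^{-t^2/2}$ by direct quadrature; the Gaussian factor decays so fast that a modest truncation of the $t$-axis suffices. This is only a sketch (the grid sizes, the truncation level `n_terms=30`, and the function name are my own arbitrary choices):

```python
import numpy as np

def cantor_cf(t, n_terms=30):
    """Truncated Cantor characteristic function:
    phi(t) = exp(it/2) * prod_{k=1}^infty cos(t / 3^k)."""
    t = np.asarray(t, dtype=float)
    prod = np.ones_like(t)
    for k in range(1, n_terms + 1):
        prod = prod * np.cos(t / 3.0**k)
    return np.exp(1j * t / 2.0) * prod

# Invert phi(t) * exp(-t^2/2) numerically; exp(-t^2/2) is ~e^{-800} at
# |t| = 40, so truncating the integral there loses essentially nothing.
t = np.linspace(-40.0, 40.0, 4001)
dt = t[1] - t[0]
x = np.linspace(-4.0, 5.0, 901)
integrand = (cantor_cf(t) * np.exp(-t**2 / 2.0))[None, :] \
    * np.exp(-1j * np.outer(x, t))
f = integrand.sum(axis=1).real * dt / (2.0 * np.pi)  # pdf of V + Z on the grid
```

The resulting `f` is a smooth, nonnegative curve that integrates to one on the grid, which at least confirms that the inverse transform exists and behaves like a density, even if no closed form drops out.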

Edit: By a singular distribution I mean a probability distribution concentrated on a set of Lebesgue measure zero, in which every individual point has probability zero.

Edit 2: Another approach we can take is to look at the convolution directly. That is, consider $U=X+V$, where $V$ has a singular distribution and $X$ is Gaussian; in this case, the pdf of $U$ is given by \begin{align} f_U(u)=E\left[ \frac{1}{\sqrt{2 \pi}} e^{-\frac{(u-V)^2}{2}} \right]. \end{align}

I was wondering if we can come up with a sequence of random variables $V_n$ that converges in distribution to some $V$ with a singular distribution, for which we can compute the limit \begin{align} \lim_{ n \to \infty}E\left[ \frac{1}{\sqrt{2 \pi}} e^{-\frac{(u-V_n)^2}{2}} \right]. \end{align}
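For a concrete $V_n$ (say, the truncated base-3 expansion of a Cantor variable), the expectation above can at least be estimated by Monte Carlo for each fixed $n$. A Python sketch, where the function name and sample sizes are my own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_density(u, n, n_samples=50_000):
    """Monte Carlo estimate of E[exp(-(u - V_n)^2 / 2) / sqrt(2*pi)],
    where V_n = 2 * sum_{k=1}^n X_k / 3^k with X_k iid Bernoulli(1/2)."""
    bits = rng.integers(0, 2, size=(n_samples, n))   # X_1, ..., X_n
    weights = 2.0 / 3.0**np.arange(1, n + 1)         # 2 / 3^k
    v = bits @ weights                               # samples of V_n in [0, 1]
    u = np.asarray(u, dtype=float)
    kernel = np.exp(-0.5 * (u[..., None] - v)**2) / np.sqrt(2.0 * np.pi)
    return kernel.mean(axis=-1)

u_grid = np.linspace(-4.0, 5.0, 181)
f_mc = mc_density(u_grid, n=20)   # for n this large, V_n is within 3^{-n} of V
```

Since $|V - V_n| \le 3^{-n}$ almost surely and the Gaussian kernel is Lipschitz in $v$, the estimate for $n = 20$ is already indistinguishable from the limit up to Monte Carlo noise; the catch, of course, is that this gives numbers rather than a closed form.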


Solution 1:

I may be taking a very liberal interpretation of "simple", but hear me out. Define \begin{align*} Y_n = 2\sum_{k=1}^{n}\frac{X_k}{3^k} \end{align*} where $X_1, X_2, \cdots \overset{\text{iid}}{\sim} \text{Bernoulli}(\frac{1}{2})$. It's a well-known result that $Y_n$ converges in distribution to a Cantor random variable. Trivially, \begin{align*} \mathbb{P}(Y_n = c) = \begin{cases} \frac{1}{2^n} & c \in S_n \overset{\text{def}}{=}\{2\sum_{k=1}^{n}\frac{x_k}{3^k}: x_1, \cdots, x_n \in \{0, 1\}\} \\ 0 & \text{otherwise} \end{cases} \end{align*} For a discrete random variable $D$ taking values $(d_1, \cdots, d_n)$ with probabilities $(p_1, \cdots, p_n)$, the distribution of $D+Z$, for a standard Gaussian $Z$, is a mixture of Gaussians with density \begin{align*} f_{D+Z}(t) = \sum_{k=1}^{n}p_k\phi(t - d_k) \end{align*} where $\phi(t) = (2\pi)^{-1/2}e^{-t^2/2}$ is the density of a standard Gaussian. So \begin{align*} f_{Y_n + Z}(t) = \frac{1}{2^n}\sum_{c \in S_n}\phi(t-c) \end{align*} Checking all necessary conditions and taking the limit $n \rightarrow \infty$, we have the convolution of a Cantor and a standard Gaussian. So, this convolution can be interpreted as an (uncountably) infinite mixture of Gaussians. This isn't quite satisfactory because an infinite mixture of Gaussians can pretty much approximate any continuous distribution, and therefore the answer is still "something", but at least this way we can have a constructive approximation.
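For moderate $n$ the finite mixture $f_{Y_n+Z}$ can be evaluated exactly by enumerating the $2^n$ points of $S_n$. A minimal Python sketch (the function name and grid are illustrative, not part of the answer above):

```python
import numpy as np
from itertools import product

def mixture_density(t, n):
    """Density of Y_n + Z: an equal-weight mixture of 2^n standard
    Gaussians centred at the points of S_n."""
    centers = np.array([
        2.0 * sum(x / 3.0**k for k, x in enumerate(bits, start=1))
        for bits in product((0, 1), repeat=n)
    ])
    t = np.asarray(t, dtype=float)
    return np.mean(
        np.exp(-0.5 * (t[..., None] - centers)**2), axis=-1
    ) / np.sqrt(2.0 * np.pi)

t_grid = np.linspace(-4.0, 5.0, 901)
f10 = mixture_density(t_grid, 10)   # n = 10: 1024 components
f12 = mixture_density(t_grid, 12)   # n = 12: 4096 components
# f10 and f12 agree to within ~1e-5, since |Y_12 - Y_10| <= 2 * 3^{-10}
# and phi is Lipschitz; the sequence converges fast to the Cantor case.
```

The rapid agreement between successive $n$ is exactly the "constructive approximation" mentioned above: the limit density is pinned down numerically long before $n$ becomes computationally painful.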