Integral of $x' \mapsto e^{-\frac{1}{\alpha}\|x'-x\|_p^\beta}$ over an $\ell_p$-ball around $x$ in $\mathbb R^n$
Let $n \in \mathbb N^*$, $p \in [1,\infty]$, $\alpha > 0$, $\beta \ge 1$, and $r \ge 0$. For $x \in \mathbb R^n$, let $B_{n,p}(x;r) := \{x' \in \mathbb R^n \mid \|x'-x\|_p \le r\}$ be the $\ell_p$-ball around $x$ of radius $r \ge 0$ in $\mathbb R^n$.
Question. What is the value of $u(n,p,r,\alpha,\beta) := \int_{B_{n,p}(x;r)}e^{-\frac{1}{\alpha}\|x'-x\|_p^{\beta}}dx'$ ?
Solution 1:
I will write the justification for exchanging the differentiation and integration that @Ciaran mentioned in the comments; I think that is the best way to prove the formula you wrote in the comments, namely:
$$\int_{B_{n,p}(r)} e^{-\|x \|_p} dx = \omega_{n,p}(1) \int_0^r ns^{n-1} e^{-s} ds,$$
where $\omega_{n,p}(r)$ denotes the volume of the $\ell_p$-ball of radius $r$ in $\mathbb{R}^n$ (so $\omega_{n,p}(1)$ is the volume of the unit ball).
We can argue that the integral above is a function of $r$, so we define
$$ f(r) = \int_{B_{n,p}(r)} e^{-\|x \|_p} dx $$.
Now we compute the derivative. First we bound the difference quotient $f'(r,\epsilon) := \frac{f(r+\epsilon) - f(r)}{\epsilon}$ from above and below to make sure that the derivative exists. Here, since the increment $f(r+\epsilon)-f(r)$ is an integral over the shell $B_{n,p}(r+\epsilon)\setminus B_{n,p}(r)$, on which $r \le \|x\|_p \le r+\epsilon$, we can use the monotonicity of the integrand to argue that
$$ \frac{ (\omega_{n,p}(r + \epsilon) - \omega_{n,p}(r)) e^{-r}}{\epsilon} \geq f'(r, \epsilon) \geq \frac{(\omega_{n,p}(r+\epsilon) - \omega_{n,p}(r))e^{-(r + \epsilon)}}{\epsilon}$$
Now, both sides of the inequality converge to the same limit as $\epsilon \to 0$ (we use that $\omega_{n,p}(r) = r^n \omega_{n,p}(1)$ is differentiable as a function of $r \ge 0$, and that $e^{-r}$ is continuous in $r \ge 0$), so the derivative $f'(r)$ is well-defined and is equal to
$$f'(r) = n\omega_{n,p}(1) r^{n-1} e^{-r}$$
and since $f(0) = 0$, the result now follows from the fundamental theorem of calculus: $f(r) = \int_0^r f'(s)\,ds = \omega_{n,p}(1)\int_0^r n s^{n-1}e^{-s}\,ds$.
(An alternate approach is to use the layer-cake decomposition; it works just fine (that is how I first did it), though it is a bit longer I think, and I had to use integration by parts to show that the two expressions were equal)
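As a quick numerical sanity check of the formula above (not part of the proof), here is a small Python/NumPy/SciPy sketch comparing a Monte Carlo estimate of $\int_{B_{n,p}(r)} e^{-\|x\|_p}dx$ with $\omega_{n,p}(1)\int_0^r n s^{n-1}e^{-s}ds$; the values of $n$, $p$, $r$ and the sample size are arbitrary, and the one-dimensional integral is written via SciPy's regularized incomplete gamma function `gammainc`.

```python
import numpy as np
from scipy.special import gammainc, gamma as Gamma

# Monte Carlo check of  f(r) = omega_{n,p}(1) * int_0^r n s^(n-1) e^(-s) ds.
# n, p, r and the sample size are arbitrary test choices.
rng = np.random.default_rng(0)
n, p, r = 3, 2, 1.5
num_samples = 2_000_000

x = rng.uniform(-r, r, size=(num_samples, n))      # uniform in the enclosing cube [-r, r]^n
norms = np.linalg.norm(x, ord=p, axis=1)
mc = (2 * r) ** n * np.mean(np.exp(-norms) * (norms <= r))

omega_1 = (2 * Gamma(1 + 1 / p)) ** n / Gamma(1 + n / p)  # volume of the unit l_p ball in R^n
closed_form = omega_1 * n * Gamma(n) * gammainc(n, r)     # int_0^r n s^(n-1) e^(-s) ds = n*gamma(n, r)
print(mc, closed_form)                                    # should agree up to Monte Carlo error
```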
Solution 2:
In the comments, the problem has been reduced (via the change of variables $y := x' - x$) to computing the integral $\int_{B_{n,p}(0, r)}e^{-\frac{1}{\alpha}\|y\|_p^\beta}dy$. Below, I carry out this computation, focusing only on the case $p \in \{1,2,\infty\}$.
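The reason for this restriction (used in the computation below) is that the Euclidean norm of the gradient of $y \mapsto \|y\|_p$ is a.e. constant in $y$ only for $p \in \{1, 2, \infty\}$. A minimal Python/NumPy illustration of this fact, evaluated at a handful of arbitrary random points:

```python
import numpy as np

# ||grad ||y||_p ||_2 = sqrt(sum_j |y_j|^(2(p-1))) / ||y||_p^(p-1)  (finite p, y with no
# zero coordinate). This is constant in y for p = 1 (value sqrt(n)) and p = 2 (value 1),
# but genuinely depends on y for other finite p. Test points below are arbitrary.
rng = np.random.default_rng(1)
y = rng.normal(size=(5, 4))                        # five random points in R^4
for p in (1.0, 1.5, 2.0, 3.0):
    grad_norm = (np.sum(np.abs(y) ** (2 * (p - 1)), axis=1) ** 0.5
                 / np.linalg.norm(y, ord=p, axis=1) ** (p - 1))
    print(p, np.round(grad_norm, 4))               # a constant column only for p = 1 and p = 2
```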
Now, let $\varphi :[0, \infty) \rightarrow \mathbb R$ be a measurable function, e.g. $\varphi(t) \equiv e^{-t^\beta/\alpha}$ (with $\alpha,\beta > 0$) in my problem. Define measurable functions $g,z:\mathbb R^n \rightarrow \mathbb R$ by $g(y) := \varphi(\|y\|_p)1_{B_{n,p}(0,r)}(y)$ and $z(y) := \|y\|_p$. For $t \ge 0$, define the level set $z^{-1}(t) := \{y \in \mathbb R^n \mid \|y\|_p = t\}=:S_{n,p}(0,t)$. Note that thanks to the triangle inequality for $\ell_p$-norms, $z$ is Lipschitz and $z^{-1}(t)$ is a.e. smooth for all $t \ge 0$. Finally, note that for finite $p$ we have $\partial_j z(y) = \dfrac{y_j|y_j|^{p-2}}{\|y\|_p^{p-1}}$, and so for every $y \in \mathbb R^n$ with no zero coordinate (hence for a.e. $y$), we have $$ \begin{split} \|\nabla z(y)\|_2 &= \frac{1}{\|y\|_p^{p-1}}\sqrt{\sum_{j=1}^n y_j^2|y_j|^{2(p-2)}}=\frac{1}{\|y\|_p^{p-1}}\sqrt{\sum_{j=1}^n |y_j|^{2(p-1)}}\\ &=a_{n,p} := \begin{cases}1, &\mbox{ if }p \in \{2,\infty\},\\\sqrt{n},&\mbox{ if }p=1.\end{cases} \end{split} $$ (For $p=\infty$, a.e. $y$ has a unique coordinate of maximal modulus, say the $j^*$th, and $\nabla z(y) = \mathrm{sign}(y_{j^*})e_{j^*}$, so $\|\nabla z(y)\|_2 = 1$ a.e. as well.) For $m \in \mathbb N$, let $\mathcal L^m$ be the $m$-dimensional Lebesgue measure (aka $m$-dimensional volume) and let $\mathcal H^m$ be the $m$-dimensional Hausdorff measure (aka $m$-dimensional surface area). Then by the coarea formula, we have
$$ \begin{split} a_{n,p}\int_{B_{n,p}(0,r)}\varphi(\|y\|_p)dy &= \int_{\mathbb R^n}g(y)\|\nabla z(y)\|_2dy\\ &= \int_{\mathbb R}\left(\int_{z^{-1}(t)}g(y)d\mathcal H^{n-1}(y)\right)dt\\ &= \int_{0}^r \varphi(t)\mathcal H^{n-1}(S_{n,p}(0,t))dt\\ &= a_{n,p}\int_{0}^r \varphi(t)\partial_t\mathcal L^n (B_{n,p}(0,t))dt\\ &= a_{n,p}\, n\omega_{n,p}(1)\int_{0}^r \varphi(t)t^{n-1}dt, \end{split} $$ where we've used the fact that $\mathcal L^n(B_{n,p}(0,t)) =: \omega_{n,p}(t) = t^n\omega_{n,p}(1)$, together with the identity $\mathcal H^{n-1}(S_{n,p}(0,t)) = a_{n,p}\,\partial_t\mathcal L^n(B_{n,p}(0,t))$ (which follows from the coarea formula itself, applied with $\varphi \equiv 1$, after differentiating in $t$). Cancelling $a_{n,p}$ gives $$\int_{B_{n,p}(0,r)}\varphi(\|y\|_p)dy = n\omega_{n,p}(1)\int_{0}^r \varphi(t)t^{n-1}dt,$$ in agreement with Solution 1. In particular, let $\varphi(t) \equiv e^{-t^\beta/\alpha}$. Define the lower incomplete gamma function $\gamma:(0,\infty) \times [0, \infty] \rightarrow [0, \infty)$ by $\gamma(a,x) := \int_{0}^x e^{-s}s^{a-1}ds$, and note that $\gamma(a,\infty) \equiv \Gamma(a)$, the ordinary gamma function. Then, letting $u:=u(n,p,r,\alpha,\beta)$, one computes
$$ \begin{split} \frac{\beta}{\alpha^{n/\beta}\Gamma(n/\beta)\,n\omega_{n,p}(1)}u &= \frac{\beta}{\alpha^{n/\beta}\Gamma(n/\beta)\,n\omega_{n,p}(1)}\int_{\|y\|_p \le r}e^{-\frac{1}{\alpha}\|y\|_p^\beta}dy \\ &= \frac{\beta}{\alpha^{n/\beta}\Gamma(n/\beta)}\int_{0}^r e^{-t^\beta/\alpha}t^{n-1}dt\\ &=\frac{\gamma(n/\beta,r^{\beta}/\alpha)}{\Gamma(n/\beta)}, \end{split} \tag{*} $$ where the last step is the substitution $s = t^\beta/\alpha$. This is the CDF, evaluated at $r$, of the Amoroso (generalized gamma) distribution with parameters $\alpha^{1/\beta}$ (the scale), $n$, and $\beta$.
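As a sanity check of (*) (again not part of the derivation), the following Python/SciPy sketch compares a Monte Carlo estimate of $u$ with the closed form $u = n\,\omega_{n,p}(1)\,\frac{\alpha^{n/\beta}}{\beta}\,\gamma(n/\beta, r^\beta/\alpha)$ that (*) encodes, taking $p = 1$ so that $a_{n,p} = \sqrt{n}$ is nontrivial; all parameter values are arbitrary.

```python
import numpy as np
from scipy.special import gammainc, gamma as Gamma

# Monte Carlo check of (*) for p = 1:
#   u = n * omega_{n,p}(1) * (alpha^(n/beta) / beta) * gamma(n/beta, r^beta/alpha).
# All parameter values are arbitrary test choices.
rng = np.random.default_rng(0)
n, p, alpha, beta, r = 3, 1, 0.8, 1.7, 1.2
num_samples = 2_000_000

y = rng.uniform(-r, r, size=(num_samples, n))      # uniform in the enclosing cube [-r, r]^n
norms = np.linalg.norm(y, ord=p, axis=1)
mc = (2 * r) ** n * np.mean(np.exp(-norms**beta / alpha) * (norms <= r))

omega_1 = (2 * Gamma(1 + 1 / p)) ** n / Gamma(1 + n / p)  # = 2^n / n!  for p = 1
a = n / beta
closed_form = n * omega_1 * (alpha**a / beta) * Gamma(a) * gammainc(a, r**beta / alpha)
print(mc, closed_form)   # should agree up to Monte Carlo error
```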
The LHS of (*) is the probability that a random vector $Y$ in $\mathbb R^n$, drawn with density proportional to $e^{-\frac{1}{\alpha}\|Y\|_p^\beta}$, lies within an $\ell_p$-ball of radius $r$ around the origin. In particular, we derive from the above computations that the $k$th moment of $\|Y\|_p$ is given by $$ \mathbb E[\|Y\|_p^k] = \frac{\alpha^{k/\beta}\Gamma((n + k)/\beta)}{\Gamma(n/\beta)} \sim \left(\frac{\alpha n}{\beta}\right)^{k/\beta},\text{ for }n \gg \beta. \tag{**} $$
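A quick numerical check of the exact moment formula in (**), using the same radial reduction as above, namely $\mathbb E[\|Y\|_p^k] = \int_0^\infty t^{n+k-1}e^{-t^\beta/\alpha}\,dt \big/ \int_0^\infty t^{n-1}e^{-t^\beta/\alpha}\,dt$; the asymptotic on the right of (**) is only rough at the moderate (arbitrary) $n$ used here.

```python
from math import exp, gamma as Gamma
from scipy.integrate import quad

# Check E[||Y||_p^k] = alpha^(k/beta) * Gamma((n+k)/beta) / Gamma(n/beta)
# against direct quadrature of the two radial integrals. Parameters are arbitrary.
n, k, alpha, beta = 10, 2, 0.5, 1.5
num, _ = quad(lambda t: t**(n + k - 1) * exp(-t**beta / alpha), 0, float("inf"))
den, _ = quad(lambda t: t**(n - 1) * exp(-t**beta / alpha), 0, float("inf"))
exact = alpha**(k / beta) * Gamma((n + k) / beta) / Gamma(n / beta)
asymptotic = (alpha * n / beta) ** (k / beta)
print(num / den, exact, asymptotic)  # first two agree; third is the large-n approximation
```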
The special case $\beta=1$
In particular, if $\beta=1$, then
$$ \begin{split} \frac{\gamma(n/\beta,r^{\beta}/\alpha)}{\Gamma(n/\beta)}&=\frac{\gamma(n,r/\alpha)}{\Gamma(n)}=\mathbb P\left(\sum_{k=1}^nX_k \le r/\alpha\right)\\ &= \mathbb P\left(\sqrt{n}\left(\frac{\sum_{k=1}^n X_k}{n}-1\right)\le \frac{r/\alpha-n}{\sqrt{n}}\right)\\ &= \Phi\left(\frac{r/\alpha-n}{\sqrt{n}}\right) + \mathcal O\left(\frac{1}{\sqrt{n}}\right), \end{split} $$ where $X_1,\ldots,X_n$ are i.i.d. unit-rate exponential random variables (so that $\sum_{k=1}^n X_k$ has the $\mathrm{Gamma}(n,1)$ distribution), $\Phi$ is the CDF of the standard Gaussian distribution $\mathcal N(0, 1)$, and the last step is the Central Limit Theorem, with the $\mathcal O(1/\sqrt{n})$ rate given by the Berry-Esseen theorem.
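To get a feel for the quality of this Gaussian approximation, here is a short Python/SciPy comparison of the exact value $\gamma(n, r/\alpha)/\Gamma(n)$ (SciPy's regularized `gammainc`) with $\Phi\big((r/\alpha - n)/\sqrt{n}\big)$; the value of $\alpha$ and the choice of radii are arbitrary.

```python
from math import sqrt
from scipy.special import gammainc
from scipy.stats import norm

# Exact CDF gamma(n, r/alpha)/Gamma(n) vs. the Gaussian approximation, for beta = 1.
# alpha and the radii are arbitrary; each radius sits half a standard deviation
# above the mean n of the Gamma(n, 1) distribution.
alpha = 2.0
for n in (10, 100, 1000):
    r = alpha * (n + 0.5 * sqrt(n))
    exact = gammainc(n, r / alpha)                  # regularized lower incomplete gamma
    approx = norm.cdf((r / alpha - n) / sqrt(n))
    print(n, exact, approx)                         # the gap shrinks like O(1/sqrt(n))
```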