Dirac Delta Function of a Function

I'm trying to show that

$$\delta\big(f(x)\big) = \sum_{i}\frac{\delta(x-a_{i})}{\left|{\frac{df}{dx}(a_{i})}\right|}$$

where $a_{i}$ are the roots of the function $f(x)$. I've tried to proceed by using a test function $g(x)$ and evaluating:

$$\int_{-\infty}^{\infty}dx\,\delta\big(f(x)\big)g(x)$$

Then I make the coordinate substitution $u = f(x)$ and integrate over $u$. This seems to be on the right track, but I'm unsure where the absolute value in the denominator comes from, and also why the result becomes a sum.

$$\int_{-\infty}^{\infty}\frac{du}{\frac{df}{dx}}\delta(u)g\big(f^{-1}(u)\big) = \frac{g\big(f^{-1}(0)\big)}{\frac{df}{dx}\big(f^{-1}(0)\big)}$$

Can anyone shed some light? Wikipedia just states the formula and doesn't actually show where it comes from.


Solution 1:

Substitute $u=f(x)$. Since $\delta(u)$ is non-vanishing only at $u=0$, we can break up the domain of integration into small intervals around each root $\alpha_k$ of $f$, chosen small enough that $f$ is monotonic, hence invertible, on each (this requires the roots to be simple, i.e. $f'(\alpha_k)\neq0$):
$$
\begin{align}
\int\delta(f(x))\,g(x)\,\mathrm{d}x
&=\sum_k\int_{\alpha_k-\epsilon_k}^{\alpha_k+\epsilon_k}\delta(f(x))\,g(x)\,\mathrm{d}x\\
&=\sum_k\int_{f(\alpha_k-\epsilon_k)}^{f(\alpha_k+\epsilon_k)}\delta(u)\,g\!\left(f^{-1}(u)\right)\mathrm{d}f^{-1}(u)\\
&=\sum_k\int_{f(\alpha_k-\epsilon_k)}^{f(\alpha_k+\epsilon_k)}\delta(u)\,\frac{g\!\left(f^{-1}(u)\right)}{f'\!\left(f^{-1}(u)\right)}\,\mathrm{d}u\\
&=\sum_k\frac{g(\alpha_k)}{\left|f'(\alpha_k)\right|}\tag{1}
\end{align}
$$
If $f'(\alpha_k)\lt0$, then $f(\alpha_k+\epsilon_k)\lt f(\alpha_k-\epsilon_k)$, so the limits need to be switched, negating the integral and producing the absolute value.

Equation $(1)$ says that $$ \delta(f(x))=\sum_k\frac{\delta(x-\alpha_k)}{\left|f'(\alpha_k)\right|}\tag{2} $$
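For a concrete instance of $(2)$ (not in the original answer, but standard): take $f(x)=x^2-a^2$ with $a>0$. The roots are $\pm a$ and $f'(\pm a)=\pm 2a$, so
$$
\delta(x^2-a^2)=\frac{\delta(x-a)+\delta(x+a)}{2|a|}.
$$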

Solution 2:

Split the integral into regions around the zeros $a_i$ of $f$ (integration against a delta function gives a nonzero result only in regions where its argument is zero):
$$
\int_{-\infty}^{\infty}\delta\big(f(x)\big)g(x)\,\mathrm{d}x = \sum_{i}\int_{a_i-\epsilon}^{a_i+\epsilon}\delta\big(f(x)\big)g(x)\,\mathrm{d}x
$$
Write out the Taylor expansion of $f$ for $x$ near some $a_i$ (i.e. a different expansion for each term in the summation), using $f(a_i)=0$:
$$
f(a_i+x) = f(a_i) + f'(a_i)\,x + \mathcal{O}(x^2) = f'(a_i)\,x + \mathcal{O}(x^2)
$$
Now, for each term, you can show that the following hold:
$$
\int_{-\infty}^\infty\delta(kx)\,g(x)\,\mathrm{d}x = \frac{1}{|k|}g(0) = \int_{-\infty}^\infty\frac{1}{|k|}\delta(x)\,g(x)\,\mathrm{d}x
$$
(by the substitution $y=kx$, treating $k<0$ and $k>0$ separately; **note**: the trick is in the limits of integration) and
$$
\int_{-\infty}^\infty\delta\big(x+\mathcal{O}(x^2)\big)\,g(x)\,\mathrm{d}x = g(0) = \int_{-\infty}^\infty\delta(x)\,g(x)\,\mathrm{d}x
$$
(making use of the fact that we can take the interval around $0$ as small as we like).
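To make the $k<0$ case explicit: the substitution $y=kx$ reverses the limits of integration, and restoring their order supplies the sign that turns $1/k$ into $1/|k|$:
$$
\int_{-\infty}^{\infty}\delta(kx)\,g(x)\,\mathrm{d}x
=\int_{+\infty}^{-\infty}\delta(y)\,g\!\left(\frac{y}{k}\right)\frac{\mathrm{d}y}{k}
=-\frac{1}{k}\int_{-\infty}^{\infty}\delta(y)\,g\!\left(\frac{y}{k}\right)\mathrm{d}y
=-\frac{g(0)}{k}=\frac{g(0)}{|k|}\qquad(k<0)
$$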

Combine these results, shifting to each of the desired roots, and you obtain the equality you're looking for.
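As a quick numerical sanity check of $(2)$ (a minimal sketch, not part of either answer): approximate $\delta$ by a narrow Gaussian and compare both sides. The choices $f(x)=x^2-1$, $g(x)=\cos x$, the width `eps`, and the grid below are all illustrative assumptions.

```python
import numpy as np

# Check delta(f(x)) = sum_i delta(x - a_i)/|f'(a_i)| numerically,
# with the Dirac delta approximated by a narrow normalized Gaussian
# (a "nascent" delta). All concrete choices here are illustrative.

eps = 1e-3  # Gaussian width; the approximation improves as eps -> 0

def delta_eps(u):
    """Narrow normalized Gaussian approximating the Dirac delta."""
    return np.exp(-u**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))

f  = lambda x: x**2 - 1   # roots a_i = +/- 1
fp = lambda x: 2 * x      # f'(x)
g  = lambda x: np.cos(x)  # smooth test function

# Left side: integrate delta_eps(f(x)) * g(x) over a fine grid
# (plain Riemann sum; the integrand is smooth and well resolved).
x = np.linspace(-5.0, 5.0, 2_000_001)
dx = x[1] - x[0]
lhs = np.sum(delta_eps(f(x)) * g(x)) * dx

# Right side: sum over the roots a_i of g(a_i) / |f'(a_i)|.
rhs = sum(g(a) / abs(fp(a)) for a in (-1.0, 1.0))

print(f"integral of delta_eps(f(x)) g(x): {lhs:.6f}")  # ~ 0.540302
print(f"sum_i g(a_i)/|f'(a_i)|:           {rhs:.6f}")  # cos(1) = 0.540302
```

Both printed values should agree to several decimal places, and the agreement tightens as `eps` shrinks (with a correspondingly finer grid), matching the limit argument in the two solutions above.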