Confusion about the Change of Variables formula.

Suppose that $g$ is one-to-one and continuously differentiable and that $f$ is continuous. The single-variable change of variables formula on Wikipedia is: $$\int_a^b f(g(x))g'(x)\ dx = \int_{g(a)}^{g(b)} f(x)\ dx.$$ Since $g$ is one-to-one, we can write this as: $$\int_a^b f(x)\ dx = \int_{g^{-1}(a)}^{g^{-1}(b)} f(g(x))g'(x)\ dx.$$
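For concreteness, here is a quick numerical sanity check of the single-variable formula (the choices $f(x)=\cos x$ and the increasing, one-to-one $g(x)=x^3+x$ on $[0,1]$ are just my own examples):

```python
# Check  ∫_a^b f(g(x)) g'(x) dx  =  ∫_{g(a)}^{g(b)} f(x) dx
# for f(x) = cos(x) and g(x) = x^3 + x (one-to-one and increasing) on [0, 1].
import numpy as np
from scipy.integrate import quad

f = np.cos
g = lambda x: x**3 + x
dg = lambda x: 3 * x**2 + 1        # g'(x)
a, b = 0.0, 1.0

lhs, _ = quad(lambda x: f(g(x)) * dg(x), a, b)   # left-hand side
rhs, _ = quad(f, g(a), g(b))                     # right-hand side
print(lhs, rhs)                                  # both ≈ sin(2) ≈ 0.9093
```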

Similarly, the multivariable change of variables formula states that: $$\int_{g(U)} f(v)\ dv = \int_U f(g(u))|\det Dg|\ du.$$ If $g$ is again one-to-one, then we can write this as: $$\int_{U} f(v)\ dv = \int_{g^{-1}(U)} f(g(u))|\det Dg|\ du.$$
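And a numerical check of the multivariable formula in the plane, with my own illustrative choices: $g(r,\theta)=(r\cos\theta, r\sin\theta)$ on $U=[0,1]\times[0,2\pi]$ (one-to-one up to a measure-zero set, which doesn't affect the integrals), so that $g(U)$ is the closed unit disk and $|\det Dg| = r$, together with $f(x,y)=x^2+y^2$:

```python
# Check  ∫_{g(U)} f(v) dv  =  ∫_U f(g(u)) |det Dg| du
# with g = polar coordinates, U = [0,1] x [0,2π], g(U) = unit disk, |det Dg| = r.
import numpy as np
from scipy.integrate import dblquad

f = lambda x, y: x**2 + y**2

# Left side: integrate f directly over the unit disk.
# dblquad integrates func(y, x): x runs over the outer limits, y over the inner ones.
lhs, _ = dblquad(lambda y, x: f(x, y), -1, 1,
                 lambda x: -np.sqrt(1 - x**2),
                 lambda x:  np.sqrt(1 - x**2))

# Right side: pull back to U and include the Jacobian factor |det Dg| = r.
rhs, _ = dblquad(lambda r, theta: f(r * np.cos(theta), r * np.sin(theta)) * r,
                 0, 2 * np.pi, lambda t: 0.0, lambda t: 1.0)

print(lhs, rhs)   # both ≈ π/2 ≈ 1.5708
```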

Now, here is what I'm having trouble with. Suppose $g(t) = -t$ and $U = [0,1]$. We have that $g^{-1}(1) = -1$ and $g^{-1}(0) = 0$, so by the single-variable formula, $$\int_0^1 f(t)\ dt = \int_{0}^{-1} f(-t) \cdot (-1)\ dt.$$ However, as $g^{-1}((0,1)) = (-1,0)$, we have by the multivariable formula, $$\int_0^1 f(t)\ dt = \int_{-1}^0 f(-t) \cdot (-1)\ dt.$$
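To make the conflict concrete, take $f(t)=t^2$ (my own example), so $\int_0^1 f(t)\,dt = 1/3$. Numerically, the two right-hand sides above really do disagree:

```python
# With f(t) = t², compare ∫_0^1 f(t) dt against the two right-hand sides above.
from scipy.integrate import quad

f = lambda t: t**2

direct, _ = quad(f, 0, 1)                         # ∫_0^1 f(t) dt
single, _ = quad(lambda t: f(-t) * (-1), 0, -1)   # ∫_0^{-1} f(-t)·(-1) dt
multi,  _ = quad(lambda t: f(-t) * (-1), -1, 0)   # ∫_{-1}^0 f(-t)·(-1) dt

print(direct, single, multi)   # ≈ 0.333, 0.333, -0.333
```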

So which one is correct? Is there some hidden assumption that $g$ is increasing in the first formula?


Solution 1:

The one-dimensional integral $\int_a^b$ is a "directed" integral. For $a\leq b$, one defines it in the usual way using Riemann sums, and then for the case $a>b$, one defines the symbol $\int_a^b$ to mean $-\int_b^a$. It is this second step, convenient as it is, that is the source of your confusion. Of course, the reason one adopts this convention is so that formulas like $\int_a^b=\int_a^c+\int_c^b$ hold for all $a,b,c$, rather than only for $a\leq c\leq b$.

In higher dimensions (or even in one dimension, when starting from Lebesgue integrals) one doesn't "integrate from point $a$ to point $b$". Rather, one fixes a set $U$ and defines the integral $\int_U$ over that set. For example, if $a\leq b$, one can consider the interval $U=[a,b]$; in this case the higher-dimensional integral $\int_U$ is nothing but $\int_a^b$. In the case $a>b$, we can again consider the interval between $a$ and $b$, which is $U=[b,a]$, and now $\int_U=\int_b^a=-\int_a^b$.

I'll now show how to present the calculation using both the single-variable theorem and the multivariable theorem (which of course must still hold in dimension $n=1$), to convince you that both theorems are right (after all, that's why they're called theorems); you just haven't applied the multivariable theorem correctly, because you forgot the absolute value on the derivative.

So, our hypothesis now is that $f:[0,1]\to\Bbb{R}$ is continuous and $g:\Bbb{R}\to\Bbb{R}$ is the function $g(t)=-t$ (so $g=g^{-1}$).


Calculating with the Single-Variable Formula:

Now, we have \begin{align} \int_0^1f(t)\,dt&=\int_{g^{-1}(0)}^{g^{-1}(1)}f(g(x))\cdot g'(x)\,dx\\ &=\int_{0}^{-1}f(-x)\cdot (-1)\,dx\\ &:=\int_{-1}^0f(-x)\,dx\tag{$*$} \end{align} (the last equality is by the definition $\int_a^b:=-\int_b^a$ explained above).
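If you want to see this numerically, here is a quick check (with the illustrative choice $f(t)=e^t$) that every line of the chain above has the same value $e-1\approx 1.718$:

```python
# Check each step of (*) with f(t) = e^t: all three values equal e - 1.
import numpy as np
from scipy.integrate import quad

f = np.exp

step1, _ = quad(f, 0, 1)                          # ∫_0^1 f(t) dt
step2, _ = quad(lambda x: f(-x) * (-1), 0, -1)    # ∫_0^{-1} f(-x)·(-1) dx
step3, _ = quad(lambda x: f(-x), -1, 0)           # ∫_{-1}^0 f(-x) dx

print(step1, step2, step3)   # all ≈ e - 1 ≈ 1.7183
```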


Calculating with the Multivariable Formula:

Here, we're considering the set $U=[0,1]$. Now, because we're in one dimension, we have $\det Dg= g'$ (the determinant of a $1\times 1$ matrix is just its single entry). So, \begin{align} \int_{[0,1]}f(t)\,dt&=\int_{g^{-1}([0,1])}f(g(x))\cdot |g'(x)|\,dx\\ &=\int_{[-1,0]}f(-x)\cdot 1\,dx\\ &=\int_{[-1,0]}f(-x)\,dx\tag{$**$} \end{align} Now, recall what I said above: if $a\leq b$, then $\int_{[a,b]}=\int_{a}^b$. Hence the single-variable and multivariable calculations $(*)$ and $(**)$ agree.
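And the same numerical check for $(**)$, again with the illustrative $f(t)=e^t$: integrating over the set $[-1,0]$ with the Jacobian factor $|g'(x)|=|-1|=1$ reproduces the original integral:

```python
# Check (**) with f(t) = e^t: the set-based recipe with |g'| = |-1| = 1
# over [-1, 0] gives back ∫_{[0,1]} f(t) dt.
import numpy as np
from scipy.integrate import quad

f = np.exp

original, _ = quad(f, 0, 1)                           # ∫_{[0,1]} f(t) dt
multivar, _ = quad(lambda x: f(-x) * abs(-1), -1, 0)  # ∫_{[-1,0]} f(-x)·|g'(x)| dx

print(original, multivar)   # both ≈ e - 1 ≈ 1.7183
```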


So, as you can see, in the single-variable theorem the effect of having a negative $g'$ (i.e. a decreasing function $g$ for the change of variables) is already taken into account by the notation, via the placement of the limits of integration. So, to answer your final question: NO, there is no hidden assumption that $g$ is increasing.

In higher dimensions, one doesn't have a (nice) ordering, which is why we simply define integrals over sets and put an absolute value on the $\det Dg$ term.