Subadditivity inequality and power functions

Is it true that if $a,b\in\mathbb{R}$ with $a,b\geq 0$ and $0<r<1$, then $(a+b)^r\leq a^r+b^r$?
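A quick numerical sanity check (not a proof; the sampling ranges and tolerance below are arbitrary choices) suggests it does hold:

```python
# Randomly sample a, b >= 0 and r in (0, 1) and test (a + b)**r <= a**r + b**r.
# This is only a spot check, not a proof; ranges and tolerance are arbitrary.
import random

random.seed(0)
for _ in range(100_000):
    a = random.uniform(0, 100)
    b = random.uniform(0, 100)
    r = random.uniform(0.01, 0.99)
    assert (a + b) ** r <= a ** r + b ** r + 1e-9  # tolerance for rounding error
print("no counterexample found")
```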


Solution 1:

Assume $a+b>0$ (otherwise $a=b=0$ and the inequality is trivial). Let $x=a/(a+b)$ and $y=b/(a+b)$. Then $x,y\in[0,1]$, and since $t^r\geq t$ for every $t\in[0,1]$ when $0<r<1$, we get $x^r+y^r\geq x+y=1$. Multiplying through by $(a+b)^r$ gives $a^r+b^r\geq(a+b)^r$.
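The only nontrivial ingredient here is the pointwise bound $t^r\geq t$ on $[0,1]$. A small numerical illustration of that lemma (the exponents and the grid of $t$ values are arbitrary sample choices):

```python
# Illustrate the lemma t**r >= t for t in [0, 1] and 0 < r < 1
# (the exponents and the grid below are arbitrary sample choices).
for r in (0.1, 0.5, 0.9):
    for i in range(101):
        t = i / 100
        assert t ** r >= t
print("t**r >= t held at every grid point")
```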

Solution 2:

Fix $b\ge 0$ and define $f:\mathbb{R}_{\ge 0}\to\mathbb{R}$ by $f(x)=(x+b)^{r}-x^{r}$.

For $x>0$, $f'(x)=\frac{r}{(x+b)^{1-r}}-\frac{r}{x^{1-r}}\le0$, since $0<r<1$ and $(x+b)^{1-r}\ge x^{1-r}>0$. As $f$ is also continuous at $0$, it is non-increasing on $[0,\infty)$.

Thus, for all $x\in\mathbb{R}_{\ge0}$ we have $(x+b)^{r}-x^{r}=f(x)\le f(0)=b^{r}$.

Taking $x=a$ and rearranging gives $(a+b)^{r}\le a^{r}+b^{r}$. Since $b\ge0$ was arbitrary, the inequality holds for all $a,b\in\mathbb{R}_{\ge0}$.
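As a sanity check on the monotonicity argument, here is a small numerical sketch: it samples $f(x)=(x+b)^{r}-x^{r}$ on a grid and confirms it is non-increasing and bounded above by $f(0)=b^{r}$ (the values of $b$, $r$, and the grid are arbitrary choices):

```python
# Numerical check that f(x) = (x + b)**r - x**r is non-increasing on [0, oo)
# and hence bounded above by f(0) = b**r. Sample values of b, r, and the grid
# of x are arbitrary choices for illustration.
def f(x, b, r):
    return (x + b) ** r - x ** r

for b in (0.5, 2.0, 10.0):
    for r in (0.25, 0.5, 0.75):
        xs = [i * 0.1 for i in range(200)]
        vals = [f(x, b, r) for x in xs]
        assert all(v1 >= v2 - 1e-12 for v1, v2 in zip(vals, vals[1:]))  # non-increasing
        assert all(v <= b ** r + 1e-12 for v in vals)                   # f(x) <= f(0) = b**r
print("f was non-increasing and bounded by b**r on every sample")
```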