Proving algebraically $a^2+b^2\ge a^{\alpha}b^{2-\alpha}$ for $0\le\alpha\le2$ and $a,b\ge0$

My Analysis professor showed this inequality and elegantly proved it using polar coordinates, saying that it can't be done algebraically. Nevertheless, here is how I think I have handled it: first, the inequality clearly holds when $ab=0$; for $a,b>0$, dividing by the RHS we get $$\left(\frac{a}{b}\right)^{2-\alpha}+\left(\frac{a}{b}\right)^{-\alpha}\ge1, $$ or equivalently, setting $t=a/b$ and multiplying through by $t^\alpha>0$, $$t^2+1\ge t^\alpha,$$ which holds because the LHS is $\ge t^2\ge t^\alpha$ for $t\ge1$ and $\ge1\ge t^\alpha$ for $0<t<1$.
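
For completeness, the division step is just the exponent arithmetic $$\frac{a^2+b^2}{a^{\alpha}b^{2-\alpha}}=a^{2-\alpha}b^{\alpha-2}+a^{-\alpha}b^{\alpha}=\left(\frac{a}{b}\right)^{2-\alpha}+\left(\frac{a}{b}\right)^{-\alpha}.$$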

Am I missing something? Are there fancy algebraic ways (perhaps using some linear-algebra inequalities) to prove the inequality?


Turning my comment into an answer: I don't know whether you'll think this is cheating, but assuming without loss of generality that $a\geq b$, then $$a^2+b^2 \geq a^2 = a^\alpha a^{2-\alpha} \geq a^\alpha b^{2-\alpha}. $$
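
The last step uses $a\ge b\ge 0$ together with $2-\alpha\ge 0$: these give $a^{2-\alpha}\ge b^{2-\alpha}$, and multiplying by $a^{\alpha}\ge 0$ yields $a^{\alpha}a^{2-\alpha}\ge a^{\alpha}b^{2-\alpha}$.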