A multivariate function with bounded partial derivatives is Lipschitz
Solution 1:
The proof is correct and sufficiently detailed. My personal preference is to express it in a more "modular" form by isolating an important fact:
Lemma. A function $f:\mathbb{R}^n\to \mathbb R$ is Lipschitz if and only if there exists a constant $L$ such that the restriction of $f$ to every line parallel to a coordinate axis is Lipschitz with constant $L$.
Notice that the lemma has nothing to do with partial derivatives. One direction is trivial; the other is just the triangle inequality. E.g., $$ |f(a_1,a_2)-f(b_1,b_2)| \le |f(a_1,a_2)-f(b_1,a_2)|+|f(b_1,a_2)-f(b_1,b_2)| \\ \le L|a_1-b_1|+L|a_2-b_2| \le L\sqrt{2}\,\|a-b\| $$ and similarly for general $n$. $\quad\Box$
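(Spelled out, the general-$n$ case is the same telescoping estimate, changing one coordinate at a time: $$ |f(a)-f(b)| \le \sum_{i=1}^{n} \bigl|f(b_1,\dots,b_{i-1},a_i,a_{i+1},\dots,a_n)-f(b_1,\dots,b_{i-1},b_i,a_{i+1},\dots,a_n)\bigr| \le L\sum_{i=1}^{n}|a_i-b_i| \le L\sqrt{n}\,\|a-b\|, $$ where the last step is the Cauchy–Schwarz inequality, so the Lipschitz constant grows at worst like $L\sqrt{n}$.)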
Once you have the lemma, the proof of the claim in your post boils down to setting $L = \max_i(\sup |D_if|)$ and using the one-dimensional Mean Value Theorem to show that the hypothesis of the lemma is satisfied.
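Concretely, that Mean Value Theorem step looks like this (introducing $g$ for the restriction of $f$ to a line parallel to the $i$-th axis): fix an index $i$ and all coordinates except the $i$-th, and set $g(t)=f(x_1,\dots,x_{i-1},t,x_{i+1},\dots,x_n)$. Since $g'(t)=D_i f(x_1,\dots,t,\dots,x_n)$, the one-dimensional MVT gives, for any $s<t$, $$ |g(t)-g(s)| = |D_i f(x_1,\dots,c,\dots,x_n)|\,|t-s| \le L\,|t-s| \qquad\text{for some } c\in(s,t), $$ so every such restriction is Lipschitz with constant $L$, which is exactly the hypothesis of the lemma.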