Maximal points on an $n$-dimensional ellipsoid
First, rewrite the equation of your ellipsoid using matrices:
$$\sum_{i=1}^n \frac{(\vec{x} \cdot \vec{v}_i )^2}{\lambda_i^2} = 1 \quad\iff\quad \vec{x}^T \Lambda\,\vec{x} = 1 \quad\text{ where }\quad \Lambda = \sum_{i=1}^n\frac{\vec{v}_i \otimes \vec{v}_i}{\lambda_i^2} $$ where $\otimes$ denotes the outer product of two column vectors.
The unit normal vector $\vec{n}$ at a point $\vec{x}$ on the ellipsoid will be in the direction of the gradient:
$$\vec{n}\;\propto\; \vec{\nabla}(\vec{x}^T \Lambda\,\vec{x}) = 2 \Lambda\vec{x}$$
Let $\mu \in \mathbb{R}$ be the number such that
$$\Lambda \vec{x} = \mu \vec{n} \quad\iff\quad \vec{x} = \mu \Lambda^{-1} \vec{n}$$
We have $$\mu^2 \vec{n}^T \Lambda^{-1} \vec{n} = \vec{x}^T\Lambda\vec{x} = 1 \quad\implies\quad \mu = \frac{1}{\sqrt{\vec{n}^T\Lambda^{-1}\vec{n}}} \quad\implies\quad \vec{x} = \frac{\Lambda^{-1}\vec{n}}{\sqrt{\vec{n}^T\Lambda^{-1}\vec{n}}} $$ Given any direction in the form of a unit vector $\vec{e}$, at the point $\vec{x}$ on the ellipsoid which maximizes $\vec{x}\cdot\vec{e}$, the corresponding normal vector $\vec{n}$ points along $\vec{e}$. This means the extent of the ellipsoid along the direction $\vec{e}$ is given by
$$\vec{e}\cdot\vec{x} = \vec{n}\cdot\vec{x} = \sqrt{ \vec{n}^T\Lambda^{-1}\vec{n} } = \sqrt{ \vec{e}^T\Lambda^{-1}\vec{e} }$$
For example, to obtain the extent of the ellipsoid along the $x$-direction, one just needs to compute the inverse matrix $\Lambda^{-1}$ and take the square root of its first diagonal element!
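The recipe above is easy to check numerically. The sketch below builds $\Lambda$ from the semi-axes, recovers the point with a prescribed unit normal, and reads off the $x$-extent; the semi-axis lengths and directions are hypothetical values chosen for illustration:

```python
import numpy as np

# Hypothetical ellipse: semi-axes lam_i along orthonormal directions v_i.
lam = [3.0, 2.0]
V = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
Lam = sum(np.outer(v, v) / l**2 for l, v in zip(lam, V))
Lam_inv = np.linalg.inv(Lam)

# Point on the ellipsoid with a prescribed unit normal n:
# x = Lam^{-1} n / sqrt(n^T Lam^{-1} n)
n = np.array([1.0, 1.0]) / np.sqrt(2.0)
x = Lam_inv @ n / np.sqrt(n @ Lam_inv @ n)
print(x @ Lam @ x)             # ≈ 1: x lies on the ellipsoid

# Extent along the x-direction: sqrt of the first diagonal element of Lam^{-1}.
print(np.sqrt(Lam_inv[0, 0]))  # → 3.0, the semi-major axis in this aligned case
```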
Update
Let us illustrate with the $2$-dimensional example from the question. Instead of using $c$ for the angle between the major axis and the $x$-axis, we will call that angle $\theta$, and redefine $c = \cos\theta$ and $s = \sin\theta$. We then have
$$ \lambda_1 = a,\;\lambda_2 = b,\; \vec{v}_1 = \begin{pmatrix} c \\ s\end{pmatrix},\; \vec{v}_2 = \begin{pmatrix}-s \\ c\end{pmatrix} $$ The matrix $\Lambda$ is now given by
$$\frac{1}{a^2} \begin{pmatrix}c \\ s\end{pmatrix} \otimes \begin{pmatrix}c \\ s\end{pmatrix} + \frac{1}{b^2} \begin{pmatrix}-s \\ c\end{pmatrix} \otimes \begin{pmatrix}-s \\ c\end{pmatrix} = \frac{1}{a^2} \begin{pmatrix} c^2 & sc \\ sc & s^2\end{pmatrix} + \frac{1}{b^2} \begin{pmatrix} s^2 & -sc\\-sc & c^2\end{pmatrix} $$ Expanding everything out, we find $$\Lambda = \begin{pmatrix} \frac{c^2}{a^2} + \frac{s^2}{b^2} & \frac{sc}{a^2} - \frac{sc}{b^2}\\ \frac{sc}{a^2} - \frac{sc}{b^2} & \frac{s^2}{a^2} + \frac{c^2}{b^2} \end{pmatrix} \quad\implies\quad \Lambda^{-1} = \frac{1}{\Delta} \begin{pmatrix} \frac{s^2}{a^2} + \frac{c^2}{b^2} & \frac{sc}{b^2} - \frac{sc}{a^2}\\ \frac{sc}{b^2} - \frac{sc}{a^2} & \frac{c^2}{a^2} + \frac{s^2}{b^2} \end{pmatrix} $$ where $\Delta = \det\Lambda = \frac{1}{a^2b^2}$. From this, we can read off the maximal $x$-extent of the ellipse as
$$ x_{max} = \sqrt{\begin{pmatrix}1 \\ 0\end{pmatrix}^T \Lambda^{-1}\begin{pmatrix}1 \\ 0\end{pmatrix}} = ab \sqrt{\frac{s^2}{a^2} + \frac{c^2}{b^2}} = \sqrt{a^2\cos^2\theta + b^2\sin^2\theta} $$ This is a simplified version of the expression for $x_{max}$ in the question.
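The closed form can be cross-checked against the matrix recipe; the semi-axes $a$, $b$ and tilt angle $\theta$ below are hypothetical values:

```python
import numpy as np

a, b, theta = 3.0, 2.0, 0.7        # hypothetical semi-axes and tilt angle
c, s = np.cos(theta), np.sin(theta)
v1, v2 = np.array([c, s]), np.array([-s, c])  # rotated principal directions
Lam = np.outer(v1, v1) / a**2 + np.outer(v2, v2) / b**2

x_max_matrix = np.sqrt(np.linalg.inv(Lam)[0, 0])       # sqrt of first diagonal element
x_max_closed = np.sqrt(a**2 * c**2 + b**2 * s**2)      # closed form above
print(x_max_matrix, x_max_closed)  # the two values agree
```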
The problem is to maximize $\langle x,e_1\rangle$ subject to $\sum_i \lambda_i^2\langle x,v_i\rangle^2 = 1$. (Well, $\le 1$, but by convexity the extreme value is attained on the boundary.) Since the $v_i$ are an orthonormal basis, $$ \langle x,e_1\rangle = \sum_i \langle x,v_i\rangle \langle e_1,v_i\rangle = \sum_i \lambda_i \langle x,v_i\rangle \cdot \frac{\langle e_1,v_i\rangle}{\lambda_i} $$ Now use Cauchy-Schwarz (the equality case of which will tell you enough to get a formula for $x = \sum_i \langle x,v_i\rangle v_i$).
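This Cauchy-Schwarz argument can also be sketched numerically in this answer's convention ($\sum_i \lambda_i^2\langle x,v_i\rangle^2 = 1$). The $\lambda_i$ values below are hypothetical and the basis $v_i$ is a random orthonormal one; the equality case of Cauchy-Schwarz gives the maximizer explicitly:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = np.array([1.5, 2.0, 0.5])             # hypothetical lambda_i
V, _ = np.linalg.qr(rng.standard_normal((3, 3)))
V = V.T                                     # rows V[i] form an orthonormal basis v_i
e1 = np.array([1.0, 0.0, 0.0])

# Cauchy-Schwarz: <x,e1> = sum_i (lam_i <x,v_i>) * (<e1,v_i>/lam_i)
#                       <= sqrt(sum_i lam_i^2 <x,v_i>^2) * sqrt(sum_i <e1,v_i>^2/lam_i^2)
coeffs = V @ e1                             # <e1, v_i>
bound = np.sqrt(np.sum(coeffs**2 / lam**2))

# Equality case: lam_i <x,v_i> proportional to <e1,v_i>/lam_i
x_coords = coeffs / lam**2 / bound          # <x, v_i>
x = V.T @ x_coords                          # x = sum_i <x,v_i> v_i

print(np.sum(lam**2 * x_coords**2))         # constraint: ≈ 1
print(x @ e1, bound)                        # objective value attains the bound
```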