Constrained variational problems intuition

Solution 1:

I think part of the confusion comes from the notation, so let me restate your original problem: \begin{align*} \text{minimize}\quad & J(y) = \int_{x_0}^{x_1} F(x, y(x), y'(x)) \, \mathrm{d} x \\ \text{subject to}\quad & G(x, y(x), y'(x)) = 0 \quad\text{for all } x \in [x_0,x_1]. \end{align*} Here, $F : \mathbb{R} \times \mathbb{R} \times \mathbb{R} \to \mathbb{R}$ and $G : \mathbb{R} \times \mathbb{R} \times \mathbb{R} \to \mathbb{R}^n$. Do you see the difference? $J$ depends only on the function $y$, whereas the integrand $F$ and the constraint $G$ are functions of real numbers: they are evaluated at the triple $(x, y(x), y'(x))$.
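To make this concrete, here is a toy instance (my own hypothetical example, not taken from your question): take $F(x, u, v) = \tfrac12 v^2$ and $G(x, u, v) = v - u$, so $n = 1$. The problem then reads $$\text{minimize } \int_{x_0}^{x_1} \tfrac12 y'(x)^2 \, \mathrm{d} x \quad\text{subject to}\quad y'(x) - y(x) = 0 \text{ for all } x \in [x_0, x_1],$$ where $F$ and $G$ eat the real numbers $x$, $y(x)$, $y'(x)$, while $J$ eats the whole function $y$. (The instance is deliberately simple; the constraint already determines $y$ up to a constant, but it is only meant to illustrate the notation.)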

Now (if a constraint qualification is satisfied), you get a multiplier $\lambda : [x_0, x_1] \to \mathbb{R}^n$ (compare with section 6.2 in your link: you get a multiplier for each constraint, that is, one for each $x$), such that the derivative of the Lagrangian $$J(y) + \int_{x_0}^{x_1} \lambda(x)^\top G(x, y(x), y'(x)) \, \mathrm{d} x$$ with respect to $y$ is zero (that is, the derivative of your Lagrangian w.r.t. the optimization variable). From here you can proceed as in the derivation of the Euler-Lagrange equation.
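Spelled out (writing $u$ and $v$ for the second and third arguments of $F$ and $G$, and assuming enough smoothness for the standard derivation), this stationarity condition is just the Euler-Lagrange equation of the augmented integrand $F + \lambda^\top G$: $$\frac{\partial}{\partial u}\Big(F + \lambda(x)^\top G\Big) - \frac{\mathrm{d}}{\mathrm{d} x}\,\frac{\partial}{\partial v}\Big(F + \lambda(x)^\top G\Big) = 0 \quad\text{for all } x \in [x_0, x_1],$$ where everything is evaluated at $(x, y(x), y'(x))$ and the total derivative $\mathrm{d}/\mathrm{d} x$ also acts on the $x$-dependence of $\lambda$.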

@2.: Yes, this is correct, but the derivative of $J$ w.r.t. $y$ does not equal the derivative of $J_1$ w.r.t. $y$.

@3.: As joriki already said, you have to solve the resulting Euler-Lagrange equation together with the constraint. In other words: the Euler-Lagrange equation depends on $\lambda$. Once you have fixed $\lambda$, it has a unique solution $y$ (depending on $\lambda$). It remains to choose $\lambda$ such that the corresponding $y$ satisfies the constraint (this is reasonable, since you have as many constraint equations as degrees of freedom in $\lambda$).
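If you want to see the mechanics, here is a minimal SymPy sketch for the toy instance from above ($F = \tfrac12 y'^2$, $G = y' - y$, $n = 1$; again a hypothetical example, not your actual problem). It only derives the Euler-Lagrange equation of the augmented integrand and states the constraint; it does not solve the coupled system:

```python
import sympy as sp
from sympy.calculus.euler import euler_equations

# Toy instance (hypothetical, only for illustration):
#   F(x, y, y') = (1/2) * y'^2,   G(x, y, y') = y' - y,   n = 1.
x = sp.symbols('x')
y = sp.Function('y')
lam = sp.Function('lam')          # the multiplier lambda(x), one value per x

F = sp.Rational(1, 2) * y(x).diff(x)**2
G = y(x).diff(x) - y(x)

# Augmented integrand F + lambda(x) * G; lambda is held fixed while we
# vary with respect to y only.
L = F + lam(x) * G

# Euler-Lagrange equation of the augmented integrand with respect to y,
# i.e.  -lambda(x) - lambda'(x) - y''(x) = 0:
el_eq = euler_equations(L, [y(x)], [x])[0]
print(el_eq)

# This has to be solved *together* with the constraint G = 0,
# which is what pins down lambda along with y:
print(sp.Eq(G, 0))
```

The two printed equations are exactly the coupled system described above: the Euler-Lagrange equation involves $\lambda$, and the constraint (together with the boundary conditions) determines $\lambda$ along with $y$.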