Variation of Parameters: why do we assume the "constraint" $v'_1\left(t\right)y_1\left(t\right)+v_2'\left(t\right)y_2\left(t\right)=0$? [duplicate]
Solution 1:
To see why this condition is natural, and not at all “magical”, consider the more general situation of a system of first order ODEs: $$ \mathbf{y}'(t) = A(t) \, \mathbf{y}(t) + \mathbf{f}(t) , $$ where $A(t)$ is an $n \times n$ matrix, and $\mathbf{y}(t)$ and $\mathbf{f}(t)$ are column vectors.
If we are lucky enough to know a fundamental solution of the homogeneous equation $\mathbf{y}'(t) = A(t) \, \mathbf{y}(t)$, i.e., a nonsingular matrix $\Phi(t)$ such that $$ \Phi'(t) = A(t) \, \Phi(t) , $$ then (as a short computation shows) the substitution $$ \mathbf{y}(t) = \Phi(t) \, \mathbf{v}(t) $$ turns the ODE system into the simpler system $$ \Phi(t) \, \mathbf{v}'(t) = \mathbf{f}(t) , $$ where we can multiply by $\Phi(t)^{-1}$ and integrate to find $\mathbf{v}(t)$, and hence find $\mathbf{y}(t)$. That is, if we know the general solution of the homogeneous equation, then we can solve the inhomogeneous equation by quadratures too. This is “variation of constants” since $\mathbf{y}(t) = \Phi(t) \, \mathbf{c}$ with an arbitrary constant vector $\mathbf{c}$ is the general solution of the homogeneous equation, and here we have replaced $\mathbf{c}$ with the $t$-dependent vector $\mathbf{v}(t)$.
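For completeness, the short computation is this: differentiating the substitution $\mathbf{y}(t) = \Phi(t) \, \mathbf{v}(t)$ and using $\Phi'(t) = A(t) \, \Phi(t)$ gives $$ \mathbf{y}'(t) = \Phi'(t) \, \mathbf{v}(t) + \Phi(t) \, \mathbf{v}'(t) = A(t) \, \Phi(t) \, \mathbf{v}(t) + \Phi(t) \, \mathbf{v}'(t) = A(t) \, \mathbf{y}(t) + \Phi(t) \, \mathbf{v}'(t) , $$ and comparing this with the original system $\mathbf{y}'(t) = A(t) \, \mathbf{y}(t) + \mathbf{f}(t)$ leaves exactly $\Phi(t) \, \mathbf{v}'(t) = \mathbf{f}(t)$.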
Now, a linear second-order ODE $$ y''(t) + \alpha(t) \, y'(t) + \beta(t) \, y(t) = f(t) $$ can be written as a first-order linear system by letting $y_1=y$ and $y_2=y'$: $$ \frac{d}{dt} \begin{pmatrix} y_1(t) \\ y_2(t) \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ -\beta(t) & -\alpha(t) \end{pmatrix} \begin{pmatrix} y_1(t) \\ y_2(t) \end{pmatrix} + \begin{pmatrix} 0 \\ f(t) \end{pmatrix} . $$ If the general solution of the homogeneous equation $y''(t) + \alpha(t) \, y'(t) + \beta(t) \, y(t) = 0$ is $y(t) = A \, g_1(t) + B \, g_2(t)$, then $$ \Phi(t) = \begin{pmatrix} g_1(t) & g_2(t) \\ g_1'(t) & g_2'(t) \end{pmatrix} $$ is a fundamental matrix for the system, so if we substitute as described above we obtain the simpler system $$ \begin{pmatrix} g_1(t) & g_2(t) \\ g_1'(t) & g_2'(t) \end{pmatrix} \begin{pmatrix} v_1'(t) \\ v_2'(t) \end{pmatrix} = \begin{pmatrix} 0 \\ f(t) \end{pmatrix} . $$ Here, the first row is precisely the constraint $g_1(t) \, v_1'(t) + g_2(t) \, v_2'(t) = 0$ that you are asking about, while the second row gives the other condition $g_1'(t) \, v_1'(t) + g_2'(t) \, v_2'(t) = f(t)$ for the sought “varying constants” $v_1(t)$ and $v_2(t)$.
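As a concrete illustration (my own example, not part of the original answer), here is a short SymPy sketch that sets up and solves the $2 \times 2$ system above for $y'' + y = \sec t$, where $g_1 = \cos t$ and $g_2 = \sin t$ solve the homogeneous equation:

```python
import sympy as sp

t = sp.symbols('t')

# Homogeneous solutions of y'' + y = 0 and the forcing term (example choices)
g1, g2 = sp.cos(t), sp.sin(t)
f = sp.sec(t)

# The 2x2 system from the answer:
#   g1  v1' + g2  v2' = 0
#   g1' v1' + g2' v2' = f
Phi = sp.Matrix([[g1, g2], [sp.diff(g1, t), sp.diff(g2, t)]])
v1p, v2p = Phi.solve(sp.Matrix([0, f]))

v1 = sp.integrate(sp.simplify(v1p), t)   # integral of -tan(t), i.e. log(cos(t))
v2 = sp.integrate(sp.simplify(v2p), t)   # integral of 1, i.e. t
yp = v1*g1 + v2*g2                       # particular solution

# Verify: yp'' + yp - f should simplify to 0
residual = sp.simplify(sp.diff(yp, t, 2) + yp - f)
```

This recovers the textbook particular solution $y_p = \cos(t)\ln(\cos t) + t\sin(t)$, and the residual check confirms that imposing the first-row constraint really does produce a solution of the inhomogeneous equation.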
Solution 2:
This is not an assumption but an educated "guess" (formally called an ansatz) about what one of the conditions could be; because a solution is indeed found, the guess is justified after the fact. The constraint is also practically important. Without it, differentiating $y = v_1 y_1 + v_2 y_2$ twice produces second derivatives $v_1''$ and $v_2''$, which 1) gives an equation more complicated than the one we started with and 2) leaves two unknown functions with only one condition, which leads nowhere. Imposing the constraint eliminates the second derivatives of the $v_i$ and supplies the missing second condition.
Sidenote: The solution form for this method, $ y(t) = v_1(t)y_1(t) + v_2(t)y_2(t) $, is also an ansatz. The method of undetermined coefficients works the same way: make a guess, then verify that the guess is right.