Why is a linear combination of solutions also a solution?

I'm working on DEs and I've come across some speedbumps. I'm not very familiar with linear algebra and that might be the problem. However, I'm sure that there's a non-linear algebra explanation that I could understand.

Here's what I know right now:

There's a thing called the Principle of Superposition, which says that if $y_0$ and $y_1$ are linearly independent solutions to a given linear DE, then $C_0y_0+C_1y_1$ is also a solution. (The DE might have to be homogeneous; I'm not sure.) I don't understand why this is true.

If neither solution is a multiple of the other, then the two are linearly independent. Beyond that, I don't know what linear independence means here.

Then, I want to learn how to show that this sum is a general solution. This is where my book introduces Cramer's Rule. Is there an intuitive explanation that will explain to me what is happening here?

I'm learning from online sources and trying to mash together an explanation, but they all tend to skip over the details here, or assume some kind of precognition. (You know, where something low on the 'math tree' is explained in terms of higher-level mathematics.)

When I work with a 1st-order DE, I understand that a general solution is a 'family' of solutions, parametrized by the unknown constant. They're a one-dimensional family, as it were. I can represent this family with a direction field.

With a 2nd-order DE, I guess that the general solution is a 2-dimensional family. I don't know how I would represent that geometrically. Maybe parametrically with a direction field? I'm just guessing.

Any explanations of any of these things would be very much appreciated.


This is true for a homogeneous linear differential equation. The reason is that you are looking for a solution to $$\left(\sum_{n=0}^k a_nD^n\right)y = 0$$ where $D$ is the derivative operator. Now, if $y_1$ and $y_2$ are solutions, then $$\left(\sum_{n=0}^k a_nD^n\right)y_1 = 0$$ $$\left(\sum_{n=0}^k a_nD^n\right)y_2 = 0$$ Take any two real numbers $c_1$ and $c_2$ and observe, using the linearity of the operator, that $$\left(\sum_{n=0}^k a_nD^n\right)(c_1y_1+c_2y_2) = c_1\left(\sum_{n=0}^k a_nD^n\right)y_1 + c_2\left(\sum_{n=0}^k a_nD^n\right)y_2 = 0$$

Thus we showed that a linear combination of solutions is also a solution.
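To make this concrete, here's a quick symbolic check with Python's `sympy` (my choice of tool, not from the original post). I pick the specific homogeneous equation $y'' + y = 0$, whose operator is $D^2 + 1$, with the known solutions $\sin x$ and $\cos x$:

```python
import sympy as sp

x, c1, c2 = sp.symbols('x c1 c2')

# Two independent solutions of the homogeneous equation y'' + y = 0
y1 = sp.sin(x)
y2 = sp.cos(x)

def L(y):
    """Apply the linear operator (D^2 + 1) to y."""
    return sp.diff(y, x, 2) + y

print(sp.simplify(L(y1)))             # 0  -> y1 is a solution
print(sp.simplify(L(y2)))             # 0  -> y2 is a solution
print(sp.simplify(L(c1*y1 + c2*y2)))  # 0  -> so is any linear combination
```

The last line is the point: with $c_1$ and $c_2$ left as arbitrary symbols, the operator still sends $c_1y_1 + c_2y_2$ to zero, which is exactly the superposition argument above carried out on one example.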


Do you recall from Calculus that for constants $c$, differentiable functions $f(x)$ and $g(x)$, and integers $n\geq 0$ we have the following two facts? $$\frac{d^n}{dx^n}\bigl[c\cdot f(x)\bigr]=c\cdot\frac{d^n}{dx^n}\bigl[f(x)\bigr]\tag{1}$$ and $$\frac{d^n}{dx^n}\bigl[f(x)+g(x)\bigr]=\frac{d^n}{dx^n}\bigl[f(x)\bigr]+\frac{d^n}{dx^n}\bigl[g(x)\bigr]\tag{2}$$

The same idea is in play here with linear homogeneous DEs (they do need to be homogeneous): the operator is built from derivatives, so it pulls out constants and distributes over sums.
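You can verify facts (1) and (2) symbolically with Python's `sympy` (my choice of tool, not from the original answer), using abstract functions $f$ and $g$ and a fixed derivative order $n = 3$ picked just for illustration:

```python
import sympy as sp

x, c = sp.symbols('x c')
f = sp.Function('f')(x)
g = sp.Function('g')(x)
n = 3  # any fixed order works; n = 3 is an arbitrary choice

# Fact (1): constants pull out of the derivative
lhs1 = sp.diff(c * f, x, n)
rhs1 = c * sp.diff(f, x, n)
print(sp.simplify(lhs1 - rhs1))  # 0

# Fact (2): the derivative of a sum is the sum of the derivatives
lhs2 = sp.diff(f + g, x, n)
rhs2 = sp.diff(f, x, n) + sp.diff(g, x, n)
print(sp.simplify(lhs2 - rhs2))  # 0
```

Since every term $a_n D^n$ in the operator obeys both facts, so does their sum, which is why the operator as a whole is linear.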