When are the time-varying vectors $\phi_1(t), \phi_2(t)$, $t \in \mathbb{R}$, linearly independent?
Solution 1:
Suppose $\alpha_{1}t+\alpha_{2}t^{2}=0$ for all $t \in \mathbb{R}$. At $t=1$ we get
$\alpha_{1}+\alpha_{2}=0$, so $\alpha_{2}=-\alpha_{1}$.
Substituting back, $\alpha_{1}t-\alpha_{1}t^{2}=0$ for all $t \in \mathbb{R}$.
At $t=2$ this gives
$2\alpha_{1}-4\alpha_{1}=0$, i.e. $-2\alpha_{1}=0$, so $\alpha_{1}=0$ and, consequently, $\alpha_{2}=0$.
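The two sample times $t=1$ and $t=2$ used above turn the functional equation into a $2\times 2$ linear system, and that system can be checked numerically. A minimal sketch (the matrix rows are $[t, t^2]$ evaluated at the two chosen times):

```python
import numpy as np

# Evaluating alpha_1 * t + alpha_2 * t**2 = 0 at t = 1 and t = 2
# gives the homogeneous system A @ alpha = 0, with rows [t, t**2].
A = np.array([[1.0, 1.0],   # t = 1
              [2.0, 4.0]])  # t = 2

# A nonzero determinant means the only solution is alpha = (0, 0),
# i.e. t and t**2 are linearly independent as functions.
det = np.linalg.det(A)
print(det)  # nonzero (approximately 2.0)

alpha = np.linalg.solve(A, np.zeros(2))
print(alpha)  # only the trivial solution [0. 0.]
```

Any two distinct nonzero sample times would work here; $t=1$ and $t=2$ are just the choices made in the solution above.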
Solution 2:
Linear independence requires that $\alpha_1 \phi_1(t) + \alpha_2 \phi_2(t) = 0$ for all $t$ only when the constants satisfy $\alpha_1 = \alpha_2 = 0$. What you have shown is something different: for each fixed $t$, the equation $\alpha_1 \phi_1(t) + \alpha_2 \phi_2(t) = 0$ admits the nontrivial solution $\alpha_1 = -t \alpha_2$. Since that $\alpha_1$ depends on $t$, the vectors are dependent at each instant, yet no single constant pair works for all $t$, so the functions are independent.
To understand why this makes sense intuitively, consider the geometric interpretation of linear (in)dependence. A (finite) set of vectors is linearly dependent if the number of vectors is larger than the dimension of the space they span. In particular, in the case of just two vectors, dependence means that they lie on the same line; in other words, one is a multiple of the other. In the example you provided that is clearly the case at each fixed instant: both vectors point in the same direction for all times $t>0$. However, their magnitudes grow at different rates, so one cannot be a *constant* multiple of the other. (It is, however, a time-varying scalar multiple.)
I think you might be confused because the term "constant" is sometimes used in places where "scalar" would be more accurate. Linear dependence is such a case.