Is the empty set linearly independent or linearly dependent?
Is the empty set linearly independent or dependent?
By definition, it is linearly independent, because it is not linearly dependent.
A set $S$ is linearly dependent if there exist finitely many vectors $v_1,\dots, v_n \in S$ and corresponding scalars $\alpha_1,\dots,\alpha_n$, at least one of which is nonzero, such that $$\sum_{i=1}^n \alpha_i v_i=0.$$
Remark: (equivalently, we could demand that all the $\alpha_i$ are nonzero, but then we would also need to demand that at least one $\alpha_i$ exists at all. This is because the empty collection of scalars vacuously satisfies the demand that every element of it is nonzero.)
Clearly, no finite collection of vectors from $\{\}$ satisfies the above condition, because there is no such collection at all (other than the empty one, which contains no nonzero scalar).
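To spell the $S=\{\}$ case out against this definition (relying only on the standard convention that an empty sum of vectors equals the zero vector): a witness to linear dependence would need either some $n\ge 1$ with $v_1\in\{\}$, which is impossible, or $n=0$, in which case $$\sum_{i=1}^{0}\alpha_i v_i=0$$ does hold, but there is no index $i$ at all, let alone one with $\alpha_i\neq 0$.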
Furthermore, the empty set is also a basis of the vector subspace $\{0\}$, because $\{0\}$ is the smallest vector space that includes $\{\}$, i.e. $\operatorname{span}(\{\})=\{0\}$.
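As a quick check of that last claim (again using only the empty-sum convention): the span of a set is the collection of all finite linear combinations of its elements, so $$\operatorname{span}(\{\}) = \left\{\, \sum_{i=1}^{n}\alpha_i v_i \;:\; n\ge 0,\ v_i\in\{\},\ \alpha_i \text{ scalars} \right\} = \{0\},$$ since the only linear combination that can be formed from no vectors is the empty sum.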
In a way, you could also reason in a round-about fashion that $\{\}$ is linearly independent like this:
- You know that $\{0\}$ is a vector space.
- You know that every vector space has a basis.
- You know that any basis of $\{0\}$ is a subset of $\{0\}$, so a basis of $\{0\}$ can only be $\{\}$ or $\{0\}$.
- You know that $\{0\}$ is not a basis because it is not linearly independent (since $1\cdot 0=0$ is a nontrivial relation).
- Therefore, $\{\}$ is a basis.
- Because all bases are linearly independent, so is $\{\}$.
Note that this isn't really a "good" proof, because it commits a sort of begging-the-question fallacy. It wasn't meant as a proof in the mathematical sense, just a proof "to yourself" that you already know the empty set is linearly independent: that is the only way every vector space can have a basis, and you know that statement is true (and, in fact, you or someone else must have used the fact that the empty set is linearly independent while proving it).
Since the correct definition of "linearly dependent" has not been spelled out in detail in any of the answers so far, let me add a new answer. A subset $S$ of a vector space $V$ is defined to be linearly dependent if there exist finitely many distinct elements $s_1,s_2,\dots,s_n\in S$ and scalars $c_1,c_2,\dots,c_n$ which are not all $0$ such that $$\sum_{i=1}^n c_is_i=0.$$
(If $S$ itself is finite, you can just state this condition with $s_1,\dots,s_n$ being all the elements of $S$, since given any relation of this form you can just make the coefficients be $0$ for all the elements of $S$ you have not used.)
When $S=\emptyset$, the only possible collection of finitely many distinct elements of $S$ is the empty collection, with $n=0$. But there does not exist any collection of $0$ scalars, not all of which are $0$. After all, if you have a collection of no scalars, then vacuously all the scalars in your collection are $0$.
Thus $\emptyset$ is linearly independent (as a subset of any vector space).
It is linearly independent.
If a set is linearly dependent, then there is a nontrivial linear combination of the vectors in the set that adds up to the zero vector, and no such combination can be formed from the empty set. It is also impossible to choose a vector in the empty set and write it as a linear combination of the other vectors in the empty set, since the empty family contains no vectors at all.
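For completeness, the two formulations used here are equivalent (a standard fact for vector spaces over a field, not something specific to this answer): if $\sum_{i=1}^{n}c_i s_i = 0$ with some $c_j\neq 0$, then $$s_j = -\sum_{i\neq j}\frac{c_i}{c_j}\,s_i,$$ i.e. some vector is a linear combination of the others, and conversely. Neither situation can arise when there are no vectors to choose from.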
Not only is it linearly independent, but it is also the basis for $\{\textbf{0}\}$.