True or False: Every 3-dimensional subspace of $ \Bbb R^{2 \times 2}$ contains at least one invertible matrix.
Here's a nice solution using the fact that $\Bbb R^{2 \times 2}$ has a "dot product" given by $$ \DeclareMathOperator{\tr}{Tr} \langle A,B \rangle = \tr(AB^T) $$ With that, we can describe any $3$-dimensional subspace as $$ S = \{A : \tr(AM) = 0\} $$ for a fixed non-zero matrix $M$ (namely, $M^T$ spans the $1$-dimensional orthogonal complement of $S$). If $M$ is invertible, then $$ A = \pmatrix{1&0\\0&-1}M^{-1} $$ is an invertible element of $S$, since $\tr(AM) = \tr\pmatrix{1&0\\0&-1} = 0$. If $M$ is not invertible, then $M$ has rank $1$, so $M = uv^T$ for non-zero column vectors $u$ and $v$. Since $\tr(Auv^T) = v^TAu$, it suffices to select an invertible $A$ such that $Au$ is perpendicular to $v$; for instance, any rotation carrying $u$ to a direction perpendicular to $v$ works.
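Here is a quick numerical sanity check of both cases (a sketch assuming numpy; the particular matrices $M$, $u$, $v$ are arbitrary examples of mine, not from the answer):

```python
import numpy as np

def in_S(A, M):
    """Check the defining condition tr(AM) = 0 of the subspace S."""
    return np.isclose(np.trace(A @ M), 0.0)

# Case 1: M invertible.
M = np.array([[2.0, 1.0], [1.0, 1.0]])        # det = 1, so invertible
A = np.diag([1.0, -1.0]) @ np.linalg.inv(M)   # the proposed element of S
assert in_S(A, M) and not np.isclose(np.linalg.det(A), 0.0)

# Case 2: M = u v^T has rank one. Rotate u onto a direction perpendicular
# to v, so that tr(A u v^T) = v^T (A u) = 0 while A stays invertible.
u = np.array([1.0, 2.0])
v = np.array([3.0, 1.0])
M = np.outer(u, v)
v_perp = np.array([-v[1], v[0]])
theta = np.arctan2(v_perp[1], v_perp[0]) - np.arctan2(u[1], u[0])
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation, hence invertible
assert in_S(A, M)
print("both constructions verified")
```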
Here's a quick proof which uses special properties of the field $\mathbb{R}$. Consider the set of matrices of the form $\begin{pmatrix}a & -b \\ b & a\end{pmatrix}$. Note that every nonzero matrix in this set is invertible, since such a matrix has determinant $a^2+b^2$ which is nonzero unless $a=b=0$ (here is where we use the fact that our field is $\mathbb{R}$). But these matrices form a $2$-dimensional subspace of $\mathbb{R}^{2\times 2}$, which must have nontrivial intersection with any $3$-dimensional subspace. So any $3$-dimensional subspace contains a nonzero matrix of this form, which is invertible.
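For concreteness, here is a small numpy sketch of the intersection argument, flattening $2\times 2$ matrices to vectors in $\mathbb{R}^4$ (the particular $3$-dimensional subspace $V$ is an arbitrary example I chose, not from the answer):

```python
import numpy as np

# Row-major flattening [m00, m01, m10, m11] of 2x2 matrices into R^4.
J_basis = np.array([[1, 0, 0, 1],    # the identity matrix
                    [0, -1, 1, 0]])  # [[0,-1],[1,0]]
V_basis = np.array([[1, 0, 0, 0],    # three independent matrices spanning V
                    [0, 1, 1, 0],
                    [0, 0, 0, 1]])

# Solve V_basis^T c = J_basis^T d: any nonzero null vector of the 4x5
# matrix [V_basis^T | -J_basis^T] yields a nonzero matrix in both spans.
K = np.hstack([V_basis.T, -J_basis.T])
_, _, vh = np.linalg.svd(K)
null = vh[-1]                        # 4x5 matrix has a nontrivial null space
common = (V_basis.T @ null[:3]).reshape(2, 2)
print(common, "det =", np.linalg.det(common))  # det = a^2 + b^2 > 0
```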
OK, now here's a more complicated proof that works over any field. Let $V\subseteq\mathbb{R}^{2\times 2}$ be $3$-dimensional and let $\{e_1,e_2\}$ be a basis for $\mathbb{R}^2$. Let $W$ be the $2$-dimensional subspace of $\mathbb{R}^{2\times 2}$ consisting of all $A$ such that $A(e_1)=0$. Note that $\dim V=\dim V\cap W+\dim V/(V\cap W)$, and $V\cap W$ and $V/(V\cap W)$ are each at most $2$-dimensional (the latter because $V/(V\cap W)$ embeds in $\mathbb{R}^{2\times 2}/W$, which is $2$-dimensional). So one has dimension $1$, and the other has dimension $2$.
Suppose $\dim V\cap W=1$ so $\dim V/(V\cap W)=2$. Let $A\in V\cap W$ be nonzero, so $A(e_1)=0$ and $A(e_2)\neq 0$. Note that $\dim V/(V\cap W)=2$ means that every element of $\mathbb{R}^{2\times 2}/W$ has a representative in $V$ (the injection $V/(V\cap W)\to\mathbb{R}^{2\times 2}/W$ is then an isomorphism). That is, for any matrix $B$, there is $C\in V$ such that $B-C\in W$, which means $B(e_1)=C(e_1)$. In particular, choosing $B$ such that $B(e_1)$ is linearly independent from $A(e_2)$, there is some $C\in V$ such that $C(e_1)$ is linearly independent from $A(e_2)$. If $C$ is invertible, we're done. Otherwise, $C(e_2)$ is a multiple of $C(e_1)$, and so $C(e_2)+A(e_2)$ is not a multiple of $C(e_1)$. Taking $D=C+A$, we then have that $D(e_1)=C(e_1)$ and $D(e_2)=C(e_2)+A(e_2)$ are linearly independent. Thus $D$ is an invertible element of $V$.
The case that $\dim V\cap W=2$ and $\dim V/(V\cap W)=1$ is similar. Let $A\in V\setminus (V\cap W)$, so $A(e_1)\neq 0$. If $A$ is invertible, we're done; otherwise $A(e_2)$ is a multiple of $A(e_1)$. Since $\dim V\cap W=2$, we have $W\subset V$. In particular, let $B$ be a matrix such that $B(e_1)=0$ and $B(e_2)$ is not a multiple of $A(e_1)$. Then $A(e_2)+B(e_2)$ is not a multiple of $A(e_1)$, and $B\in W\subset V$. So $C=A+B\in V$ is invertible since $C(e_1)=A(e_1)$ and $C(e_2)=A(e_2)+B(e_2)$ are linearly independent.
(In fact, with a little work you can prove you can always choose $e_1$ so that you're in the first case, so the second case is unnecessary.)
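Here is a tiny numerical illustration of the key perturbation step in the first case (a sketch with arbitrary example matrices of mine, not from the proof; $e_1,e_2$ are the standard basis, so "$A(e_1)=0$" means the first column of $A$ is zero):

```python
import numpy as np

C = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # singular: C(e2) = 2 * C(e1)
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # A(e1) = 0, A(e2) = (1, 0) nonzero

# C(e1) = (1, 2) is linearly independent from A(e2) = (1, 0), and C(e2)
# is a multiple of C(e1), so D(e2) = C(e2) + A(e2) cannot be a multiple
# of D(e1) = C(e1).
D = C + A
print(np.linalg.det(D))  # nonzero, so D is invertible
```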
I think this is the most straightforward way to see it, using just the basic operations and dimensional considerations. Let $X$ be a $3$-dimensional subspace, and let $$A=\pmatrix{1&0\\0&0}, B=\pmatrix{0&1\\0&0}, C=\pmatrix{0&0\\1&0},D=\pmatrix{0&0\\0&1}$$
Since $X$ is $3$-dimensional, it must meet the $2$-dimensional span of $A$ and $D$ non-trivially, so it contains some non-trivial linear combination $aA + dD$. If $a$ and $d$ are both non-zero then you're done, since $aA+dD$ is invertible; if not, $X$ contains $A$ or $D$, so assume without loss of generality (relabeling the basis if necessary) that it contains $A$. Similarly, looking at the span of $B$ and $C$ and noting $\det(bB+cC)=-bc$, we may assume WLOG that $X$ contains $B$ (transposing if necessary, which fixes $A$ and swaps $B$ with $C$).
Since $X$ is $3$-dimensional it contains some third matrix linearly independent of $A$ and $B$; this matrix must have some non-zero entry in its bottom row (otherwise it would be a linear combination of $A$ and $B$). By adding suitable multiples of $A$ and $B$ to clear its top row, we see that $X$ contains some matrix
$$\pmatrix{0&0\\x&y}$$
with $x$ or $y$ non-zero. If $y \neq 0$, adding this matrix to $A$ gives $\pmatrix{1&0\\x&y}$, with determinant $y \neq 0$; if $x \neq 0$, adding it to $B$ gives $\pmatrix{0&1\\x&y}$, with determinant $-x \neq 0$. Either way, $X$ contains an invertible matrix.
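In code, the completion step looks like this (a numpy sketch; the third matrix is an arbitrary example of mine):

```python
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 0.0]])
B = np.array([[0.0, 1.0], [0.0, 0.0]])
T = np.array([[5.0, 7.0], [3.0, 2.0]])   # third, independent element of X

R = T - T[0, 0] * A - T[0, 1] * B        # [[0,0],[x,y]], still in X
x, y = R[1]
M = A + R if not np.isclose(y, 0) else B + R   # det(A+R)=y, det(B+R)=-x
print(M, "det =", np.linalg.det(M))      # an invertible element of X
```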
$\newcommand{\Reals}{\mathbf{R}}$Every $2 \times 2$ real matrix can be written uniquely in the form $$ \left[\begin{array}{cc} a + c & b - d \\ b + d & a - c \\ \end{array}\right] = a\left[\begin{array}{rr} 1 & 0 \\ 0 & 1 \\ \end{array}\right] + b\left[\begin{array}{rr} 0 & 1 \\ 1 & 0 \\ \end{array}\right] + c\left[\begin{array}{rr} 1 & 0 \\ 0 & -1 \\ \end{array}\right] + d\left[\begin{array}{rr} 0 & -1 \\ 1 & 0 \\ \end{array}\right] $$ for some real numbers $a$, $b$, $c$, and $d$. The set of non-invertible matrices is the locus $$ \det\left[\begin{array}{cc} a + c & b - d \\ b + d & a - c \\ \end{array}\right] = a^{2} + d^{2} - (b^{2} + c^{2}) = 0, $$ which is a cone on a product of circles. (The intersection with the unit $3$-sphere is the Clifford torus.) A linear subspace contained in this locus is totally isotropic for the quadratic form $a^{2} + d^{2} - b^{2} - c^{2}$, which has signature $(2,2)$, so it has dimension at most $2$. In particular, no three-dimensional subspace of $\Reals^{2 \times 2}$ is contained in the set of non-invertible matrices. Consequently, in every three-dimensional subspace of $\Reals^{2 \times 2}$, the set of non-invertible matrices is a proper closed algebraic set, and the set of invertible matrices is open and dense.
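A quick numerical check of the determinant identity above (a sketch assuming numpy; the random test is my own sanity check, not part of the answer):

```python
import numpy as np

rng = np.random.default_rng(0)
I, S, P, R = (np.array(m, dtype=float) for m in
              ([[1, 0], [0, 1]], [[0, 1], [1, 0]],
               [[1, 0], [0, -1]], [[0, -1], [1, 0]]))

for _ in range(1000):
    a, b, c, d = rng.normal(size=4)
    M = a * I + b * S + c * P + d * R
    assert np.isclose(np.linalg.det(M), a**2 + d**2 - (b**2 + c**2))
print("determinant identity verified")
```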