Does the inverse of the matrix always rely on the determinant of a matrix?

I always thought that if the determinant of a matrix $A$ is $0$ then it has no inverse $A^{-1}$, until I saw an exercise in Contemporary Abstract Algebra by Gallian. It asks me to prove that the set of $2\times2$ matrices of the form $$\begin{bmatrix} a&a\\ a&a\\ \end{bmatrix}\,,$$ where $a \neq 0$ and $a \in \mathbb R$, is a group under matrix multiplication. Every matrix in this set has determinant $0$, yet each one has an inverse, which is nothing but $$\begin{bmatrix} \frac{a}{2} & \frac{a}{2} \\ \frac{a}{2} & \frac{a}{2} \\ \end{bmatrix}\,.$$

How is this possible? What lets these matrices escape the rule that a zero determinant means no inverse? What's the logic behind this?


$\newcommand{\R}{\mathbb{R}}$To pick up from the comment of @NaN, look for an element $$ E = \begin{bmatrix} e & e\\ e & e\\ \end{bmatrix} $$ such that for each $a \ne 0$ we have $$ \begin{bmatrix} a & a\\ a & a\\ \end{bmatrix} \cdot E = \begin{bmatrix} a & a\\ a & a\\ \end{bmatrix} $$ And then you'll find that the inverse is not quite the one you wrote...

Hint 3 below explains the logic behind this exercise.

Hint 1

$e = 1/2$
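
To check this value directly: $$ \begin{bmatrix} a & a\\ a & a\\ \end{bmatrix} \begin{bmatrix} 1/2 & 1/2\\ 1/2 & 1/2\\ \end{bmatrix} = \begin{bmatrix} \frac{a}{2}+\frac{a}{2} & \frac{a}{2}+\frac{a}{2}\\ \frac{a}{2}+\frac{a}{2} & \frac{a}{2}+\frac{a}{2}\\ \end{bmatrix} = \begin{bmatrix} a & a\\ a & a\\ \end{bmatrix}\,. $$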

Hint 2

The inverse of $$\begin{bmatrix}a & a\\a & a\\\end{bmatrix}$$ is $$\begin{bmatrix} a^{-1}/4 & a^{-1}/4\\ a^{-1}/4 & a^{-1}/4\\\end{bmatrix}$$.
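
Indeed, the product of the matrix with entries $a$ and the matrix with entries $b$ is the matrix with entries $2ab$; taking $b = a^{-1}/4$ gives entries $2a \cdot a^{-1}/4 = 1/2$: $$\begin{bmatrix}a & a\\a & a\\\end{bmatrix} \begin{bmatrix} a^{-1}/4 & a^{-1}/4\\ a^{-1}/4 & a^{-1}/4\\\end{bmatrix} = \begin{bmatrix} 1/2 & 1/2\\ 1/2 & 1/2\\\end{bmatrix} = E\,.$$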

Hint 3

The logic behind this is that the given matrices form a semigroup which is isomorphic to the non-zero reals under the operation $a * b = 2 a b$. So you may forget about matrices and think about the latter. This, in turn, is obtained by transport of structure from the usual multiplication, under the map $f(x) = 2x$, that is, $f : (\R^{\star}, *) \to (\R^{\star}, \cdot)$ is an isomorphism.
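
To verify that $f$ is a homomorphism, note that $$ f(a * b) = 2(2ab) = (2a)(2b) = f(a) \cdot f(b)\,, $$ and $f$ is clearly a bijection of $\R^{\star}$ onto itself. Under this correspondence the identity of $*$ is $f^{-1}(1) = 1/2$ (Hint 1), and the $*$-inverse of $a$ is $f^{-1}\!\left(f(a)^{-1}\right) = \frac{1}{2}\cdot\frac{1}{2a} = a^{-1}/4$ (Hint 2).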


The inverse given is not the inverse in the linear-algebra sense; it's the inverse in the group sense.

In a group we first have to identify the identity element in order to know what the inverse means. In this case we can see that $E = \begin{pmatrix}1/2 & 1/2 \\ 1/2 & 1/2\end{pmatrix}$ is such an element: $EE = E$, and since every matrix in the group can be written $A = \alpha E$, we get $EA = E\alpha E = \alpha EE = \alpha E = A$ and similarly $AE = A$.
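
Explicitly, the matrix with entries $a$ is $2a$ times $E$, so each element of the group corresponds to $\alpha = 2a$: $$\begin{pmatrix}a & a \\ a & a\end{pmatrix} = 2a\begin{pmatrix}1/2 & 1/2 \\ 1/2 & 1/2\end{pmatrix} = \alpha E\,.$$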

It can now be seen that $(\alpha E)^{-1}$ in this group is ${1\over\alpha}E$, since $\alpha E \cdot {1\over\alpha} E = {\alpha\over\alpha}EE = EE = E$.
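
Translating back to the original notation: $A = \begin{pmatrix}a & a \\ a & a\end{pmatrix}$ has $\alpha = 2a$, so $A^{-1} = \frac{1}{2a}E$, which is the matrix with every entry equal to $\frac{1}{4a}$ (and not the one proposed in the question).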


It's a different notion of inverse. In linear algebra, the inverse of a $2\times 2$ matrix $A$ is the matrix $A'$ such that $$A\,A' = \begin{pmatrix}1 & 0 \\ 0 & 1\end{pmatrix}\,,$$ if any such matrix exists. However, the group inverse is the matrix $A''$ such that $$A\,A'' = \begin{pmatrix}i & i \\ i & i\end{pmatrix}\,,$$ for whatever value of $i$ makes $$\begin{pmatrix}x & x\\x & x\end{pmatrix}\,\begin{pmatrix}i & i\\i & i\end{pmatrix} = \begin{pmatrix}x & x\\x & x\end{pmatrix}$$ true for all $x$ (the other answers say that we need $i=\tfrac12$).

$A'$ and $A''$ are clearly different matrices and there's no reason that the existence of $A''$ for a particular matrix $A$ should imply the existence of $A'$.
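
For the matrices in this question, the contrast is concrete: $$\det\begin{pmatrix}a & a \\ a & a\end{pmatrix} = a \cdot a - a \cdot a = 0\,,$$ so $A'$ never exists, while $A'' = \begin{pmatrix}1/(4a) & 1/(4a) \\ 1/(4a) & 1/(4a)\end{pmatrix}$ exists for every $a \neq 0$.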