What does it mean if $\det(A)$ equals $1$?

What does it mean if $\det(A)$ equals $1$? Does it mean that the identity matrix can be obtained from $A$ by only adding multiples of rows onto others?


Matrices with determinant $1$ preserve volume. If all points inside a shape are transformed by the matrix to form a new shape, the proportional change in area (or volume) is the determinant of the matrix. For example if the determinant of a matrix $A$ is $5$, then a unit cube will transform into a shape with volume $5 \times 1=5$. The identity matrix would transform a unit cube into a shape with volume $1 \times 1 = 1$.
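For a quick numerical check, here is a small numpy sketch (the particular determinant-$5$ matrix is an arbitrary illustrative choice, not one taken from the question):

```python
import numpy as np

# An illustrative 3x3 matrix with determinant 5 (any matrix with det 5 would do).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 5.0]])

# The unit cube is spanned by e1, e2, e3; its image under A is the
# parallelepiped spanned by the columns of A.  That parallelepiped's
# volume is the scalar triple product |a1 . (a2 x a3)|.
a1, a2, a3 = A[:, 0], A[:, 1], A[:, 2]
volume_of_image = abs(np.dot(a1, np.cross(a2, a3)))

print(volume_of_image)    # 5.0
print(np.linalg.det(A))   # 5.0 (up to floating point) -- the determinant is that scaling factor
```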

Added

After reading the appreciated comment from @danielV, I added this image from Determinant and linear transformation to give a more visual explanation. [Transform of unit square]

It shows the image of a unit square under a matrix with determinant $4$. Notice that the area of the transformed region is $1 \times 4 = 4$.
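Since the picture is not reproduced here, a representative computation (with an arbitrarily chosen determinant-$4$ matrix, not necessarily the one in the image) is:

$$M=\begin{pmatrix}2 & 1 \\ 0 & 2\end{pmatrix},\qquad \det M = 2\cdot 2 - 1\cdot 0 = 4.$$

The unit square spanned by $e_1, e_2$ is mapped to the parallelogram spanned by the columns $(2,0)$ and $(1,2)$, whose area is exactly $|\det M| = 4 = 1 \times 4$.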


Usually when we talk about the determinant without any other information attached, the only really relevant piece of information is whether or not it is zero. However, if we are interested in geometry, there is some significance to matrices with determinant $1$. Namely, an important subset of them forms the so-called special orthogonal group, which is just a fancy way of saying they are generalizations of rotations.

[note: As mentioned in the comments, I was a bit too eager. There are (many) matrices of determinant $1$ which are not special orthogonal. I do not know of any significant property that all matrices of determinant $1$ possess, except that they are volume-preserving, as noted by Semsem.]

For example, matrices of the form $$\begin{pmatrix}\cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$$ have determinant $\cos^2\theta + \sin^2\theta = 1$, and you can prove these correspond precisely to a counterclockwise rotation by $\theta$.
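As a quick numerical illustration (just a sketch; $\theta = \pi/3$ is an arbitrary sample angle):

```python
import numpy as np

theta = np.pi / 3  # arbitrary sample angle

# Counterclockwise rotation by theta.
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.linalg.det(R))          # 1.0 (up to floating point)
print(R @ np.array([1.0, 0.0]))  # [cos(theta), sin(theta)] -- e1 rotated counterclockwise by theta
```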

What properties of rotations do they possess? To answer this question, you need some familiarity with a matrix as more than just a tool for organizing a system of equations -- which is how I was first taught them, unfortunately. You can think of matrices as functions that take in vectors and spit out other vectors, and these correspond to geometric transformations of the vector space (the plane, three-dimensional space, or beyond). I'm not sure what level you're coming at this from; I just mean to say that there are several ways of looking at matrices, and one of them is a bit more abstract (so not taught in some classes) but is good for understanding why determinant-one matrices are like rotations.

First, they are orientation-preserving, which just means that a right-handed system never turns into a left-handed system -- equivalently, no picture is mapped under the transformation to its mirror image (unless it was already symmetric). They are also distance-preserving: points that were a distance $d$ apart before being acted upon remain a distance $d$ apart afterwards.
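To make the distance-preserving claim concrete for the rotation matrix above: if $v = (x, y)$, then

$$\|Rv\|^2 = (x\cos\theta - y\sin\theta)^2 + (x\sin\theta + y\cos\theta)^2 = (x^2 + y^2)(\cos^2\theta + \sin^2\theta) = \|v\|^2,$$

so lengths (and hence distances between points) are unchanged.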


For what it's worth, Sabyasachi's construction works even if only row manipulations are allowed. However, see the comments for some subtleties about row reduction; minor differences in definitions can change whether this is possible. Here, for example, is a reduction of $A = -I_2$ to the identity using only operations that add a multiple of one row to another:

$$A = \begin{pmatrix}-1 & 0 \\ 0 & -1\end{pmatrix}$$

$$R_2 \to R_2 - R_1$$

$$\begin{pmatrix}-1 & 0 \\ 1 & -1\end{pmatrix}$$

$$R_1 \to R_1 + R_2$$

$$\begin{pmatrix}0 & -1 \\ 1 & -1\end{pmatrix}$$

$$R_2 \to R_2 - R_1$$

$$\begin{pmatrix}0 & -1 \\ 1 & 0\end{pmatrix}$$

$$R_1 \to R_1 + R_2$$

$$\begin{pmatrix}1 & -1 \\ 1 & 0\end{pmatrix}$$

$$R_2 \to R_2 - R_1$$

$$\begin{pmatrix}1 & -1 \\ 0 & 1\end{pmatrix}$$

$$R_1 \to R_1 + R_2$$

$$\begin{pmatrix}1 & 0 \\ 0 & 1\end{pmatrix}= I_2$$
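As a sanity check (an added illustration, not part of the original answer), the whole chain can be verified with numpy by applying each "add a multiple of one row to another" operation in turn:

```python
import numpy as np

A = np.array([[-1.0,  0.0],
              [ 0.0, -1.0]])

def add_row_multiple(M, target, source, factor):
    """R_target -> R_target + factor * R_source; this operation never changes the determinant."""
    M = M.copy()
    M[target] += factor * M[source]
    return M

# The sequence of operations above, with rows indexed from 0.
steps = [(1, 0, -1.0),   # R2 -> R2 - R1
         (0, 1,  1.0),   # R1 -> R1 + R2
         (1, 0, -1.0),   # R2 -> R2 - R1
         (0, 1,  1.0),   # R1 -> R1 + R2
         (1, 0, -1.0),   # R2 -> R2 - R1
         (0, 1,  1.0)]   # R1 -> R1 + R2

M = A
for target, source, factor in steps:
    M = add_row_multiple(M, target, source, factor)

print(M)                 # [[1. 0.] [0. 1.]] -- the identity
print(np.linalg.det(A))  # 1.0 -- consistent: the operations used preserve the determinant
```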