Alternative definition of the determinant of a square matrix and its advantages?
Usually, the determinant of an $n\times n$ matrix $A=(a_{ij})$ is defined as follows:
$$\det(A):=\sum_{\sigma\in S_n}\text{sgn}(\sigma)\prod_{i=1}^na_{i,\sigma(i)}.$$
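(For concreteness, here is a direct transcription of this formula into Python, as a sketch only: it does $O(n\cdot n!)$ work, so it is sensible only for tiny matrices.)

```python
from itertools import permutations
from math import prod


def perm_sign(sigma):
    """Sign of a permutation given as a tuple of 0-based values."""
    inversions = sum(sigma[i] > sigma[j]
                     for i in range(len(sigma)) for j in range(i + 1, len(sigma)))
    return -1 if inversions % 2 else 1


def det_leibniz(a):
    # Sum over all permutations, exactly as in the displayed formula.
    n = len(a)
    return sum(perm_sign(sigma) * prod(a[i][sigma[i]] for i in range(n))
               for sigma in permutations(range(n)))


print(det_leibniz([[1, 2], [3, 4]]))   # -2
```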
In Gilbert Strang's Linear Algebra and Its Applications, the author points out that it can be understood as the volume of a box in $n$-dimensional space. However, I am wondering whether it can be defined in that way.
Here are my questions:
- [EDIT: Can the determinant be defined as the volume of a box in $n$-dimensional space?]
- Are there other definitions of the determinant for a square matrix?
- What are the "advantages" of the different definitions? [EDIT: For instance, as Qiaochu said in the comment, how easy is it to prove that $\det(AB)=\det(A)\det(B)$ with a given definition?]
Most properties of determinants follow more or less immediately when you use the following definition.
If $f:V\to V$ is an endomorphism of a vector space of finite dimension $n$ and $\Lambda^nV$ is the $n$th exterior power of $V$, then there is an induced morphism $\Lambda^n(f):\Lambda^nV\to\Lambda^nV$. Now $\Lambda^nV$ is a one-dimensional vector space, so there is a canonical isomorphism of $k$-algebras $\operatorname{End}(\Lambda^nV)\cong k$. The image of $\Lambda^n(f)\in\operatorname{End}(\Lambda^nV)$ under this isomorphism is the determinant of $f$.
If one wants to define the determinant of a matrix $A\in M_n(k)$, then one considers the corresponding map $k^n\to k^n$ determined by $A$, and proceeds as above.
Of course, in this approach one has to prove the properties of exterior powers and of the maps induced on them, but this is neither conceptually nor practically complicated. For instance, multiplicativity is immediate: functoriality gives $\Lambda^n(f\circ g)=\Lambda^n(f)\circ\Lambda^n(g)$, and composing endomorphisms of the one-dimensional space $\Lambda^nV$ corresponds to multiplying in $k$, so $\det(f\circ g)=\det(f)\det(g)$.
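For concreteness, here is a minimal Python sketch of this construction (assuming matrices given as lists of lists of numbers): an element of $\Lambda^k(k^n)$ is stored as a dictionary from strictly increasing index tuples to coefficients, the columns $f(e_1),\ldots,f(e_n)$ are wedged together, and the determinant is read off as the coefficient of $e_1\wedge\cdots\wedge e_n$.

```python
def wedge_vector(omega, v):
    """Wedge an element of Λ^k(k^n) (a dict: index tuple -> coeff) with a vector."""
    result = {}
    for idx, c in omega.items():
        for j, vj in enumerate(v):
            if vj == 0 or j in idx:
                continue                          # e_j ∧ e_j = 0
            swaps = sum(1 for i in idx if i > j)  # move e_j into sorted position
            key = tuple(sorted(idx + (j,)))
            result[key] = result.get(key, 0) + (-1) ** swaps * c * vj
    return result


def det_via_exterior_power(a):
    """Coefficient of e_1 ∧ ... ∧ e_n in f(e_1) ∧ ... ∧ f(e_n)."""
    n = len(a)
    omega = {(): 1}                               # the scalar 1 in Λ^0
    for j in range(n):
        omega = wedge_vector(omega, [a[i][j] for i in range(n)])  # column f(e_j)
    return omega.get(tuple(range(n)), 0)


print(det_via_exterior_power([[1, 2], [3, 4]]))   # -2
```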
You can define the determinant as the signed sum of the weights of families of non-intersecting paths in a graph between "source" vertices $a_1,\ldots,a_n$ and "sink" vertices $b_1,\ldots,b_n$. See Qiaochu's blog post. The $(i,j)$ entry of the matrix is the sum of the weights of all paths from $a_i$ to $b_j$.
This definition is nice because it's entirely combinatorial, and it makes clear why determinants should be relevant to discrete mathematics and to physics. And it genuinely provides new insights relative to the "algebraic" definitions, I think. It also provides a nice avenue to prove (and to properly understand) various standard results such as the Cauchy–Binet formula, Dodgson's condensation formula, the Plücker relations, and Laplace's expansion. See this paper by Fulmek for details.
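As a sanity check of this path definition, here is a brute-force Python sketch on a tiny weighted acyclic graph (the vertex names, edges, and weights below are invented purely for illustration): the signed sum over vertex-disjoint path families agrees with the determinant of the path-weight matrix.

```python
from itertools import permutations, product
from math import prod

# A small made-up DAG: edges[u] is a list of (target, weight) pairs.
edges = {
    "a1": [("m1", 1), ("m2", 2)],
    "a2": [("m1", 3), ("m2", 1)],
    "m1": [("b1", 1), ("b2", 1)],
    "m2": [("b1", 2), ("b2", 5)],
}
sources, sinks = ["a1", "a2"], ["b1", "b2"]


def paths(u, t):
    """All paths from u to t, as (vertex tuple, weight) pairs."""
    if u == t:
        return [((u,), 1)]
    return [((u,) + verts, w * wt)
            for v, w in edges.get(u, [])
            for verts, wt in paths(v, t)]


def perm_sign(sigma):
    inv = sum(sigma[i] > sigma[j]
              for i in range(len(sigma)) for j in range(i + 1, len(sigma)))
    return -1 if inv % 2 else 1

# Matrix whose (i, j) entry is the total weight of paths a_i -> b_j.
m = [[sum(wt for _, wt in paths(a, b)) for b in sinks] for a in sources]

# Signed sum over families of vertex-disjoint paths.
lgv = 0
for sigma in permutations(range(len(sources))):
    for fam in product(*(paths(a, sinks[sigma[i]]) for i, a in enumerate(sources))):
        verts = [set(p) for p, _ in fam]
        if all(verts[i].isdisjoint(verts[j])
               for i in range(len(fam)) for j in range(i + 1, len(fam))):
            lgv += perm_sign(sigma) * prod(wt for _, wt in fam)

# For this 2x2 case, compare with the cofactor formula for det(m).
assert m[0][0] * m[1][1] - m[0][1] * m[1][0] == lgv
```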
I would classify your first one as merely a method of calculating the determinant, and the second (Strang's) as merely a property, albeit a useful one, of determinants. But I think there is one sort of 'lower-level definition' that is better than the others: the determinant is the only function on square matrices satisfying certain properties (a note on this).
In particular, if a function is linear in each row of a matrix, is 0 if two rows are the same, and is 1 if the matrix is the identity matrix, then it is the determinant function.
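To see these three properties at work, here is a Python sketch (using exact fractions; an illustration, not a proof) that evaluates the determinant by row reduction, justifying each step only by linearity in each row, the vanishing on repeated rows (which makes row additions harmless and makes row swaps flip the sign), and the normalization on the identity.

```python
from fractions import Fraction


def det_by_properties(rows):
    """Evaluate det using only the three defining properties."""
    a = [[Fraction(x) for x in row] for row in rows]
    n = len(a)
    value = Fraction(1)
    for col in range(n):
        pivot = next((r for r in range(col, n) if a[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)   # dependent rows force the value to vanish
        if pivot != col:
            a[col], a[pivot] = a[pivot], a[col]
            value = -value       # swapping two rows flips the sign
        p = a[col][col]
        value *= p               # scaling a row scales the value (row linearity)
        a[col] = [x / p for x in a[col]]
        for r in range(col + 1, n):
            f = a[r][col]
            # adding a multiple of one row to another leaves the value unchanged
            a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    # what remains reduces to the identity matrix, whose value is 1
    return value


print(det_by_properties([[2, 1], [4, 5]]))   # 6
```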
NOTE: Mariano has just posted an even better answer, but his is more basic (not in the elementary sense, but rather in the it-takes-much-more-math sense), so I keep my answer as it is.
Let me emphasize that in the following discussion $V$ denotes a finite-dimensional complex vector space and $T$ denotes an operator on $V$.
Definition: Let $T$ be an operator on $V$ and choose a basis of $V$ such that the matrix of $T$ (with respect to this basis) is upper triangular (such a basis always exists, since $V$ is a finite-dimensional complex vector space). If $\lambda$ is a complex number, then the multiplicity of $\lambda$ as an eigenvalue of $T$ is defined to be the number of times $\lambda$ occurs on the diagonal of the matrix of $T$.
Exercise 1: In this exercise, we will prove that the multiplicity of a complex number $\lambda$ as an eigenvalue of $T$ is well-defined. Firstly, prove the following result:
If $\lambda$ is a complex number and if ${\cal B}$ is a basis of $V$ such that the matrix of $T$ (with respect to ${\cal B}$) is upper triangular, then the number of times $\lambda$ occurs on the diagonal of the matrix of $T$ equals $\dim\operatorname{null}\left((T-\lambda I)^{\dim{V}}\right)$.
Deduce that the number of times $\lambda$ occurs on the diagonal of an upper triangular matrix of $T$ does not depend on the basis with respect to which $T$ has an upper triangular matrix. Therefore, the multiplicity of a complex number $\lambda$ as an eigenvalue of $T$ is well-defined.
Exercise 2: Prove that the sum of the multiplicities of the eigenvalues of $T$ equals the dimension of the vector space $V$ on which $T$ operates.
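A numerical illustration of Exercises 1 and 2, assuming NumPy (the example matrix is made up): the multiplicity of $\lambda$ is computed as $\dim V-\operatorname{rank}\left((T-\lambda I)^{\dim V}\right)$, and the multiplicities sum to $\dim V$.

```python
import numpy as np

t = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])   # upper triangular: diagonal shows 2, 2, 3
n = t.shape[0]

for lam in (2.0, 3.0):
    power = np.linalg.matrix_power(t - lam * np.eye(n), n)
    nullity = n - np.linalg.matrix_rank(power)
    print(lam, nullity)           # multiplicities 2 and 1, summing to dim V = 3
```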
Definition: Let $T$ be an operator on a complex vector space $V$. The determinant of $T$ is defined to be the product of the eigenvalues of $T$ (counting multiplicity).
Exercise 3: Prove that an operator $T$ is invertible if and only if the determinant of $T$ (as defined above) is non-zero.
We will now prove that the determinant of $T$ as an operator equals the determinant of a matrix of $T$ (with respect to any basis of $V$).
Exercise 4: Prove that if $T$ has an upper triangular matrix with respect to a basis of $V$, then the determinant of $T$ equals the product of the diagonal entries of this matrix. Deduce that the determinant of $T$ equals the determinant of this matrix.
Exercise 5: Prove that if $A$ is the matrix of $T$ with respect to a particular basis of $V$ and if $B$ is the matrix of $T$ with respect to another basis of $V$, then there is an invertible matrix $C$ such that $C^{-1}AC=B$.
Exercise 6: Prove that if $A$ and $B$ are $n\times n$ square matrices, then $\det{AB}=\det{A}\det{B}=\det{BA}$.
Exercise 7: Prove that if $A$ is the matrix of $T$ with respect to a particular basis of $V$ and if $B$ is the matrix of $T$ with respect to another basis of $V$, then $\det{A}=\det{B}$ as matrices.
Exercise 8: Finally, prove that if $A$ is a matrix of $T$ with respect to any basis of $V$, then the determinant of $T$ as an operator equals the determinant of the matrix $A$, i.e., the product of the eigenvalues of $T$ (counting multiplicity) equals the determinant of the matrix $A$.
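A quick numerical check of Exercise 8, assuming NumPy: for a random complex matrix, the product of the eigenvalues (with multiplicity) agrees with the usual matrix determinant up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
print(np.prod(np.linalg.eigvals(a)))   # product of the eigenvalues
print(np.linalg.det(a))                # agrees up to floating-point error
```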
We will now prove the famous Cayley-Hamilton theorem:
Definition: If $T$ is an operator on $V$, if $n=\dim{V}$, and if $\lambda_1,\dots,\lambda_n$ are the eigenvalues of $T$ (counting multiplicity), we define the characteristic polynomial of $T$ by the rule $p(z)=(z-\lambda_1)\cdots (z-\lambda_n)$ for every complex number $z$.
Exercise 9: Prove that $p(T)=0$ where $p$ is the characteristic polynomial of $T$. (Hint: choose a basis of $V$ with respect to which $T$ has an upper triangular matrix.)
Exercise 10: If $p$ is the characteristic polynomial of $T$, prove that $p(z)=\det{(zI-T)}$ for all complex numbers $z$. (Hint: prove that the eigenvalues of $zI-T$ are precisely the numbers of the form $z-\lambda$ where $\lambda$ is an eigenvalue of $T$. Furthermore, prove that the multiplicity of $z-\lambda$ as an eigenvalue of $zI-T$ equals the multiplicity of $\lambda$ as an eigenvalue of $T$. Use Exercise 3 and the definition of the characteristic polynomial given above.)
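A numerical illustration of Exercise 9, again assuming NumPy: the product $(T-\lambda_1 I)\cdots(T-\lambda_n I)$ comes out as the zero matrix up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(1)
t = rng.standard_normal((3, 3))
eye = np.eye(3)

p_of_t = eye
for lam in np.linalg.eigvals(t):   # the factors commute, so order is irrelevant
    p_of_t = p_of_t @ (t - lam * eye)

print(np.max(np.abs(p_of_t)))      # ~0, i.e., p(T) = 0
```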
I hope this helps! (Please see Linear Algebra Done Right by Sheldon Axler for a more elaborate discussion of the determinant along the same lines as my answer.)
Given a vector space $V$ of dimension $n$, one can prove that there is a unique (up to a normalization factor) non-trivial alternating $n$-form $$ F:V\times\cdots\times V\longrightarrow{\Bbb R}\qquad\hbox{($n$ factors).} $$ (I'm pretending that $V$ is a real vector space, but in fact the result is more general.) Alternating means that for a permutation $\pi\in{\cal S}_n$ one must have $$ F(v_{\pi(1)},\ldots,v_{\pi(n)})={\rm sign}(\pi)F(v_1,\ldots,v_n) $$ for all $(v_1,\ldots,v_n)\in V^n$. If you fix a basis $\{e_1,\ldots,e_n\}$ of $V$, you can choose the normalization factor by declaring that $F(e_1,\ldots,e_n)=1$. Then $F$ is exactly the determinant.
Further, if you endow $V$ with a Euclidean norm such that the fixed basis is orthonormal, the value of $F(v_1,\ldots,v_n)$ is precisely the signed volume of the parallelotope defined by the ordered $n$-tuple $(v_1,\ldots,v_n)$.
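A short numerical illustration of this, assuming NumPy: the signed volume of the parallelotope spanned by the given vectors is the determinant of the matrix having them as columns, and swapping two of the vectors reverses the orientation and hence the sign.

```python
import numpy as np

v1, v2, v3 = np.array([1.0, 0, 0]), np.array([1.0, 2, 0]), np.array([0.0, 1, 3])
print(np.linalg.det(np.column_stack([v1, v2, v3])))   # +6.0: signed volume
print(np.linalg.det(np.column_stack([v2, v1, v3])))   # -6.0: orientation reversed
```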