Derivative of the determinant of a matrix

This identity can be computed directly from the definition of the determinant. It helps to regard $A$ as a matrix whose entries are differentiable functions of the real parameter $t.$ Recall the definition of the determinant:

$\det A = \displaystyle\sum_{\sigma \in S_n} \textrm{ sgn } (\sigma ) \displaystyle\prod_{i=1}^n a_{i\sigma (i) }$

and that the submatrix $A_{ij}$ is formed by deleting row $i$ and column $j$ of $A$; the cofactor matrix has $(i,j)$ entry $(-1)^{i+j}\det A_{ij}$, and $\textrm{ Adj } A$ is the transpose of the cofactor matrix. Now, applying the product rule,
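Before using the Leibniz sum, it can be reassuring to see it compute something. The sketch below codes the permutation-sum definition directly in pure Python; the helper names `perm_sign` and `det` and the example matrix are my own choices.

```python
from itertools import permutations
from math import prod

def perm_sign(sigma):
    """sgn(sigma) for a permutation given as a tuple of 0-based indices:
    (-1) raised to the number of inversions."""
    inv = sum(1 for i in range(len(sigma))
                for j in range(i + 1, len(sigma))
                if sigma[i] > sigma[j])
    return -1 if inv % 2 else 1

def det(A):
    """Leibniz formula: sum over all sigma in S_n of sgn(sigma) * prod_i a_{i, sigma(i)}."""
    n = len(A)
    return sum(perm_sign(sigma) * prod(A[i][sigma[i]] for i in range(n))
               for sigma in permutations(range(n)))

# Arbitrary example; cofactor expansion along the first row gives
# 2*(3*2 - 1*1) - 1*(1*2 - 1*0) + 0 = 8.
A = [[2, 1, 0],
     [1, 3, 1],
     [0, 1, 2]]
print(det(A))  # 8
```

This is $O(n \cdot n!)$ work, so it is only a check of the definition, not a practical algorithm.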

$\frac{d}{dt} \det (A +tB) = \frac{d}{dt}\displaystyle\sum_{\sigma \in S_n } \textrm{sgn } (\sigma) \displaystyle\prod_{i=1}^n [a + tb]_{i\sigma (i)} = \displaystyle\sum_{\sigma \in S_n} \textrm{sgn }(\sigma ) \displaystyle\sum_{j=1}^n b_{j \sigma (j) } \displaystyle\prod_{i\ne j} [a + tb]_{i \sigma (i) }= \displaystyle\sum_{j=1}^n \displaystyle\sum_{\sigma \in S_n} \textrm{sgn }(\sigma )\, b_{j\sigma (j)} \displaystyle\prod_{i\ne j} [a + tb]_{i \sigma (i) }$

Letting $t=0$ gives

$\displaystyle\sum_{j=1}^n \displaystyle\sum_{\sigma \in S_n} \textrm{sgn } (\sigma ) b_{j \sigma (j)} \displaystyle\prod_{i\ne j} a_{i \sigma (i) } = $

$\displaystyle\sum_{j=1}^n \displaystyle\sum_{k=1}^n b_{jk} \displaystyle\sum_{\sigma (j) = k } \textrm{sgn }(\sigma ) \displaystyle\prod_{i\ne j} a_{i \sigma (i) } = \displaystyle\sum_{j=1}^n \displaystyle\sum_{k=1}^n b_{jk} (-1)^{j+k} \det A_{jk} = \textrm{ tr }(\textrm{Adj }(A) B) $
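The identity $\frac{d}{dt}\det(A+tB)\big|_{t=0} = \textrm{tr}(\textrm{Adj}(A)B)$ is easy to check numerically. The sketch below (pure Python; the matrices and helper names are arbitrary choices of mine) compares $\textrm{tr}(\textrm{Adj}(A)B)$ against a central-difference approximation of the derivative.

```python
from itertools import permutations
from math import prod

def det(M):
    """Leibniz-formula determinant."""
    n = len(M)
    total = 0
    for sigma in permutations(range(n)):
        inv = sum(1 for i in range(n) for j in range(i + 1, n) if sigma[i] > sigma[j])
        total += (-1) ** inv * prod(M[i][sigma[i]] for i in range(n))
    return total

def adj(M):
    """Adjugate: transpose of the cofactor matrix."""
    n = len(M)
    minor = lambda i, j: [[M[r][c] for c in range(n) if c != j]
                          for r in range(n) if r != i]
    # adj[j][i] = (-1)^(i+j) det(minor_ij), i.e. the transpose of the cofactors
    return [[(-1) ** (i + j) * det(minor(i, j)) for i in range(n)] for j in range(n)]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]   # arbitrary example matrices
B = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]

AdjA = adj(A)
exact = sum(sum(AdjA[i][k] * B[k][i] for k in range(3)) for i in range(3))  # tr(Adj(A) B)

h = 1e-6
shift = lambda t: [[A[i][j] + t * B[i][j] for j in range(3)] for i in range(3)]
numeric = (det(shift(h)) - det(shift(-h))) / (2 * h)  # central difference at t = 0

print(exact, numeric)  # both come out to 8 (up to float error) for this example
```

Since $\det(A+tB)$ is a polynomial in $t$, the central difference is accurate to $O(h^2)$ here.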


What's sort of neat is that this identity can be used to prove the adjugate formula behind Cramer's Rule. Note that as a special case, $\frac{d}{dt} \det(I+ tB)\big|_{t=0} = \textrm{ tr } (B).$ Hence

$\frac{d}{dt} \det (A +tB)\big|_{t=0} =$ $\det (A) \frac{d}{dt} \det (I +tA^{-1}B)\big|_{t=0} =$ $\det (A) \textrm{ tr} (A^{-1} B)$
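This equality can also be checked numerically. In the sketch below, `gauss_inverse` is a hypothetical helper of mine doing ordinary Gauss-Jordan elimination, and the matrices are the same arbitrary examples as before.

```python
from itertools import permutations
from math import prod

def det(M):
    """Leibniz-formula determinant."""
    n = len(M)
    total = 0.0
    for sigma in permutations(range(n)):
        inv = sum(1 for i in range(n) for j in range(i + 1, n) if sigma[i] > sigma[j])
        total += (-1) ** inv * prod(M[i][sigma[i]] for i in range(n))
    return total

def gauss_inverse(M):
    """Invert M by Gauss-Jordan elimination on [M | I] with partial pivoting."""
    n = len(M)
    aug = [[float(x) for x in row] + [float(i == j) for j in range(n)]
           for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]   # arbitrary invertible example
B = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]

Ainv = gauss_inverse(A)
lhs = det(A) * sum(sum(Ainv[i][k] * B[k][i] for k in range(3)) for i in range(3))

h = 1e-6
shift = lambda t: [[A[i][j] + t * B[i][j] for j in range(3)] for i in range(3)]
rhs = (det(shift(h)) - det(shift(-h))) / (2 * h)

print(lhs, rhs)  # both close to 8 for this example
```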

Note that the diagonal entries of $A^{-1} B$ are given by $(A^{-1}B)_{ii} = \displaystyle\sum_{k=1}^n a^{-1}_{ik} b_{ki}$ for each $1\le i \le n.$ If we take $B$ to be the matrix with $b_{ji}=1$ and zero entries elsewhere, we have $\textrm{ tr } (A^{-1} B) = a^{-1}_{ij}.$ Applying the same choice of $B$ to $\textrm{Adj }(A) B$ gives $\textrm{ tr }(\textrm{Adj }(A)B) = (\textrm{Adj } A)_{ij},$ so comparing the two expressions for the derivative we deduce that

$\det (A) A^{-1} = \textrm{ Adj } (A),$ as the matrices are equal entrywise.
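The conclusion $\det(A)A^{-1} = \textrm{Adj}(A)$ can be confirmed exactly, with no floating point, using `fractions.Fraction`. The example matrix and helper names below are mine.

```python
from fractions import Fraction
from itertools import permutations
from math import prod

def det(M):
    """Leibniz-formula determinant over exact rationals."""
    n = len(M)
    total = Fraction(0)
    for sigma in permutations(range(n)):
        inv = sum(1 for i in range(n) for j in range(i + 1, n) if sigma[i] > sigma[j])
        total += (-1) ** inv * prod((M[i][sigma[i]] for i in range(n)), start=Fraction(1))
    return total

def adj(M):
    """Adjugate: transpose of the cofactor matrix."""
    n = len(M)
    minor = lambda i, j: [[M[r][c] for c in range(n) if c != j]
                          for r in range(n) if r != i]
    return [[(-1) ** (i + j) * det(minor(i, j)) for i in range(n)] for j in range(n)]

def inverse(M):
    """Gauss-Jordan elimination on [M | I] over exact rationals."""
    n = len(M)
    aug = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(n)]
           for i, row in enumerate(M)]
    for col in range(n):
        piv = next(r for r in range(col, n) if aug[r][col] != 0)
        aug[col], aug[piv] = aug[piv], aug[col]
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]   # arbitrary invertible example
d = det(A)
lhs = [[d * x for x in row] for row in inverse(A)]
print(lhs == adj(A))  # True: det(A) * A^{-1} equals Adj(A) entrywise
```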


In my answer here, I provide this definition of the determinant:

$$\begin{align*} \det(A) & = \sum_{\sigma\in S_n}(-1)^{\sigma}\prod_{i=1}^nA_{i,\sigma(i)} \end{align*}$$ (although I do not prove that this is equivalent to other standard definitions). This definition is useful for things like this, but you have to be comfortable with the permutation group.

Now $$\begin{align*} \det(A+tB) & = \sum_{\sigma\in S_n}(-1)^{\sigma}\prod_{i=1}^n(A_{i,\sigma(i)}+tB_{i,\sigma(i)}) \end{align*}$$

Seeing as how we are going to evaluate at $0$ after taking the derivative, the only terms that matter after expanding the product are the terms linear in $t$: $$\begin{align*} \partial_t\det(A+tB)|_{t=0} & = \partial_t\left.\left(\sum_{\sigma\in S_n}(-1)^{\sigma}\sum_{i=1}^n\left(tB_{i,\sigma(i)}\prod_{j\neq i}A_{j,\sigma(j)}\right)\right)\right|_{t=0}\\ & = \sum_{\sigma\in S_n}(-1)^{\sigma}\sum_{i=1}^n\left(B_{i,\sigma(i)}\prod_{j\neq i}A_{j,\sigma(j)}\right)\\ & = \sum_{i=1}^n\sum_{\sigma\in S_n}(-1)^{\sigma}\left(B_{i,\sigma(i)}\prod_{j\neq i}A_{j,\sigma(j)}\right)\\ \end{align*}$$

Meanwhile, the cofactor can be written as a sum over the permutations satisfying $\sigma(k)=i$, namely $\operatorname{cof}(A)_{ki}=\sum_{\sigma(k)=i}(-1)^{\sigma}\prod_{j\neq k}A_{j,\sigma(j)}$ (the sign $(-1)^{i+k}$ of the minor is absorbed into $(-1)^{\sigma}$), and $B_{ki}=B_{k,\sigma(k)}$ when $\sigma(k)=i$. Hence $$ \begin{align*} \left(\operatorname{cof}(A)^T\cdot B\right)_{ii} & = \sum_{k=1}^n\operatorname{cof}(A)^T_{ik}B_{ki}\\ &=\sum_{k=1}^n\operatorname{cof}(A)_{ki}B_{ki}\\ &=\sum_{k=1}^n\sum_{\sigma\in S_n;\;\sigma(k)=i}(-1)^{\sigma}B_{k,\sigma(k)}\prod_{j\neq k}A_{j,\sigma(j)}\\ \end{align*} $$

And so taking the trace of $\operatorname{cof}(A)^T\cdot B$ places a $\sum_{i=1}^n$ in front of this. Since each $\sigma\in S_n$ satisfies $\sigma(k)=i$ for exactly one $i$ (namely $i=\sigma(k)$), the constraint disappears and $$\operatorname{tr}\left(\operatorname{cof}(A)^T\cdot B\right)=\sum_{k=1}^n\sum_{\sigma\in S_n}(-1)^{\sigma}B_{k,\sigma(k)}\prod_{j\neq k}A_{j,\sigma(j)},$$ which is exactly the expression for $\partial_t\det(A+tB)|_{t=0}$ above (with $k$ in place of $i$), so LHS $=$ RHS.
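Both sides of this derivation can be compared exactly with integer arithmetic in a few lines of Python; the example matrices and helper names below are my own.

```python
from itertools import permutations
from math import prod

def sign(sigma):
    """(-1) raised to the number of inversions of sigma."""
    inv = sum(1 for i in range(len(sigma)) for j in range(i + 1, len(sigma))
              if sigma[i] > sigma[j])
    return (-1) ** inv

def det(M):
    n = len(M)
    return sum(sign(s) * prod(M[i][s[i]] for i in range(n))
               for s in permutations(range(n)))

def cof(M):
    """Cofactor matrix: cof(M)[i][j] = (-1)^(i+j) * det(minor_ij)."""
    n = len(M)
    minor = lambda i, j: [[M[r][c] for c in range(n) if c != j]
                          for r in range(n) if r != i]
    return [[(-1) ** (i + j) * det(minor(i, j)) for j in range(n)] for i in range(n)]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]   # arbitrary example matrices
B = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]
n = 3

# LHS: the permutation-sum expression for d/dt det(A+tB) at t=0
lhs = sum(sign(s) * B[i][s[i]] * prod(A[j][s[j]] for j in range(n) if j != i)
          for i in range(n) for s in permutations(range(n)))

# RHS: tr(cof(A)^T . B) = sum over i, k of cof(A)[k][i] * B[k][i]
C = cof(A)
rhs = sum(C[k][i] * B[k][i] for i in range(n) for k in range(n))

print(lhs, rhs)  # equal: both are 8 for this example
```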