"we note that the matrix Σ can be taken to be symmetric, without loss of generality"

Solution 1:

Write (2.44) as $\Delta^2 = (x-\mu)^T A\; (x-\mu)$, where $A = \Sigma^{-1}$.

Any square matrix can be decomposed into a symmetric and an anti-symmetric part: $A = \frac{1}{2} (A + A^T) + \frac{1}{2} (A - A^T)$.

Let $B = \frac{1}{2} (A + A^T)$ and $C = \frac{1}{2} (A - A^T)$; then $B$ is symmetric and $C$ is anti-symmetric, i.e. $c_{ij} = - c_{ji}$ (and in particular $c_{ii} = 0$).

So $\Delta^2 = (x-\mu)^T B\; (x-\mu) + (x-\mu)^T C\; (x-\mu)$, where the second term vanishes: $$\begin{array}{l l l}(x-\mu)^T C\; (x-\mu) & = & \displaystyle \sum_{i=1}^D \sum_{j=1}^D c_{ij} (x-\mu)_i (x-\mu)_j \\ & = & \displaystyle \sum_{i=1}^D c_{ii} (x-\mu)_i^2 + \sum_{i=1}^D\sum_{j=i+1}^D (c_{ij}+c_{ji}) (x-\mu)_i (x-\mu)_j \\ & = & 0,\end{array}$$ since $c_{ii} = 0$ and $c_{ij} + c_{ji} = 0$ by anti-symmetry.
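As a quick numerical sanity check, here is a minimal NumPy sketch of this step; the dimension, random seed, and variable names (with `v` standing in for $x-\mu$) are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4x4 example: a random matrix A, its anti-symmetric part C,
# and a random vector v playing the role of (x - mu).
A = rng.normal(size=(4, 4))
C = 0.5 * (A - A.T)        # anti-symmetric: C.T == -C, zero diagonal
v = rng.normal(size=4)

# The quadratic form of an anti-symmetric matrix is zero up to round-off.
print(v @ C @ v)           # prints a value on the order of 1e-16
```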

So $\Delta^2 = (x-\mu)^T B\; (x-\mu)$, where $B = \frac{1}{2} (A + A^T)$ is symmetric. In other words, even if $\Sigma^{-1}$ is not symmetric, there is a symmetric matrix $B$ with $(x-\mu)^T \Sigma^{-1} (x-\mu) = (x-\mu)^T B\; (x-\mu)$: only the symmetric part of $\Sigma^{-1}$ ever contributes to $\Delta^2$, so we may take $\Sigma$ to be symmetric without loss of generality.
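The same conclusion can be checked end to end with a short sketch, again with an arbitrary dimension, seed, and a random non-symmetric matrix standing in for $\Sigma^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(1)
D = 3

# Hypothetical setup: a random (non-symmetric) matrix A standing in for
# Sigma^{-1}, its symmetric part B, and a vector v for (x - mu).
A = rng.normal(size=(D, D))
B = 0.5 * (A + A.T)
v = rng.normal(size=D)

# The two quadratic forms agree: only the symmetric part of A enters Delta^2.
print(np.isclose(v @ A @ v, v @ B @ v))   # True
```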