What is the derivative of $\log \det X$ when $X$ is symmetric?
Let me call $X_0$ the symmetric matrix with entries $(X_0)_{i,j} = x_{i,j}$; by assumption $x_{i,j}=x_{j,i}$. Since $X_0$ is symmetric it can be diagonalized (assuming it is real). Its determinant is the product of its eigenvalues $\lambda_k$, so for a symmetric matrix $X$
$$ \ln\det X = \sum_k \ln(\lambda_k ) $$
Assume now that $X$ depends on a parameter $t$. Its derivative is then
$$ \frac{d}{dt} \ln\det X(t) = \sum_k \frac{\dot{\lambda}_k}{\lambda_k} $$
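As a quick sanity check, here is a minimal numerical sketch of this identity (the setup, a random symmetric positive-definite $X_0$ and a symmetric direction $V$, is just an illustrative choice of mine), comparing both sides by finite differences:

```python
import numpy as np

rng = np.random.default_rng(0)
n, h = 4, 1e-6

B = rng.standard_normal((n, n))
X0 = B @ B.T + n * np.eye(n)      # random symmetric positive-definite matrix
V = rng.standard_normal((n, n))
V = V + V.T                       # symmetric direction of variation, X(t) = X0 + t*V

logdet = lambda M: np.linalg.slogdet(M)[1]
eigs = lambda M: np.linalg.eigvalsh(M)   # eigenvalues of a symmetric matrix, sorted

# left-hand side: d/dt log det X(t) at t = 0, by centered finite differences
lhs = (logdet(X0 + h * V) - logdet(X0 - h * V)) / (2 * h)

# right-hand side: sum_k lambda_dot_k / lambda_k, with lambda_dot_k also by finite differences
lam_dot = (eigs(X0 + h * V) - eigs(X0 - h * V)) / (2 * h)
rhs = np.sum(lam_dot / eigs(X0))

print(lhs, rhs)   # the two numbers should agree to roughly 1e-6
```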
Say we want the derivative of $\ln\det X$ at $X_0$ with respect to $x_{i,j}$ for $i\neq j$. Then, defining
\begin{align} V &= |i\rangle \langle j | + |j\rangle \langle i | \\ X(t) &= X_0 +tV, \end{align}
($V$ is the matrix with all zeros except for ones at positions $(i,j)$ and $(j,i)$), we have
$$ \frac{\partial}{\partial x_{i,j}} \ln\det X_0 = \left . \frac{d}{dt} \ln\det X(t) \right \vert_{t=0}= \sum_k \frac{\dot{\lambda}_k}{\lambda_k} $$
Now
$$ \dot{\lambda}_k = \langle v_k | V| v_k \rangle $$
where $|v_k \rangle$ is the (normalized) eigenvector of $X_0$ corresponding to $\lambda_k$. This is standard first-order perturbation theory (the Hellmann–Feynman theorem): differentiating $\lambda_k = \langle v_k | X(t) | v_k \rangle$, the terms involving $|\dot{v}_k\rangle$ combine to $\lambda_k \frac{d}{dt}\langle v_k | v_k \rangle = 0$. Hence (for $i\neq j$), using the spectral decomposition $X_0^{-1} = \sum_k |v_k\rangle\langle v_k|/\lambda_k$,
\begin{align} \frac{\partial}{\partial x_{i,j}} \ln\det X_0 & = \sum_k \frac{ \langle j| v_k \rangle \langle v_k |i \rangle }{\lambda_k} + (i \leftrightarrow j) \\ &= \left ( X_0^{-1} \right)_{j,i} +\left ( X_0^{-1} \right)_{i,j} \\ &= 2\left ( X_0^{-1} \right)_{i,j} \end{align}
Let us now compute the derivative with respect to $x_{i,i}$. Reasoning exactly as before, with $V = |i\rangle \langle i |$, we get
\begin{align} \frac{\partial}{\partial x_{i,i}} \ln\det X_0 & = \sum_k \frac{ \langle i| v_k \rangle \langle v_k |i \rangle }{\lambda_k} \\ &= \left ( X_0^{-1} \right)_{i,i}. \end{align}
Hence the second formula is the correct one for a symmetric matrix, while the first formula is the correct one for a non-symmetric matrix. All formulae require, of course, the matrix to be non-singular.
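If it helps, here is a similar hedged numerical sketch of the two entrywise formulas (again with an arbitrary random positive-definite matrix; the helper `partial` is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, h = 4, 1e-6
B = rng.standard_normal((n, n))
X0 = B @ B.T + n * np.eye(n)          # random symmetric positive-definite matrix
Xinv = np.linalg.inv(X0)
logdet = lambda M: np.linalg.slogdet(M)[1]

def partial(i, j):
    """d/dx_{ij} of log det X0, perturbing (i,j) and (j,i) together (symmetric variation)."""
    V = np.zeros((n, n))
    V[i, j] = V[j, i] = 1.0           # for i == j this is just |i><i|
    return (logdet(X0 + h * V) - logdet(X0 - h * V)) / (2 * h)

print(partial(0, 1), 2 * Xinv[0, 1])  # off-diagonal entry: factor of 2
print(partial(2, 2), Xinv[2, 2])      # diagonal entry: no factor of 2
```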
Added
Let us explain the subtlety with an example that should clarify the matter. Consider the following symmetric matrix:
$$ A=\left(\begin{array}{cc} a & x\\ x & b \end{array}\right) $$
Now,
$$\log\det(A) = \log(ab-x^2)$$
and so
\begin{align} \frac{\partial \log\det(A)}{\partial a } &= \frac{b}{ab-x^2} \\ \frac{\partial \log\det(A)}{\partial x } &= - \frac{2x}{ab-x^2} \\ \frac{\partial \log\det(A)}{\partial b } &= \frac{a}{ab-x^2} \end{align}
And compare this with
$$ A^{-1} = \frac{1}{(ab-x^2)} \left(\begin{array}{cc} b & -x\\ -x & a \end{array}\right) $$
This simple calculation agrees with the formula above (cf. the factor of 2). As I said in the comment, the point is to be clear about what the independent variables are, i.e. what variation we are using. Here I considered a symmetric variation $V$, as this seems to be the problem's assumption.
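For completeness, a quick symbolic check of this $2\times 2$ example (a sketch with sympy; the symbols simply mirror the matrix above):

```python
import sympy as sp

a, b, x = sp.symbols('a b x', positive=True)
A = sp.Matrix([[a, x], [x, b]])
f = sp.log(A.det())                 # log(a*b - x**2)

print(sp.simplify(sp.diff(f, a)))   # b/(a*b - x**2)
print(sp.simplify(sp.diff(f, x)))   # -2*x/(a*b - x**2)   <- the factor of 2
print(sp.simplify(sp.diff(f, b)))   # a/(a*b - x**2)
```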
Obviously if you consider
$$ A'=\left(\begin{array}{cc} a & y\\ x & b \end{array}\right) $$
you will obtain $\nabla \log\det(A') \sim (A')^{-1}$ (with no factor of 2).
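And the analogous sketch for $A'$, treating $x$ and $y$ as independent variables: no factor of 2 appears, and the partial derivatives match the entries of $(A')^{-1}$ (up to the usual transpose):

```python
import sympy as sp

a, b, x, y = sp.symbols('a b x y', positive=True)
Ap = sp.Matrix([[a, y], [x, b]])
g = sp.log(Ap.det())                # log(a*b - x*y)

print(sp.simplify(sp.diff(g, x)))   # -y/(a*b - x*y)
print(sp.simplify(sp.diff(g, y)))   # -x/(a*b - x*y)
print(sp.simplify(Ap.inv()))        # [[b, -y], [-x, a]] / (a*b - x*y)
```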
Here is a really well-done paper that describes what is going on:
Shriram Srinivasan, Nishant Panda. (2020) "What is the gradient of a scalar function of a symmetric matrix?" https://arxiv.org/pdf/1911.06491.pdf
Their conclusion is that Boyd's formula is the correct one; it comes from restricting the Fréchet derivative (defined on $\mathbb{R}^{n \times n}$) to the subspace of symmetric $n \times n$ matrices, denoted $\mathbb{S}^{n \times n}$. Deriving the gradient in the reduced space of $n(n+1)/2$ dimensions and then mapping back to $\mathbb{S}^{n \times n}$ is subtle and cannot be done so naively, which leads to the inconsistent result by Harville.