Covariant derivative versus exterior derivative
$\def\alt{\textrm{Alt}} \def\d{\mathrm{d}} \def\sgn{\mathrm{sgn}\,}$Let $\nabla$ be a symmetric linear connection on $M$, and let $\omega$ be a $k$-form on $M$. I'm trying to find a relation between $\d\omega$ and $\nabla \omega$. I follow Spivak's notation in Calculus on Manifolds and write $$\alt(\nabla \omega)(X_1,\cdots,X_{k+1}) = \frac{1}{(k+1)!}\sum_{\sigma \in S_{k+1}} (\sgn \sigma)\nabla\omega(X_{\sigma(1)},\cdots,X_{\sigma(k+1)}).$$I also assume the formula $$\d\omega(X_1,\cdots,X_{k+1})=\sum_{i=1}^{k+1}(-1)^{i+1}X_i(\omega(X_1,\cdots,\widehat{X_i},\cdots,X_{k+1})) + \sum_{i<j}(-1)^{i+j}\omega([X_i,X_j],X_1,\cdots,\widehat{X_i},\cdots,\widehat{X_j},\cdots,X_{k+1}).$$I'd expect something like $\alt(\nabla\omega) = \d \omega$, perhaps up to a multiplicative constant. I'm stuck. For $k=1$ I got $$\alt(\nabla \omega) = -\frac{1}{2}\d \omega.$$The $1/2$ I can accept, but the minus sign puts me off. I tried brute-forcing my way through $k=2$, but I got lost in the computations.
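For concreteness, here is the $k=1$ case spelled out (using the convention $\nabla\omega(X,Y) = \nabla_Y\omega(X)$, which is the one I use below): $$\begin{align} 2\,\alt(\nabla\omega)(X,Y) &= \nabla\omega(X,Y) - \nabla\omega(Y,X) \\ &= Y(\omega(X)) - \omega(\nabla_Y X) - X(\omega(Y)) + \omega(\nabla_X Y) \\ &= Y(\omega(X)) - X(\omega(Y)) + \omega([X,Y]) \\ &= -\d\omega(X,Y), \end{align}$$ where the symmetry of $\nabla$ gives $\nabla_X Y - \nabla_Y X = [X,Y]$ in the third line.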
The best I can come up with for general $k$ is
$$\begin{align} (k+1)!\alt(&\nabla\omega)(X_1,\cdots,X_{k+1}) = \sum_{\sigma \in S_{k+1}} (\sgn\sigma) \nabla\omega(X_{\sigma(1)},\cdots,X_{\sigma(k+1)}) \\ &= \sum_{\sigma \in S_{k+1}} (\sgn\sigma) \nabla_{X_{\sigma(k+1)}}\omega(X_{\sigma(1)},\cdots,X_{\sigma(k)}) \\ &= \sum_{\sigma \in S_{k+1}}(\sgn\sigma)\left(X_{\sigma(k+1)}(\omega(X_{\sigma(1)},\cdots,X_{\sigma(k)})) - \sum_{i=1}^k\omega(X_{\sigma(1)},\cdots, \nabla_{X_{\sigma(k+1)}}X_{\sigma(i)},\cdots,X_{\sigma(k)})\right)\\ &= \sum_{\sigma \in S_{k+1}}(\sgn\sigma)X_{\sigma(k+1)}(\omega(X_{\sigma(1)},\cdots,X_{\sigma(k)})) - \sum_{\sigma \in S_{k+1}}\sum_{i=1}^k(\sgn \sigma)\omega(X_{\sigma(1)},\cdots, \nabla_{X_{\sigma(k+1)}}X_{\sigma(i)},\cdots,X_{\sigma(k)}) \\ &\stackrel{\color{red}{(\ast)}}{=} \sum_{\sigma \in S_{k+1}}X_{\sigma(k+1)}(\omega(X_1,\cdots,X_k)) - \sum_{\sigma \in S_{k+1}}\sum_{i=1}^k(\sgn \sigma)\omega(X_{\sigma(1)},\cdots, \nabla_{X_{\sigma(k+1)}}X_{\sigma(i)},\cdots,X_{\sigma(k)})\end{align}$$
and I'm stuck. I'm not sure about the $\color{red}{(\ast)}$ step either. I know that we must use the symmetry of $\nabla$ to get rid of all these covariant derivatives, but I don't see how. What is the smart way to do this?
This answer is the closest thing I could find to what I'm trying to do, but I don't find the relation as easy to see as is claimed there. Also, I'd like to avoid coordinate computations if possible.
I'll ignore the $\color{red}{(\ast)}$ step since I think it is wrong. I don't know how to write it neatly, but doing it for $k=1$ and $k=2$ suggests $$\sum_{\sigma \in S_{k+1}}(\sgn \sigma)X_{\sigma(k+1)}(\omega(X_{\sigma(1)},\cdots,X_{\sigma(k)})) = (-1)^kk!\sum_{i=1}^{k+1}(-1)^{i+1}X_i(\omega(X_1,\cdots,\widehat{X_i},\cdots,X_{k+1})).$$Also, using the symmetry of $\nabla$, it seems that $$\sum_{\sigma \in S_{k+1}}\sum_{i=1}^k(\sgn \sigma)\omega(X_{\sigma(1)},\cdots, \nabla_{X_{\sigma(k+1)}}X_{\sigma(i)},\cdots,X_{\sigma(k)})=(-1)^{k+1}k!\sum_{i<j}(-1)^{i+j}\omega([X_i,X_j],X_1,\cdots,\widehat{X_i},\cdots,\widehat{X_j},\cdots,X_{k+1})$$
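Here is a sketch of why the first identity should hold in general (assuming I'm counting correctly): group the permutations according to the value $i = \sigma(k+1)$. Since $\omega$ is alternating, each of the $k!$ permutations in a group contributes the same signed term, so $$\sum_{\substack{\sigma \in S_{k+1} \\ \sigma(k+1)=i}}(\sgn \sigma)X_{\sigma(k+1)}(\omega(X_{\sigma(1)},\cdots,X_{\sigma(k)})) = k!\,(-1)^{k+1-i}X_i(\omega(X_1,\cdots,\widehat{X_i},\cdots,X_{k+1})),$$ where $(-1)^{k+1-i}$ is the sign of the permutation that moves $X_i$ past $X_{i+1},\cdots,X_{k+1}$ into the last slot. Writing $(-1)^{k+1-i} = (-1)^k(-1)^{i+1}$ and summing over $i$ gives the claimed $(-1)^kk!$ factor.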
So: $$(k+1)!\,\alt(\nabla \omega) = (-1)^kk!\,\d\omega \implies (-1)^k(k+1)\alt(\nabla \omega) = \d\omega.$$
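As a sanity check, for $k=1$ this gives $$\d\omega = (-1)^1(1+1)\alt(\nabla\omega) = -2\,\alt(\nabla\omega),$$ which agrees with the $\alt(\nabla\omega) = -\frac{1}{2}\d\omega$ I found above.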
Three comments:
- There are two different conventions in common use for the wedge product. The one Spivak uses and I use (which I call the determinant convention) is $$ \alpha \wedge \beta = \frac{(k+l)!}{k!l!} \operatorname{Alt}(\alpha\otimes\beta), $$ when $\alpha$ is a $k$-form and $\beta$ is an $l$-form. The one Kobayashi & Nomizu use (the Alt convention) is $$ \alpha \wedge \beta = \operatorname{Alt}(\alpha\otimes\beta). $$ The formula you wrote down for $d\omega$ is correct if you're using the determinant convention. But the formula $d\omega = \operatorname{Alt}(\nabla\omega)$ would only be correct using the Alt convention. With the determinant convention, the formula should be $d\omega = \pm (k+1)\operatorname{Alt}(\nabla\omega)$ when $\omega$ is a $k$-form.
- The plus or minus sign depends on how you define the $(k+1)$-tensor $\nabla \omega$. Some authors define it to be $$\nabla\omega(\dots,Y) = \nabla_Y\omega(\dots),$$ while others define it to be $$\nabla\omega(Y,\dots) = \nabla_Y\omega(\dots).$$ Your computation shows that you're using the first convention, in which case the correct formula is $d\omega = (-1)^k (k+1)\operatorname{Alt} (\nabla\omega)$.
- To derive your formula $\color{red}{(\ast)}$ (or rather the version with the correct constant multiple) by brute force, you can note that each of the terms of the form $\nabla_{X_i} X_j$ can be matched up with a term $\nabla_{X_j} X_i$ with the opposite sign, and because the connection is symmetric, these combine to give $[X_i,X_j]$, which then matches one of the terms in the formula for $d\omega$. But a much easier approach is to note that both $d\omega$ and $\operatorname{Alt}(\nabla\omega)$ are well-defined tensor fields, and thus their values at a point $p$ can be compared in terms of any convenient local frame. If you let the $X_i$'s be coordinate vector fields in Riemannian normal coordinates centered at $p$, then many terms go away, as sketched below.
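To spell out that last point (a sketch, assuming coordinates centered at $p$ in which the Christoffel symbols vanish at $p$, so that $\nabla_{\partial_i}\partial_j|_p = 0$, and recalling that coordinate vector fields have vanishing brackets): at the point $p$, $$\nabla\omega(\partial_{j_1},\cdots,\partial_{j_k},\partial_{j_{k+1}})\big|_p = \partial_{j_{k+1}}\big(\omega(\partial_{j_1},\cdots,\partial_{j_k})\big)\big|_p \quad\text{and}\quad d\omega(\partial_{j_1},\cdots,\partial_{j_{k+1}})\big|_p = \sum_{i=1}^{k+1}(-1)^{i+1}\partial_{j_i}\big(\omega(\partial_{j_1},\cdots,\widehat{\partial_{j_i}},\cdots,\partial_{j_{k+1}})\big)\big|_p,$$ so comparing $d\omega$ with $\operatorname{Alt}(\nabla\omega)$ at $p$ reduces to the same counting of permutations as in the question and produces exactly the factor $(-1)^k(k+1)$.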
This is only a partial answer.
This result is proved in Kobayashi-Nomizu, Foundations of differential geometry, vol. 1: it's Corollary 8.6, p. 149 ($\operatorname{Alt}(\nabla \omega) = d\omega$ holds for any torsion-free connection $\nabla$). But it does not look like their proof is what you are looking for.
I understand the calculation you are trying to do, and I'm not sure how to fix it, but here is an observation that may help: if $\omega$ is a $k$-form, i.e. an element of $\Gamma(\Lambda^k T^*M)$, then $\nabla \omega$ is an element of $\Gamma(T^*M \otimes \Lambda^k T^*M)$, so I think that $\operatorname{Alt}(\nabla \omega)$ is simply given by $\operatorname{Alt}(\nabla \omega) = \sigma(\nabla \omega)$, where $\sigma : \Gamma(T^*M \otimes \Lambda^k T^*M) \to \Gamma(\Lambda^{k+1} T^*M)$ is the linear map determined by $\sigma(\alpha \otimes \eta) = \alpha \wedge \eta$. In this paper they prove the result, but they use local coordinates. It's probably possible to give the coordinate-free proof you want, but I haven't tried long enough.
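If I am tracking the conventions correctly, this is consistent with the factor found above. With the determinant convention, for a $1$-form $\alpha$ and a $k$-form $\eta$, $$\alpha \wedge \eta = \frac{(k+1)!}{1!\,k!}\operatorname{Alt}(\alpha \otimes \eta) = (k+1)\operatorname{Alt}(\alpha \otimes \eta),$$ so $\sigma(\nabla \omega)$ differs from $\operatorname{Alt}(\nabla \omega)$ by the factor $k+1$, together with a sign $(-1)^k$ if the direction of differentiation occupies the last slot rather than the first (moving that slot past the other $k$ arguments costs $(-1)^k$ under $\operatorname{Alt}$). This matches the formula $d\omega = (-1)^k(k+1)\operatorname{Alt}(\nabla\omega)$ from the other answer.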