Curl of Cross Product of Two Vectors

I want to prove the following identity

$$\text{curl } \left(\textbf{F}\times \textbf{G}\right) = \textbf{F}\text{ div}\textbf{ G}- \textbf{G}\text{ div}\textbf{ F}+ \left(\textbf{G}\cdot \nabla \right)\textbf{F}- \left(\textbf{F}\cdot \nabla \right)\textbf{G}$$

But I do not know how! Also, what does $\textbf{F}\cdot \nabla$ mean? Isn't it the divergence of $\textbf{F}$?


Solution 1:

You only need two things to prove this. First, the BAC-CAB rule:

$$A \times (B \times C) = B(A \cdot C) - C(A \cdot B)$$

Second, the product rule. Let $\dot \nabla \times (\dot F \times G)$ mean "differentiate $F$ only; pretend $G$ is constant here". The product rule then reads

$$\nabla \times (F \times G) = \dot \nabla \times (\dot F \times G) + \dot \nabla \times (F \times \dot G)$$

Now, apply the BAC-CAB rule. I'll do this for just one term for brevity:

$$\dot \nabla \times (\dot F \times G) = \dot F (\dot \nabla \cdot G) - G(\dot \nabla \cdot \dot F)$$

Now, here's where the dots become important: since $G$ is not differentiated in this whole equation, $\dot \nabla \cdot G$ is a directional derivative, conventionally written $G \cdot \nabla$. Indeed, we have

$$\dot F(\dot \nabla \cdot G) = (G \cdot \nabla) F$$

On the other hand, in the $G(\dot \nabla \cdot \dot F)$ term we can simply drop the dots to get an ordinary divergence:

$$G (\dot \nabla \cdot \dot F) = G(\nabla \cdot F)$$

Carry out the same expansion for the $\dot \nabla \times (F \times \dot G)$ term, and you're done.
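Spelled out under the same dot convention, that remaining term comes out to

$$\dot \nabla \times (F \times \dot G) = F(\dot \nabla \cdot \dot G) - \dot G(\dot \nabla \cdot F) = F(\nabla \cdot G) - (F \cdot \nabla) G,$$

and adding the two expansions recovers the stated identity.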

Solution 2:

Here is a short proof using index notation and the BAC-CAB identity.

$$\begin{align}
\nabla \times \left( \mathbf{A} \times \mathbf{B} \right)
&= \mathbf{e}_i \times \partial_i\left( A_j \mathbf{e}_j \times B_k \mathbf{e}_k \right)\\
&= \partial_i\left( A_j B_k \right)\, \mathbf{e}_i \times \left( \mathbf{e}_j \times \mathbf{e}_k \right)\\
&= \left( \partial_i A_j\, B_k + A_j\, \partial_i B_k \right)\left( \left( \mathbf{e}_i \cdot \mathbf{e}_k \right)\mathbf{e}_j - \left( \mathbf{e}_i \cdot \mathbf{e}_j \right)\mathbf{e}_k \right)\\
&= \left( \partial_i A_j\, B_k + A_j\, \partial_i B_k \right)\left( \delta_{ik}\,\mathbf{e}_j - \delta_{ij}\,\mathbf{e}_k \right)\\
&= \partial_i A_j\, B_i\, \mathbf{e}_j - \partial_i A_i\, B_k\, \mathbf{e}_k + A_j\, \partial_i B_i\, \mathbf{e}_j - A_i\, \partial_i B_k\, \mathbf{e}_k\\
&= \left( \mathbf{B} \cdot \nabla \right)\mathbf{A} - \left( \nabla \cdot \mathbf{A} \right)\mathbf{B} + \left( \nabla \cdot \mathbf{B} \right)\mathbf{A} - \left( \mathbf{A} \cdot \nabla \right)\mathbf{B}
\end{align}$$
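If you want a machine check, here is a minimal SymPy sketch that verifies the identity for generic smooth fields; the component names `F1..F3`, `G1..G3` and the helper `dir_deriv` are my own choices, not a standard API.

```python
# Minimal sketch: verify curl(F x G) = F div G - G div F + (G.nabla)F - (F.nabla)G
from sympy import Function, simplify
from sympy.vector import CoordSys3D, Vector, curl, divergence

N = CoordSys3D('N')
x, y, z = N.x, N.y, N.z

# Generic smooth vector fields F and G with arbitrary component functions.
F1, F2, F3 = (Function(f'F{i}')(x, y, z) for i in (1, 2, 3))
G1, G2, G3 = (Function(f'G{i}')(x, y, z) for i in (1, 2, 3))
F = F1*N.i + F2*N.j + F3*N.k
G = G1*N.i + G2*N.j + G3*N.k

def dir_deriv(A, B):
    """(A . nabla) B: apply A1*d/dx + A2*d/dy + A3*d/dz to each component of B."""
    out = Vector.zero
    for e in (N.i, N.j, N.k):
        comp = B.dot(e)
        out += (A.dot(N.i)*comp.diff(x)
                + A.dot(N.j)*comp.diff(y)
                + A.dot(N.k)*comp.diff(z)) * e
    return out

lhs = curl(F.cross(G))
rhs = F*divergence(G) - G*divergence(F) + dir_deriv(G, F) - dir_deriv(F, G)

# Each component of the difference should simplify to zero.
print([simplify((lhs - rhs).dot(e)) for e in (N.i, N.j, N.k)])  # -> [0, 0, 0]
```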

Solution 3:

The divergence is $\nabla\cdot\mathbf{F}$, whereas $(\mathbf{F}\cdot\nabla)$ is the directional derivative operator along $\mathbf{F}$. In component notation we have

$$(\mathbf{F}\cdot\nabla) = \sum_{\alpha=1}^dF_\alpha\frac{\partial}{\partial x_\alpha}$$

which when applied to each component ($\beta$) of $\mathbf{G}$ gives

$$\left((\mathbf{F}\cdot\nabla)\mathbf G\right)_\beta = \sum_{\alpha=1}^dF_\alpha\frac{\partial G_\beta}{\partial x_\alpha} $$

which is the same as contracting $\mathbf{F}$ with the tensor $\nabla\otimes\mathbf{G}$, i.e. $\mathbf{F}\cdot(\nabla\otimes\mathbf{G})$, where

$$\left(\nabla\otimes\mathbf{G}\right)_{\alpha \beta}= \frac{\partial G_\beta}{\partial x_\alpha}$$
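For a quick sanity check on the notation, take for instance $\mathbf{F}=(y,0,0)$ and $\mathbf{G}=(0,x,0)$ in $d=3$. Then

$$(\mathbf{F}\cdot\nabla)\mathbf{G} = y\,\frac{\partial}{\partial x}(0,x,0) = (0,y,0), \qquad \nabla\cdot\mathbf{F} = \frac{\partial y}{\partial x} = 0,$$

so the divergence vanishes while the directional derivative does not: the two objects are genuinely different.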