The anti-derivative of any matrix function

If we have some differentiable function $f:\mathbb{R}^n\to\mathbb{R}^m$, we can always calculate the Jacobian of this function, i.e., $$ \frac{df}{dx}(x) = \begin{pmatrix}\frac{\partial f_1 }{\partial x_1 } & \cdots & \frac{\partial f_1 }{\partial x_n } \\ \vdots & \ddots & \vdots \\ \frac{\partial f_m }{\partial x_1 } & \cdots & \frac{\partial f_m }{\partial x_n } \end{pmatrix}. $$ So, for example, if $f=\begin{pmatrix} x_1+x_2^2 \\ \sin(x_1 x_2) \end{pmatrix}$, we have $$ F(x) = \frac{df}{dx}(x) = \begin{pmatrix} 1 & 2 x_2 \\ x_2 \cos(x_1x_2 ) & x_1 \cos(x_1x_2 ) \end{pmatrix}. $$
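For concreteness, here is a short symbolic check of this example (a sympy sketch; it assumes a Python environment with sympy installed):

```python
# Symbolic sanity check of the Jacobian example above.
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = sp.Matrix([x1 + x2**2, sp.sin(x1 * x2)])  # the example f from above
F = f.jacobian([x1, x2])                      # its 2x2 Jacobian matrix
print(F)  # Matrix([[1, 2*x2], [x2*cos(x1*x2), x1*cos(x1*x2)]])
```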

My question is: can we find an $f$ for any $F(x)$? In other words, is every matrix function of the form $$F(x_1, \ldots, x_n) = \begin{pmatrix} F_{1,1}(x_1, \ldots, x_n) & \cdots & F_{1,n}(x_1, \ldots, x_n) \\ \vdots & \ddots & \vdots \\ F_{m,1}(x_1, \ldots, x_n) & \cdots & F_{m,n}(x_1, \ldots, x_n) \end{pmatrix}, \quad (\text{each } F_{i,j}\text{ is assumed to be continuous}) $$ the Jacobian matrix of some mapping $f$?

On one hand this seems trivial; on the other hand, I cannot find anything useful on it. I already tried to go row by row, but trying to integrate back from such a freaky row vector is not really insightful...

(my concern is that if I take $F(x)$ to be some super weird matrix function, there will not exist an $f$...)

(NB: please add some references or keywords in your answer)


Solution 1:

It cannot be done in general. Consider the function $F: \mathbb{R}^2 \rightarrow \mathbb{R}^2$ given by

$$F(x,y)=(-y,x)$$

You can check that there is no function $f$ such that $\nabla f=F$.

EDIT: Suppose $\nabla f = F$. Then $f_x = -y$, so $f(x,y)=-yx+g(y)$ for some function $g$. Differentiating with respect to $y$ gives $f_y=-x+g'(y)$, but $f_y=x$ forces $g'(y)=2x$. This is absurd, since $g$ should depend only on $y$.
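For a quick symbolic confirmation of the obstruction (a sympy sketch; the asymmetry of the Jacobian of $F$ is exactly the condition made precise in Solution 2):

```python
# The Jacobian of F(x, y) = (-y, x) is not symmetric, so no potential f exists.
import sympy as sp

x, y = sp.symbols('x y')
F = sp.Matrix([-y, x])   # the counterexample from above
J = F.jacobian([x, y])
print(J)                 # Matrix([[0, -1], [1, 0]])
print(J.is_symmetric())  # False -> F is not a gradient field
```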

Solution 2:

Given $f:\mathbb R^n\to \mathbb R^m$, let $Df:\mathbb R^n\to \mathbb R^{m\times n}$ denote its Jacobian. Writing $f$ in terms of its $m$ component functions $f_1,f_2,\dots,f_m:\mathbb R^n\to \mathbb R$, we see that $$ D\begin{bmatrix}f_1\\\vdots\\f_m\end{bmatrix}=\begin{bmatrix}Df_1\\\vdots\\Df_m\end{bmatrix} $$ i.e. the rows of $Df$ are the Jacobians of the components of $f$. Therefore, in order to solve the equation $Df=F$, it suffices to solve each of the equations $Df_k=F_k$ separately, for each $k\in \{1,\dots,m\}$, where $F_k$ is the $k^{\text{th}}$ row of $F$. In other words, we can restrict our attention to the $m=1$ case.
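To see this reduction concretely on the example from the question (a sympy sketch; nothing here goes beyond the definitions above):

```python
# Each row of the Jacobian Df is the gradient of one component of f.
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = sp.Matrix([x1 + x2**2, sp.sin(x1 * x2)])
F = f.jacobian([x1, x2])

for k in range(2):
    grad_fk = sp.Matrix([[sp.diff(f[k], v) for v in (x1, x2)]])
    assert grad_fk == F[k, :]  # row k of Df equals the gradient of f_k
```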

In this case, the Jacobian is just the gradient, so the question becomes

Given $F:\mathbb R^n\to \mathbb R^n$, when does there exist $f:\mathbb R^n\to \mathbb R$ for which $\nabla f=F$?

That is, how can we tell when a vector field is conservative? As long as $F$ is differentiable, with continuous partial derivatives, an obvious necessary condition is $$ \forall i,j\in \{1,\dots,n\}:\frac{\partial F_j}{\partial x_i}=\frac{\partial F_i }{\partial x_j} $$ The necessity of this condition follows from Schwarz's theorem, since if $F=\nabla f$, then $$ \frac{\partial F_j}{\partial x_i} =\frac{\partial }{\partial x_i}\frac{\partial f}{\partial x_j} =\frac{\partial }{\partial x_j}\frac{\partial f}{\partial x_i} =\frac{\partial F_i}{\partial x_j}. $$ Schwarz's theorem requires $f$ to have continuous second partial derivatives, which is why I included the differentiability condition on $F$.
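This necessary condition is easy to test mechanically. A sympy sketch, where the field $(2xy,\, x^2+3y^2)$ is just an illustrative example of a field that passes the test:

```python
# Test the symmetry (closedness) condition dF_j/dx_i == dF_i/dx_j.
import sympy as sp

def is_closed(F, variables):
    n = len(variables)
    return all(sp.simplify(sp.diff(F[j], variables[i]) - sp.diff(F[i], variables[j])) == 0
               for i in range(n) for j in range(i + 1, n))

x, y = sp.symbols('x y')
print(is_closed(sp.Matrix([-y, x]), (x, y)))                 # False: Solution 1's counterexample
print(is_closed(sp.Matrix([2*x*y, x**2 + 3*y**2]), (x, y)))  # True: this is grad(x**2*y + y**3)
```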

It turns out this condition is sufficient as well. In fact, it remains true for functions $F:E\to \mathbb R^n$, where $E\subseteq \mathbb R^n$ is open, as long as $E$ is simply connected, meaning it does not have any "holes." Since you asked for keywords: this is the Poincaré lemma for $1$-forms, in the language of closed and exact differential forms. However, I cannot prove this fact here, since it requires developing de Rham cohomology.
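For completeness, when the symmetry condition holds on a star-shaped domain (a special case of simply connected), one can write a potential explicitly as $f(p)=\int_0^1 F(tp)\cdot p\,\mathrm{d}t$. A sympy sketch of that construction, reusing the illustrative field from above:

```python
# Poincaré-lemma construction of a potential on a star-shaped domain:
# f(p) = integral from 0 to 1 of F(t*p) . p dt.
import sympy as sp

x, y, t = sp.symbols('x y t')
F = sp.Matrix([2*x*y, x**2 + 3*y**2])  # a closed field (illustrative choice)
F_on_ray = F.subs({x: t*x, y: t*y})    # F evaluated along the segment from 0 to (x, y)
f = sp.integrate(F_on_ray.dot(sp.Matrix([x, y])), (t, 0, 1))
print(sp.expand(f))                    # x**2*y + y**3

# Sanity check: the gradient of the recovered f is F again.
grad_f = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])
print(sp.simplify(grad_f - F))         # Matrix([[0], [0]])
```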