Hessian matrix as derivative of gradient
From a text:
For a real-valued differentiable function $f:\mathbb{R}^n\rightarrow\mathbb{R}$, the Hessian matrix $D^2f(x)$ is the derivative matrix of the vector-valued gradient function $\nabla f(x)$; i.e., $D^2f(x)=D[\nabla f(x)]$.
$\nabla f(x)$ is just an $n\times 1$ matrix consisting of $\partial f/\partial x_1,\partial f/\partial x_2,\ldots,\partial f/\partial x_n$.
Then $D[\nabla f(x)]$ must be a $1\times n$ matrix.
But I know that the Hessian matrix is an $n\times n$ matrix consisting of $\partial ^2f/\partial x_i\partial x_j$. How can the given definition be consistent with this?
The line "Then $D[\nabla f(x)]$ must be a $1\times n$ matrix" is where your confusion lies.
The derivative operator $D$ applied to a vector-valued function records how each component changes in each coordinate direction. Being more explicit with the notation, we have
$$\begin{align}\nabla f(\mathbf x) &= D[f (\mathbf x)]\\ &= \left(\frac{\partial f}{\partial x_1}, \ldots, \frac{\partial f}{\partial x_n}\right)\end{align}$$
Now think of applying $D$ to each element of this vector individually:
$$\begin{align}D[\nabla f(\mathbf x)] &= D[D[f(\mathbf x)]]\\ &=\left(D\left[\frac{\partial f}{\partial x_1}\right]^T, \ldots, D\left[\frac{\partial f}{\partial x_n}\right]^T\right),\end{align}$$ which expands to give us the Hessian matrix $$D^2[f(\mathbf x)]=\left(\begin{matrix}\frac{\partial^2 f}{\partial x_1^2} & \ldots & \frac{\partial^2 f}{\partial x_1\partial x_n}\\ \vdots & \ddots & \vdots \\ \frac{\partial^2 f}{\partial x_n\partial x_1}& \ldots & \frac{\partial^2 f}{\partial x_n^2}\end{matrix}\right),$$ which is indeed $n\times n$.
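As a numerical sanity check (not part of the original answer), here is a finite-difference sketch of "apply $D$ to each component of the gradient": the gradient is approximated by central differences, and then each gradient component is differentiated again in each direction, producing an $n\times n$ matrix. The example function $f(x_1,x_2)=x_1^2 x_2 + x_2^3$ is my own choice for illustration.

```python
import numpy as np

def f(x):
    # example function f(x1, x2) = x1^2 * x2 + x2^3 (chosen for illustration)
    return x[0]**2 * x[1] + x[1]**3

def grad(f, x, h=1e-6):
    # central-difference approximation of the n x 1 gradient, D[f]
    n = len(x)
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def hessian(f, x, h=1e-4):
    # D[grad f]: differentiate each gradient component in each direction,
    # stacking the n x 1 results as columns -> an n x n matrix
    n = len(x)
    H = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n); e[j] = h
        H[:, j] = (grad(f, x + e) - grad(f, x - e)) / (2 * h)
    return H

x = np.array([1.0, 2.0])
H = hessian(f, x)
print(H.shape)  # (2, 2) -- the Hessian is n x n, not 1 x n
```

At $x=(1,2)$ the analytic Hessian is $\begin{pmatrix}4&2\\2&12\end{pmatrix}$, which the finite-difference result matches to within the discretization error.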
This is an old discussion, and I do not intend to correct anything. But since this caused me long-lasting confusion, I want to share some simple rules that make it clear.
Suppose one defines the following operators for a function $f(x):\mathbb{R}^n\rightarrow\mathbb{R}$:
- Gradient operator $\nabla$: defined as an $n\times1$ column vector of partials.
- Derivative operator $\nabla^T$: defined as a $1\times n$ row vector.
- Hessian operator $\mathbf{H}$: defined as the gradient of the derivative of $f(x)$. $$\mathbf{H}=\left(\begin{matrix}\frac{\partial}{\partial x_1}\\ \vdots\\\frac{\partial}{\partial x_n}\end{matrix}\right)\left(\begin{matrix}\frac{\partial}{\partial x_1}& \ldots&\frac{\partial}{\partial x_n}\end{matrix}\right)=\left(\begin{matrix}\frac{\partial^2}{\partial x_1^2} & \ldots & \frac{\partial^2}{\partial x_1\partial x_n}\\\vdots &\ddots&\vdots \\ \frac{\partial^2}{\partial x_n\partial x_1}& \ldots & \frac{\partial^2}{\partial x_n^2}\end{matrix}\right).$$
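This "column of partials times row of partials" rule can be checked symbolically (my own sketch, using SymPy with an arbitrary example function): build the gradient as a column, then differentiate it by each variable to form the rows of the Hessian.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = x1**2 * x2 + sp.sin(x2)  # an arbitrary example function

# Gradient operator: n x 1 column of partial derivatives of f
grad_f = sp.Matrix([[sp.diff(f, x1)], [sp.diff(f, x2)]])

# Hessian as nabla applied to nabla^T f: differentiate each gradient
# component by each variable, giving the n x n matrix of second partials
H = sp.Matrix([[sp.diff(g, v) for v in (x1, x2)] for g in grad_f])

print(H)  # a 2 x 2 symmetric matrix of second partial derivatives
```

The result agrees with SymPy's built-in `sp.hessian(f, (x1, x2))`, confirming that the outer product of the two operator vectors yields the $n\times n$ matrix of second partials.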