Relation between positive definite matrices and strictly convex functions
I have a problem. According to Wikipedia (http://en.wikipedia.org/wiki/Positive-definite_matrix), any quadratic form can be written as $$z^TMz$$ where $z$ is a column vector and $M$ is a symmetric real matrix. However, this quadratic function is strictly convex only when $M$ is symmetric positive definite. Why? I thought any quadratic function should be convex. Doesn't $$z^TMz>0$$ show only that the range of this function is greater than zero?
Why doesn't every symmetric matrix $M$ (which represents a quadratic function) give a convex function?
Why is it only the case that when $$z^TMz > 0$$ for all $z \neq 0$ the function is strictly convex?
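To make my confusion concrete, here is a tiny numpy sketch I put together (the matrix $M$ is just an arbitrary example of mine): with an indefinite symmetric $M$, the quadratic $z^TMz$ is visibly not convex.

```python
import numpy as np

# A symmetric but indefinite example matrix (chosen arbitrarily for illustration).
M = np.array([[1.0, 0.0],
              [0.0, -1.0]])

def f(z):
    """The quadratic form z^T M z."""
    return z @ M @ z

# Midpoint convexity check along the z2 axis:
# a convex f must satisfy f((x+y)/2) <= (f(x)+f(y))/2.
x = np.array([0.0, -1.0])
y = np.array([0.0, 1.0])
print(f((x + y) / 2))     # 0.0
print((f(x) + f(y)) / 2)  # -1.0  -> the midpoint value is LARGER, so f is not convex
```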
Thanks
UPDATE: As pointed out in the comments by @Erik, positive definiteness is in general only a sufficient condition for strict convexity, not a necessary one. However, in this case strict convexity does imply positive definiteness: the Hessian here is the constant matrix $2M$, and if $M$ had a zero or negative eigenvalue, the function would be flat or curve downward along the corresponding eigenvector, so it could not be strictly convex.
PREVIOUS ANSWER: For any twice differentiable function, it is strictly convex if and only if the Hessian matrix is positive definite. You can find this in any standard textbook on convex optimization. Now, the function at hand is $z^TMz$, which is clearly twice differentiable (by virtue of being quadratic). The Hessian of this function is $2M$ for symmetric $M$ (please verify it yourself; doing so helped me a lot to memorize it). Since $2M$ is positive definite exactly when $M$ is, $M$ should be positive definite for that quadratic function to be strictly convex.
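If you want to verify the Hessian claim numerically rather than by hand, here is a minimal numpy sketch (the random matrix, evaluation point, and step size are arbitrary choices of mine) comparing a finite-difference Hessian of $f(z)=z^TMz$ against $2M$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
M = (A + A.T) / 2                      # a random symmetric matrix

def f(z):
    return z @ M @ z

def numerical_hessian(f, z0, h=1e-5):
    """Central finite-difference Hessian of f at z0."""
    n = z0.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = h
            e_j = np.zeros(n); e_j[j] = h
            H[i, j] = (f(z0 + e_i + e_j) - f(z0 + e_i - e_j)
                       - f(z0 - e_i + e_j) + f(z0 - e_i - e_j)) / (4 * h**2)
    return H

z0 = rng.standard_normal(3)
print(np.allclose(numerical_hessian(f, z0), 2 * M, atol=1e-4))  # True: Hessian is 2M
```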
OK, so here is my explanation using second-order directional derivatives and the Hessian matrix.
1) A real symmetric Hessian matrix $H$ (at some point $x^{(0)}$) of our function $f(x):\mathbb{R}^{n}\rightarrow \mathbb{R}$ can be decomposed as $H=D\Lambda D^{T}$ (the eigendecomposition of a symmetric matrix; $D$ is an orthogonal matrix whose columns are the unit eigenvectors of $H$, and $\Lambda$ is a diagonal matrix with the eigenvalues of $H$ on its diagonal). This implies $D^{T}HD=\Lambda$, which in turn implies $d_{i}^{T}Hd_{i}=\lambda_{i}$ (where $d_{i}$ is the $i$-th eigenvector of $H$ and $\lambda_{i}$ the corresponding eigenvalue, at the point $x^{(0)}$).
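Here is a quick numpy check of point 1 (the symmetric matrix is an arbitrary example of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
H = (A + A.T) / 2                      # a symmetric "Hessian" for illustration

lam, D = np.linalg.eigh(H)             # columns of D are orthonormal eigenvectors
Lam = np.diag(lam)

print(np.allclose(H, D @ Lam @ D.T))                # H = D Λ D^T
print(np.allclose(D.T @ H @ D, Lam))                # D^T H D = Λ
print(np.allclose(D[:, 0] @ H @ D[:, 0], lam[0]))   # d_i^T H d_i = λ_i
```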
2) The second-order directional derivative in a unit direction $c$ is given by $c^{T}Hc$. If $c$ equals an eigenvector of $H$, the value of the directional derivative is the eigenvalue corresponding to $c$. If $c$ is not an eigenvector, then the directional derivative $c^{T}Hc$ equals a weighted average of the eigenvalues: expanding $c=\sum_{i}(d_{i}^{T}c)\,d_{i}$ gives $c^{T}Hc=\sum_{i}\lambda_{i}(d_{i}^{T}c)^{2}$, where the weights $(d_{i}^{T}c)^{2}$ sum to one. So the value of the directional derivative in any direction $c$ is bounded by the minimal and maximal eigenvalues. Additionally, the eigenvector corresponding to the maximal eigenvalue gives the largest second-order directional derivative, and the eigenvector corresponding to the minimal eigenvalue gives the smallest.
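Point 2 can be checked numerically too; this sketch (again with an arbitrary symmetric matrix of mine) samples random unit directions and confirms that $c^{T}Hc$ always lands between the smallest and largest eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
H = (A + A.T) / 2

lam = np.linalg.eigvalsh(H)            # eigenvalues, sorted ascending
for _ in range(1000):
    c = rng.standard_normal(4)
    c /= np.linalg.norm(c)             # random unit direction
    q = c @ H @ c                      # second-order directional derivative
    assert lam[0] - 1e-12 <= q <= lam[-1] + 1e-12
print("all directional curvatures lie within [λ_min, λ_max]")
```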
3) If the Hessian is positive semi-definite, then all its eigenvalues are positive or zero. As a consequence, the second-order directional derivative in any direction $c$ also has to be positive or zero (because its values are bounded by the eigenvalues).
4) If the second-order directional derivatives are always positive or zero, then no matter what direction $c$ we choose, the curvature is positive or flat ($c^{T}Hc\geq 0$). In the univariate case: $f''(x)\geq 0$ for every $x\in\mathbb{R}$. So the function is convex (but not strictly convex; we can come across flat regions).
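To see the "flat regions" caveat from points 3 and 4 concretely, here is a toy example of mine: a positive semi-definite matrix with a zero eigenvalue, whose quadratic is completely flat along the corresponding eigenvector.

```python
import numpy as np

# PSD with eigenvalues 2 and 0: the quadratic is convex, but not strictly convex.
H = np.array([[1.0, 1.0],
              [1.0, 1.0]])

def f(z):
    return z @ H @ z

flat = np.array([1.0, -1.0]) / np.sqrt(2)      # eigenvector with eigenvalue 0
print(flat @ H @ flat)                         # ~0: zero curvature in this direction
print([f(t * flat) for t in (0.0, 1.0, 2.0)])  # f is constant (0) along this line
```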
5) Note that so far we have been talking about the Hessian $H=H(x^{(0)})$ at a given point! So for the function to be convex, the Hessian has to be positive semi-definite at every possible point $x^{(0)}$.
6) If $H$ is positive definite at every point, then the function is strictly convex.
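Finally, a sketch of point 6 (the positive definite matrix below is constructed arbitrarily): for positive definite $H$, the quadratic $f(z)=z^{T}Hz$ satisfies the strict midpoint inequality $f\left(\frac{x+y}{2}\right) < \frac{f(x)+f(y)}{2}$ whenever $x \neq y$.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
H = A @ A.T + np.eye(3)                # positive definite by construction

def f(z):
    return z @ H @ z

for _ in range(1000):
    x, y = rng.standard_normal(3), rng.standard_normal(3)
    assert f((x + y) / 2) < (f(x) + f(y)) / 2
print("strict midpoint convexity holds on all samples")
```

The assertion works because $\frac{f(x)+f(y)}{2} - f\left(\frac{x+y}{2}\right) = \frac{1}{4}(x-y)^{T}H(x-y)$, which is strictly positive for $x \neq y$ exactly when $H$ is positive definite.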