Newton optimization algorithm with non-positive definite Hessian

I found a good example, the Beale function:

$f(x_1,x_2)=(1.5-x_1+x_1 x_2)^2 + (2.25-x_1+x_1 x_2^2)^2 + (2.625-x_1+x_1 x_2^3)^2$

The damped Newton method with starting point $x_0 = (4,1)^T$ fails. Here is why:

$\nabla f(x_0)=(0, 111)^T$, $\nabla^2 f(x_0)= \left( \begin{matrix} 0 & 27.75 \\ 27.75 & 610 \\ \end{matrix} \right) $, and the Newton search direction is $p_0=-\nabla^2 f(x_0)^{-1}\nabla f(x_0)=(-4,0)^T$.

The search direction points in the wrong direction! This happens because the Hessian is not positive definite, so $p_0$ is not guaranteed to be a descent direction. Indeed, $p_0^T\nabla f(x_0)=0$ here, and $f$ is constant along the line $x_2=1$, so no step length along $p_0$ can reduce $f$ and the line search fails.
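A quick numerical check (a minimal sketch using NumPy and SymPy, which are not part of the original post) reproduces these numbers and confirms that $p_0$ is not a descent direction:

```python
import numpy as np
import sympy as sp

# The objective from above, built symbolically so the gradient and Hessian are exact.
x1, x2 = sp.symbols('x1 x2')
f = ((1.5 - x1 + x1*x2)**2
     + (2.25 - x1 + x1*x2**2)**2
     + (2.625 - x1 + x1*x2**3)**2)

grad = sp.Matrix([f.diff(x1), f.diff(x2)])
hess = sp.hessian(f, (x1, x2))

x0 = {x1: 4.0, x2: 1.0}
g = np.array(grad.subs(x0), dtype=float).ravel()  # gradient at x0: [0, 111]
H = np.array(hess.subs(x0), dtype=float)          # Hessian at x0: [[0, 27.75], [27.75, 610]]

p = -np.linalg.solve(H, g)                        # Newton direction: [-4, 0]
print(np.linalg.eigvalsh(H))                      # one negative eigenvalue -> indefinite
print(p @ g)                                      # 0.0 -> p is not a descent direction
```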


After applying a Hessian modification technique such as "adding a multiple of the identity", the algorithm works fine: the search direction becomes a descent direction, and the iteration eventually finds the local minimum $(3,0.5)^T$.
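For concreteness, here is a minimal sketch of such a damped Newton iteration with the "adding a multiple of the identity" modification. It assumes NumPy; the Armijo backtracking constants and the doubling rule for the shift $\tau$ are illustrative choices, not prescribed by the original post:

```python
import numpy as np

A = np.array([1.5, 2.25, 2.625])

def residuals(x):
    x1, x2 = x
    return A - x1 + x1 * np.array([x2, x2**2, x2**3])

def f(x):
    return np.sum(residuals(x)**2)

def grad_hess(x):
    x1, x2 = x
    r = residuals(x)
    # Jacobian of the residuals
    J = np.array([[-1 + x2,      x1],
                  [-1 + x2**2,   2*x1*x2],
                  [-1 + x2**3,   3*x1*x2**2]])
    g = 2 * J.T @ r
    # Exact Hessian: Gauss-Newton term plus residual curvature terms
    d2r = [np.array([[0.0, 1.0], [1.0, 0.0]]),
           np.array([[0.0, 2*x2], [2*x2, 2*x1]]),
           np.array([[0.0, 3*x2**2], [3*x2**2, 6*x1*x2]])]
    H = 2 * J.T @ J + 2 * sum(ri * Hi for ri, Hi in zip(r, d2r))
    return g, H

def damped_newton_modified(x0, tau0=1e-3, tol=1e-8, max_iter=100):
    x = np.array(x0, dtype=float)
    for _ in range(max_iter):
        g, H = grad_hess(x)
        if np.linalg.norm(g) < tol:
            break
        # Hessian modification: add tau*I, doubling tau until H + tau*I is
        # positive definite (i.e. the Cholesky factorization succeeds).
        tau = 0.0
        while True:
            try:
                np.linalg.cholesky(H + tau * np.eye(2))
                break
            except np.linalg.LinAlgError:
                tau = max(2 * tau, tau0)
        p = np.linalg.solve(H + tau * np.eye(2), -g)  # now a descent direction
        # Backtracking (Armijo) line search
        alpha = 1.0
        while f(x + alpha * p) > f(x) + 1e-4 * alpha * (g @ p) and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * p
    return x

print(damped_newton_modified([4.0, 1.0]))  # approximately [3.0, 0.5]
```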
