How to solve simultaneous equations using the Newton-Raphson method?

Solution 1:

Newton's method is: given an initial guess $x_0$ for a root of $f(x)=0$, you just iterate $x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)}$. In higher dimensions there is a straightforward analogue. So in your case, define $$f\left(\left[ \begin{array}{c} x \\ y \end{array}\right]\right)=\left[\begin{array}{c} f_1(x,y) \\ f_2(x,y) \end{array}\right]=\left[\begin{array}{c} \sin(3x)+\sin(3y) \\ \sin(5x)+\sin(5y) \end{array}\right]$$

so you throw in a vector of length two and your $f$ returns a vector of length two. The derivative here is simply the $2 \times 2$ Jacobian matrix $$J=\left[\begin{array}{cc} \frac{\partial f_1}{\partial x} & \frac{\partial f_1}{\partial y} \\ \frac{\partial f_2}{\partial x} & \frac{\partial f_2}{\partial y} \end{array}\right].$$
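For the system in the question, working out these partial derivatives gives

$$J(x,y)=\left[\begin{array}{cc} 3\cos(3x) & 3\cos(3y) \\ 5\cos(5x) & 5\cos(5y) \end{array}\right].$$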

The only thing to be careful about is that these are now vector operations. Dividing by $f'(x)$ becomes multiplying by the inverse of the Jacobian matrix, so each step involves a matrix inversion, a matrix-vector multiplication, and a vector subtraction. The full update is $$\left[ \begin{array}{c} x_{n+1} \\ y_{n+1} \end{array}\right]=\left[ \begin{array}{c} x_n \\ y_n \end{array}\right]-\left[\begin{array}{cc} \frac{\partial f_1}{\partial x} & \frac{\partial f_1}{\partial y} \\ \frac{\partial f_2}{\partial x} & \frac{\partial f_2}{\partial y} \end{array}\right]^{-1}_{(x_n,y_n)} f \left(\left[ \begin{array}{c} x_n \\ y_n \end{array}\right]\right)$$

So the Jacobian is evaluated at the point $(x_n,y_n)$, inverted, and then multiplied by $f(x_n,y_n)$. Note that the inverse Jacobian multiplies on the left. This generalizes to any number of dimensions in exactly the same manner.
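To make this concrete, here is a minimal NumPy sketch of the iteration for the system above (the function names, starting guess, and tolerance are my own illustrative choices). Rather than forming the inverse Jacobian explicitly, it solves the equivalent linear system $J(x_n,y_n)\,\delta = f(x_n,y_n)$ and subtracts $\delta$, which is numerically preferable:

```python
import numpy as np

def F(v):
    # The system from the question: f1 = sin(3x) + sin(3y), f2 = sin(5x) + sin(5y).
    x, y = v
    return np.array([np.sin(3 * x) + np.sin(3 * y),
                     np.sin(5 * x) + np.sin(5 * y)])

def J(v):
    # The 2x2 Jacobian worked out above.
    x, y = v
    return np.array([[3 * np.cos(3 * x), 3 * np.cos(3 * y)],
                     [5 * np.cos(5 * x), 5 * np.cos(5 * y)]])

def newton(F, J, v0, tol=1e-12, max_iter=50):
    v = np.asarray(v0, dtype=float)
    for _ in range(max_iter):
        # Solving J(v) * step = F(v) is the same as step = J(v)^{-1} F(v),
        # but avoids forming the inverse explicitly.
        step = np.linalg.solve(J(v), F(v))
        v = v - step
        if np.linalg.norm(step) < tol:
            return v
    raise RuntimeError("Newton iteration did not converge")

# Example (arbitrary) starting guess; which root you land on depends on it.
print(newton(F, J, [1.3, 0.8]))   # should approach a root near (1.361, 0.733)
```

Be aware that this particular system has whole curves of solutions along which the Jacobian is singular (for example every point with $y=-x$ solves both equations), so which root the iteration finds, and whether it converges at all, depends on the starting guess.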

Solution 2:

The idea behind Newton's method is first-order approximation. If $f(x)$ is differentiable at $x_0$, then near $x_0$ it is well approximated by the linear function $$ f(x) \sim f(x_0) + (x - x_0) f'(x_0) $$ So if the equation $$ f(x_0) + (x - x_0) f'(x_0) = 0$$ has a solution $x = x_1$, then $f(x_1) \sim 0$. When everything is well-behaved, $x_1$ will be a better approximation of a root than $x_0$ was, and repeating the process converges to a root of $f(x)=0$.
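Solving that linear equation for $x$ recovers the familiar Newton step from Solution 1:

$$ x_1 = x_0 - \frac{f(x_0)}{f'(x_0)} $$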

The same idea works in higher dimensions. If we have two functions $f$ and $g$, then

$$f(x,y) \sim f(x_0, y_0) + f_1(x_0, y_0) (x - x_0) + f_2(x_0, y_0) (y - y_0)$$ $$g(x,y) \sim g(x_0, y_0) + g_1(x_0, y_0) (x - x_0) + g_2(x_0, y_0) (y - y_0)$$

If we set the right hand sides to zero and find a solution $(x,y) = (x_1,y_1)$, then $f(x_1,y_1) \sim 0$ and $g(x_1,y_1) \sim 0$.

(Here, $f_1$ means "the derivative of $f$ with respect to its first variable", etc.)
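Writing the two linearized equations in matrix form makes the link to Solution 1 explicit: the matrix of partial derivatives is exactly the Jacobian, and solving the linear system is one Newton step:

$$\left[\begin{array}{cc} f_1(x_0,y_0) & f_2(x_0,y_0) \\ g_1(x_0,y_0) & g_2(x_0,y_0) \end{array}\right]\left[ \begin{array}{c} x_1 - x_0 \\ y_1 - y_0 \end{array}\right]=-\left[ \begin{array}{c} f(x_0,y_0) \\ g(x_0,y_0) \end{array}\right]$$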