
I'm trying to use Newton's method to solve the following system of equations, where $f$ and $g$ are functions of $x$ and $y$ ($h$, $a$, $f$, $c$, $d$, $b$, and $k$ are just constants).

$f(y,x)=\left[\begin{array}{c} y^{1}\\ y^{2}\\ y^{3}\\ y^{4} \end{array}\right]-\left[\begin{array}{c} y_{0}^{1}\\ y_{0}^{2}\\ y_{0}^{3}\\ y_{0}^{4} \end{array}\right]+h\left[\begin{array}{cccc} a_{1} & f_{1} & 0 & 0\\ c_{1} & a_{1} & f_{1} & 0\\ 0 & c_{1} & a_{1} & f_{1}\\ 0 & 0 & c_{1} & a_{1} \end{array}\right]\left[\begin{array}{c} y^{1}\\ y^{2}\\ y^{3}\\ y^{4} \end{array}\right]+\left[\begin{array}{cccc} b_{1} & 0 & 0 & 0\\ d_{1} & b_{1} & 0 & 0\\ 0 & d_{1} & b_{1} & 0\\ 0 & 0 & d_{1} & b_{1} \end{array}\right]\left[\begin{array}{c} x^{1}\\ x^{2}\\ x^{3}\\ x^{4} \end{array}\right]-\left[\begin{array}{c} k_{1}\\ 0\\ 0\\ k_{2} \end{array}\right]=0$

$g(y,x)=\left[\begin{array}{c} x^{1}\\ x^{2}\\ x^{3}\\ x^{4} \end{array}\right]-\left[\begin{array}{c} x_{0}^{1}\\ x_{0}^{2}\\ x_{0}^{3}\\ x_{0}^{4} \end{array}\right]+h\left[\begin{array}{cccc} a_{2} & f_{2} & 0 & 0\\ c_{2} & a_{2} & f_{2} & 0\\ 0 & c_{2} & a_{2} & f_{2}\\ 0 & 0 & c_{2} & a_{2} \end{array}\right]\left[\begin{array}{c} y^{1}\\ y^{2}\\ y^{3}\\ y^{4} \end{array}\right]+\left[\begin{array}{cccc} b_{2} & 0 & 0 & 0\\ d_{2} & b_{2} & 0 & 0\\ 0 & d_{2} & b_{2} & 0\\ 0 & 0 & d_{2} & b_{2} \end{array}\right]\left[\begin{array}{c} x^{1}\\ x^{2}\\ x^{3}\\ x^{4} \end{array}\right]-\left[\begin{array}{c} k_{1}\\ 0\\ 0\\ k_{2} \end{array}\right]=0$

Am I right in saying that the Newton iteration equations would be:

$y^{k+1}=y^{k}-\frac{f(y,x)}{\frac{\partial f(y,x)}{\partial y}}$ and $x^{k+1}=x^{k}-\frac{g(y,x)}{\frac{\partial g(y,x)}{\partial x}}$.

Or would the denominators instead be combinations of partial derivatives, such as $\left(x\frac{\partial f}{\partial y}+y\frac{\partial f}{\partial x}\right)$?

I know it was not necessary to write out the functions in full, but I have some more questions about the equations that depend on the answer to this one.

Many Thanks!

  • I think I might know the answer now. Will post as soon as I am sure. (2012-01-22)

2 Answers


Your equations amount to a system of $8$ scalar equations in $8$ scalar unknowns. Writing $x$ and $y$ as separate vectors is not helping clarity here. Stack them into one $8$-dimensional vector $z$, and similarly stack $f$ and $g$ into one function $F$ (using $F$ rather than $h$, since $h$ is already taken by a constant). Formally speaking, the equation $F(z)=0$ could be solved by the Newton-Raphson method as $ z_{n+1} = z_n - (DF(z_n))^{-1} F(z_n) $
... except that this is pointless/tautological when the system is linear: whatever the starting point, you get the exact solution after one iteration. Indeed, the Newton-Raphson method approaches the root by replacing the given system with its linearization around the previously found point and solving that linear system to find the next approximation. Your system is linear to begin with.
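Below is a minimal NumPy sketch of this point, assuming made-up values for the constants $h, a_i, f_i, c_i, d_i, b_i, k_1, k_2$ and zero initial data (none of these values come from the question): it builds the stacked $8\times 8$ system and checks that a single Newton step already lands on the exact solution.

```python
# Minimal sketch: one Newton step on the stacked linear system F(z) = 0,
# with z = (y, x) in R^8.  All constant values below are illustrative only.
import numpy as np

h = 0.1
a1, f1, c1, b1, d1 = 1.0, 0.5, -0.3, 0.2, 0.4
a2, f2, c2, b2, d2 = 0.8, -0.2, 0.6, 1.1, -0.5
k = np.array([1.0, 0.0, 0.0, 2.0])      # (k_1, 0, 0, k_2)
y0 = np.zeros(4)
x0 = np.zeros(4)

def tridiag(a, f, c):
    # 4x4 matrix with a on the diagonal, f on the superdiagonal, c on the subdiagonal
    return np.diag([a] * 4) + np.diag([f] * 3, 1) + np.diag([c] * 3, -1)

def lower_bidiag(b, d):
    # 4x4 matrix with b on the diagonal and d on the subdiagonal
    return np.diag([b] * 4) + np.diag([d] * 3, -1)

A1, B1 = tridiag(a1, f1, c1), lower_bidiag(b1, d1)
A2, B2 = tridiag(a2, f2, c2), lower_bidiag(b2, d2)
I = np.eye(4)

def F(z):
    # Stacked residual (f(y, x), g(y, x)) from the question
    y, x = z[:4], z[4:]
    f_val = y - y0 + h * (A1 @ y) + B1 @ x - k
    g_val = x - x0 + h * (A2 @ y) + B2 @ x - k
    return np.concatenate([f_val, g_val])

def DF(z):
    # Jacobian of F; it is constant because the system is linear in z
    return np.block([[I + h * A1, B1],
                     [h * A2,     I + B2]])

z = np.ones(8)                              # arbitrary starting point
z = z - np.linalg.solve(DF(z), F(z))        # one Newton step
print(np.linalg.norm(F(z)))                 # ~0: exact solution in one step
```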


What you have is a matrix equation with two unknowns and four equations, so there isn't necessarily a consistent solution. However, if you are trying to find a nearby local optimum, you can use a computational method called gradient descent. It is a multivariable analogue of Newton's method, and it works with a finite (sometimes adaptive, depending on the algorithm) step size.

The recurrence is $\vec x_{i+1}=\vec x_i - h\,\vec \nabla f(\vec x_i)$, where $h$ is the step size and is usually small. There are advanced techniques for computing the "optimum" step size in a given situation, but they are often not worth the added computational complexity. As the iterates get close to a solution, the gradient often becomes unstable due to numerical conditioning, so the iterates tend to bounce around the actual solution in seemingly random directions. At that point, however, the solution found is usually close enough for practical purposes.
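For concreteness, here is a minimal sketch of that fixed-step update, assuming an illustrative quadratic objective $f(x)=\tfrac12\|Ax-b\|^2$; the matrix, step size, and starting point are all made up and not taken from the question.

```python
# Minimal gradient-descent sketch with a fixed step size h; the objective
# f(x) = 0.5 * ||A x - b||^2 and all values below are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
A = np.eye(8) + 0.1 * rng.standard_normal((8, 8))   # well-conditioned test matrix
b = rng.standard_normal(8)

def grad_f(x):
    # Gradient of f(x) = 0.5 * ||A x - b||^2, i.e. A^T (A x - b)
    return A.T @ (A @ x - b)

h = 0.1                        # small fixed step size
x = np.zeros(8)                # arbitrary starting point
for _ in range(1000):
    x = x - h * grad_f(x)      # x_{i+1} = x_i - h * grad f(x_i)

print(np.linalg.norm(A @ x - b))   # residual is near 0; decay rate depends on conditioning
```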

  • Since $x,y,f,g$ have four components each, there are $8$ scalar unknowns and $8$ scalar equations. (2015-01-25)