
A well-known method for solving multi-dimensional non-linear root-finding problems is to bring the equations into the form $\mathbf{x} = \mathbf{g}(\mathbf{x})$ and then iterate. The problem is that one has to find a suitable $\mathbf{g}(\mathbf{x})$: if the spectral radius of the Jacobian of $\mathbf{g}$ at the root is larger than $1$, the iteration will not converge. Calculating the spectral radius involves finding the eigenvalues (or is there an easier way?), which is quite an annoying problem if the functions involved are nontrivial.
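As a concrete sketch of the setup (the 2×2 system below is a hypothetical example, chosen so that this particular rearrangement happens to be a contraction near the root):

```python
import math

# Hypothetical system F(x, y) = 0 rewritten in fixed-point form
# (x, y) = g(x, y). For this g, the Jacobian at the root has spectral
# radius < 1, so plain fixed-point iteration converges.
def g(x, y):
    return 0.5 * math.cos(y), 0.5 * math.sin(x)

def fixed_point(g, x0, y0, tol=1e-12, max_iter=200):
    x, y = x0, y0
    for _ in range(max_iter):
        xn, yn = g(x, y)
        if max(abs(xn - x), abs(yn - y)) < tol:
            return xn, yn
        x, y = xn, yn
    raise RuntimeError("fixed-point iteration did not converge")

x, y = fixed_point(g, 0.0, 0.0)
# At convergence, x = 0.5*cos(y) and y = 0.5*sin(x) hold to tolerance.
```

A different algebraic rearrangement of the same equations can yield a $\mathbf{g}$ whose Jacobian has spectral radius above $1$, in which case the same loop diverges — which is exactly the difficulty raised above.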

So:

Are there any suggested methods for properly choosing the function $\mathbf{g}(\mathbf{x})$, assuming I know the root exists and is isolated? Additionally, are there other root-finding methods that do not involve a matrix inverse (or, more precisely, a linear solve with the Jacobian), unlike e.g. Newton's method?

1 Answer


To estimate the dominant eigenvalue, you can simply iterate your function a number of times starting near the root: the errors behave like powers of the Jacobian applied to the initial error, so the rate of growth (or decay) of the steps tends to the spectral radius — this is essentially power iteration. It may quickly become obvious whether it is greater or less than $1$.

For the second question, the non-fixed-point version: I have had success with the Nelder-Mead simplex method on toy problems, applied to minimizing $\|\mathbf{F}(\mathbf{x})\|^2$. A routine is given in section 10.4 of Numerical Recipes and in other numerical analysis books. Powell's method, in section 10.5, does not use any gradient information either.
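A sketch of that approach, assuming SciPy is available (the test system `F` is hypothetical): recast $\mathbf{F}(\mathbf{x}) = \mathbf{0}$ as minimizing $\|\mathbf{F}(\mathbf{x})\|^2$ and hand it to Nelder-Mead, which needs no Jacobian and no linear solves.

```python
import numpy as np
from scipy.optimize import minimize

def F(v):
    """Hypothetical 2D test system; a root satisfies F(v) = 0."""
    x, y = v
    return np.array([x - 0.5 * np.cos(y), y - 0.5 * np.sin(x)])

# Minimize the squared residual with the derivative-free Nelder-Mead method.
res = minimize(lambda v: np.dot(F(v), F(v)), x0=[0.0, 0.0],
               method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-20})
root = res.x  # ||F(root)|| should be near zero at a true root
```

One caveat: a local minimum of $\|\mathbf{F}\|^2$ need not be a root, so it is worth checking that the final residual is actually close to zero.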