Consider Newton's method applied to the minimization of the function $$f(x)=|x|^{3}$$ where $x\in\mathbb{R}^{n}$.
a) Compute the gradient and Hessian of $f$.
b) Use the formula $(I+uu^{T})^{-1}=I-\frac{1}{2}uu^{T}$, where $I$ is the $n\times n$ identity matrix and $u\in\mathbb{R}^{n}$ is a unit vector, to compute the inverse of the Hessian of $f$. Use this to give an explicit formula for the iteration step in Newton's method.
c) Show that Newton's method converges linearly to the minimizer $x^{*}=0$. Why does quadratic convergence not occur?
Above is a problem I've been given for review. Assuming that $g(x)=|x|$ and $x=[x_{1}, x_{2}, \dots, x_{n}]^{T}$, it seems that $g(x) = \sqrt{x_{1}^{2}+x_{2}^{2}+\cdots+x_{n}^{2}}$,
so we can write $f(x)={(x_{1}^{2}+x_{2}^{2}+\cdots+x_{n}^{2})}^{3/2}$.
This seems valid, but when I try to calculate the gradient and Hessian componentwise, the algebra gets incredibly messy. My guess is that I'm going about this the wrong way, but I'm not sure how else to approach it.
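To see whether the messy partial derivatives collapse to something clean, I tried a symbolic check in low dimension ($n=3$) with SymPy. The closed forms being tested, $\nabla f = 3|x|\,x$ and $\nabla^{2} f = 3|x|\,I + 3\frac{xx^{T}}{|x|}$, are my conjectures from staring at the componentwise derivatives, not something given in the problem:

```python
import sympy as sp

# Work in n = 3 and hope the pattern generalizes.
x1, x2, x3 = sp.symbols('x1 x2 x3', real=True)
x = sp.Matrix([x1, x2, x3])
r = sp.sqrt(x1**2 + x2**2 + x3**2)   # r = |x|
f = r**3                              # f(x) = |x|^3

# Gradient, computed componentwise.
grad = sp.Matrix([sp.diff(f, v) for v in (x1, x2, x3)])

# Conjecture: grad f = 3 |x| x. The difference should simplify to zero.
print(sp.simplify(grad - 3*r*x))

# Hessian via the Jacobian of the gradient.
H = grad.jacobian(x)

# Conjecture: Hess f = 3|x| I + 3 x x^T / |x|. Again expect the zero matrix.
print(sp.simplify(H - (3*r*sp.eye(3) + 3*x*x.T/r)))
```

Both differences simplify to zero, so the componentwise mess does appear to have a compact form, at least for $n=3$.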
Also, looking ahead to part b), I'm not sure how I would use that formula to obtain the inverse of the Hessian, or how the inverse would then yield an explicit Newton iteration. Any input is much appreciated.
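For what it's worth, I did convince myself numerically that the given identity $(I+uu^{T})^{-1}=I-\frac{1}{2}uu^{T}$ holds for a unit vector $u$, so my confusion is only about how to apply it to the Hessian, not about the formula itself. A quick NumPy check (the random seed and $n=5$ are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
u = rng.standard_normal(n)
u /= np.linalg.norm(u)            # the identity requires u to be a unit vector

A = np.eye(n) + np.outer(u, u)    # I + u u^T
B = np.eye(n) - 0.5 * np.outer(u, u)   # claimed inverse

print(np.allclose(A @ B, np.eye(n)))   # True
```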