I can show that $\cos(\sin(x))$ is a contraction on $\mathbb{R}$, so by the Contraction Mapping Theorem it has a unique fixed point. But what is the process for finding this fixed point? This is in the context of metric spaces; I know that in numerical analysis it can be done trivially with fixed-point iteration. Is there a method of finding it analytically?
Fixed point of $\cos(\sin(x))$
-
3 Answers
The Jacobi-Anger expansion gives an expression for your function:
$\cos(\sin(x)) = J_0(1)+2 \sum_{n=1}^{\infty} J_{2n}(1) \cos(2nx)$.
Since the "harmonics" in the sum rapidly damp to zero, to second order the equation for the fixed point can be represented as:
$x= J_0(1) + 2[J_2(1)(\cos(2x)) + J_4(1)(\cos(4x))]$.
Using Wolfram Alpha to solve this I get $x \approx 0.7682$.
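As a sanity check, the truncated equation can also be solved with a few lines of Python. This is only a sketch: the `bessel_j` helper (a hand-rolled power series for $J_n(x)$) and the bisection loop are my additions, not part of the original answer.

```python
from math import cos, factorial

def bessel_j(n, x, terms=20):
    # Power series for the Bessel function of the first kind:
    # J_n(x) = sum_k (-1)^k / (k! (n+k)!) * (x/2)^(2k+n)
    return sum((-1) ** k / (factorial(k) * factorial(n + k)) * (x / 2) ** (2 * k + n)
               for k in range(terms))

def g(x):
    # Second-order truncation of the Jacobi-Anger expansion of cos(sin(x))
    return bessel_j(0, 1) + 2 * (bessel_j(2, 1) * cos(2 * x)
                                 + bessel_j(4, 1) * cos(4 * x))

# Solve x = g(x) by bisection on g(x) - x over [0, 1];
# g(0) - 0 > 0 and g(1) - 1 < 0, so the interval brackets a root.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if g(mid) - mid > 0:
        lo = mid
    else:
        hi = mid
print(mid)  # ≈ 0.768
```

The truncated root already agrees with the true fixed point to about four decimal places, which shows how quickly the $J_{2n}(1)$ coefficients decay.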
First off, it is clear that the fixed point $x$ will lie in $[0,1]$. From
$\cos{(\sin{(x)})}=x$, taking $\cos^{-1}$ of both sides gives
$\sin{(x)} = \cos^{-1}(x)$
Let $F(m) = \sin{(m)} -\cos^{-1}{(m)}$ $\Rightarrow$
$F'(m) = \cos{(m)} + \dfrac{1}{\sqrt{1-m^{2}}}$
Then you can also find the fixed point using Newton's method.
With $x_0 = 0$,
$x_{n+1} = x_n - \dfrac{F(x_n)}{F'(x_n)}$
Using MATLAB we find
$x_{4} = 0.768169156736796$, which is correct to 15 decimal places.
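The Newton iteration above can be sketched in Python (rather than MATLAB) as follows; the function names are mine, but the formulas are exactly $F$ and $F'$ from the answer:

```python
from math import sin, cos, sqrt, acos

def F(m):
    # F(m) = sin(m) - arccos(m); a root of F is the fixed point of cos(sin(x))
    return sin(m) - acos(m)

def dF(m):
    # F'(m) = cos(m) + 1/sqrt(1 - m^2)
    return cos(m) + 1 / sqrt(1 - m * m)

x = 0.0  # x_0 = 0
for n in range(4):
    x = x - F(x) / dF(x)  # Newton step
print(x)  # converges to 0.7681691567... in four steps
```

Quadratic convergence is visible here: the error roughly squares at each step, so four iterations suffice for machine precision.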
As far as solving analytically, I do not believe there is a way.
Newton's method proves superior to functional iteration in this case, because we observe faster convergence.
With $x_0 = 0$ and
$x_{n+1} = f(x_n)$, where $f(x) = \cos(\sin(x))$,
we get $x_{40} \approx 0.768169156736780$, which is not nearly as accurate an approximation despite ten times as many iterations.
Since it's a contraction, just iterate. Pick an arbitrary starting point and keep applying the operator until you achieve the desired accuracy.
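A minimal sketch of that iteration in Python, with an (assumed) stopping rule that halts once successive iterates agree to about $10^{-15}$:

```python
from math import sin, cos

x = 0.0  # any starting point works: cos(sin(x)) is a contraction on R
for _ in range(100):
    x_next = cos(sin(x))
    if abs(x_next - x) < 1e-15:
        break  # successive iterates agree; stop
    x = x_next
print(x)  # ≈ 0.768169...
```

The contraction constant near the fixed point is about $0.46$, so each step gains roughly one binary digit-third of accuracy and a few dozen iterations reach machine precision.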