From Wikipedia
$f(x) = O(g(x))$ if and only if there exist a positive real number $M$ and a real number $x_0$ such that $|f(x)| \le M|g(x)|$ for all $x > x_0$.
Also from Wikipedia
Suppose that the sequence $\{x_k\}$ converges to the number $L$. We say that this sequence converges linearly to $L$ if there exists a number $\mu \in (0, 1)$ such that $\lim_{k\to \infty} \frac{|x_{k+1}-L|}{|x_k-L|} = \mu$.
If the sequence converges, and
- $\mu = 0$, then the sequence is said to converge superlinearly.
- $\mu = 1$, then the sequence is said to converge sublinearly.
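To get a feel for these three cases, here is a quick numerical check I ran (the three sample sequences are my own illustrative choices, not from the quoted definitions), estimating $\mu$ as the ratio of successive errors:

```python
# Estimate mu = lim |x_{k+1} - L| / |x_k - L| for three toy sequences,
# all converging to L = 0 (illustrative examples of my own).
L = 0.0

def ratio(seq, k):
    """Ratio of successive errors |x_{k+1} - L| / |x_k - L|."""
    return abs(seq(k + 1) - L) / abs(seq(k) - L)

linear    = lambda k: 0.5 ** k          # error halves each step: mu = 0.5
superlin  = lambda k: 0.5 ** (2 ** k)   # quadratic convergence: mu -> 0
sublinear = lambda k: 1.0 / (k + 1)     # harmonic decay: mu -> 1

for name, seq in [("linear", linear), ("superlinear", superlin),
                  ("sublinear", sublinear)]:
    print(name, ratio(seq, 5))
# roughly: linear 0.5, superlinear ~2.3e-10, sublinear ~0.857 (6/7)
```

For the sublinear example the ratio only creeps toward $1$ as $k$ grows, which matches $\mu = 1$ in the limit.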
I was wondering:
- Is it true that $\{x_n\}$ converges to $L$ linearly, superlinearly, or sublinearly only if $|x_{n+1}-L| = O(|x_n-L|)$? This is based on my understanding of the definitions, viewing $\{ x_{n+1}-L \}$ and $\{ x_n-L \}$ as functions of $n$. Note that "only if" here means the converse ("if") may fail, since $\mu$ may lie outside $[0,1]$, or $\{x_n\}$ may not converge at all.
- Some optimization books say that the steepest descent algorithm has a linear rate of convergence, and write $|x_{n+1}-L| = O(|x_n-L|)$. Does this usage of big O notation broaden the meaning of linear rate of convergence?
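To make the reasoning behind my first question explicit: if the limit defining $\mu$ exists and is finite, then the ratio of successive errors is eventually bounded, which is exactly the big-O statement (a sketch, assuming $x_n \neq L$ for all large $n$):

```latex
% Sketch: a finite limit mu for the error ratio yields the big-O bound.
\[
  \lim_{n\to\infty} \frac{|x_{n+1}-L|}{|x_n-L|} = \mu < \infty
  \;\Longrightarrow\;
  \exists N \text{ such that } \frac{|x_{n+1}-L|}{|x_n-L|} \le \mu + 1
  \text{ for all } n > N,
\]
\[
  \text{so } |x_{n+1}-L| \le (\mu + 1)\,|x_n-L| \text{ for all } n > N,
  \text{ i.e. } |x_{n+1}-L| = O(|x_n-L|)
  \text{ with } M = \mu + 1 \text{ and } x_0 = N.
\]
```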
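Regarding the steepest descent claim, here is a toy experiment I tried (a one-dimensional quadratic of my own choosing, not from the book): with a fixed step size $\alpha$ on $f(x) = \tfrac{a}{2}x^2$, the error contracts by the constant factor $|1 - \alpha a|$ at every step, which is linear convergence in the $\mu$ sense above.

```python
# Steepest descent on the toy quadratic f(x) = (a/2) x^2, minimized at
# L = 0 (my own example). The update x_{k+1} = x_k - alpha * f'(x_k)
# gives |x_{k+1} - L| = |1 - alpha*a| * |x_k - L|, so mu = |1 - alpha*a|.
a, alpha, L = 4.0, 0.1, 0.0

x = 1.0
ratios = []
for _ in range(10):
    x_new = x - alpha * (a * x)   # gradient step: f'(x) = a*x
    ratios.append(abs(x_new - L) / abs(x - L))
    x = x_new

print(ratios)  # each ratio is |1 - alpha*a| = 0.6, up to rounding
```

So at least in this toy case the book's $|x_{n+1}-L| = O(|x_n-L|)$ holds with $M = 0.6$, though the big-O statement by itself is weaker than asserting $\mu \in (0,1)$.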
Thanks and regards!