
Consider a sequence $u_k$ which satisfies

$$ u_{k+1} \leq \left(1-\frac{1}{\sqrt{k}}\right) u_k + \frac{1}{k} $$

At what rate does this sequence decay to zero?

This is not a homework problem. I have worked out a proof which seems to show that the sequence decays to zero asymptotically faster than $1/k^s$ for every exponent $s$. I suspect the proof is wrong, and in any case I would like to know the exact rate of decay.
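For what it's worth, a quick numerical experiment (a sketch: it iterates the extremal case, the inequality taken with equality, from the arbitrary starting value $u_1 = 1$) suggests what rate to expect:

```python
import math

# Extremal case of the recurrence: u_{k+1} = (1 - 1/sqrt(k)) * u_k + 1/k,
# i.e. the inequality taken with equality.  The start u_1 = 1 is an
# arbitrary choice for illustration.
u = 1.0
N = 1_000_000
for k in range(1, N):
    u = (1 - 1 / math.sqrt(k)) * u + 1 / k

# sqrt(N) * u_N appears to settle near a constant, while N * u_N keeps
# growing, suggesting decay on the order of 1/sqrt(k) and not faster.
print(math.sqrt(N) * u, N * u)
```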

  • Why don't you share your proof? That way, we can look through it to find any inaccuracies. Also, it would probably be best to replace the tag (high-school) with (sequences-and-series). – 2011-04-06
  • Your proof must be wrong, since $u_k = 1/k$ satisfies the inequality. – 2011-04-06

1 Answer


The sequence $u_k = \frac{1}{\sqrt{k}}$ satisfies the inequality: the right-hand side is $\left(1-\frac{1}{\sqrt{k}}\right)\frac{1}{\sqrt{k}} + \frac{1}{k} = \frac{1}{\sqrt{k}} - \frac{1}{k} + \frac{1}{k} = \frac{1}{\sqrt{k}}$, so the inequality reduces to $\frac{1}{\sqrt{k+1}} \le \frac{1}{\sqrt{k}}$, which holds. Hence the decay can be as slow as $1/\sqrt{k}$.
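A quick numerical check of the algebra here (plain Python; nothing is assumed beyond the recurrence itself): the right-hand side collapses exactly to $\frac{1}{\sqrt{k}}$, so the inequality reduces to $\frac{1}{\sqrt{k+1}} \le \frac{1}{\sqrt{k}}$.

```python
import math

# With u_k = 1/sqrt(k), the right-hand side of the recurrence simplifies:
# (1 - 1/sqrt(k)) * (1/sqrt(k)) + 1/k = 1/sqrt(k) - 1/k + 1/k = 1/sqrt(k),
# so u_{k+1} = 1/sqrt(k+1) <= RHS is immediate.
for k in range(2, 1000):
    rhs = (1 - 1 / math.sqrt(k)) * (1 / math.sqrt(k)) + 1 / k
    assert math.isclose(rhs, 1 / math.sqrt(k))
    assert 1 / math.sqrt(k + 1) <= rhs
```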

Conversely, for any $s \lt 1/2$ the inequality fails asymptotically for $u_k = \frac{1}{k^s}$: the right-hand side equals $\frac{1}{k^s} - \frac{1}{k^{s+1/2}} + \frac{1}{k}$, and for $s \lt 1/2$ the subtracted term $\frac{1}{k^{s+1/2}}$ eventually dominates both $\frac{1}{k}$ and the one-step decrease $\frac{1}{k^s} - \frac{1}{(k+1)^s} \sim \frac{s}{k^{s+1}}$, so the right-hand side drops below $\frac{1}{(k+1)^s}$. So no power sequence decaying more slowly than $\frac{1}{\sqrt{k}}$ can satisfy the inequality.
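A numerical spot check of this failure (the exponent $s = 0.3 \lt 1/2$ is an arbitrary choice for illustration):

```python
import math

# With u_k = k**(-s) and s < 1/2, the candidate next term (k+1)**(-s)
# exceeds the allowed right-hand side (1 - 1/sqrt(k)) * k**(-s) + 1/k,
# so the inequality fails for all moderately large k.
s = 0.3
for k in range(10, 10001):
    rhs = (1 - 1 / math.sqrt(k)) * k ** (-s) + 1 / k
    assert (k + 1) ** (-s) > rhs
```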