2

Apologies if this has already been asked and answered, I could not find anything explicit on it.

I'm trying to find the range of $\alpha$ such that $$ \lim_{n\to\infty} \frac{k}{n^\alpha} = 0 $$

with $k>0$.

Now obviously $ \alpha \ge 1 $ is fine, but intuitively I believe $\alpha > 0$ is sufficient. Is this correct, and if so, why?

I have considered thinking of it as a sequence so

$$x_n = \frac{k}{n^\alpha}$$

Then, with proposed limit $L = 0$: for each $\epsilon > 0$ I need an $N$ such that, for all $n > N$,

$$ \vert x_n - L \vert < \epsilon $$ $$ \therefore \left\vert \frac{k}{n^\alpha} - 0 \right\vert < \epsilon $$ $$ \therefore \frac{k}{n^\alpha} < \epsilon $$

But I can get no closer to a definitive answer.
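For what it's worth, here is a quick numerical sanity check (a minimal Python sketch; the values of $k$, $\alpha$, and $\epsilon$ are arbitrary choices, and this illustrates the definition rather than proving anything):

```python
# Numerical sanity check (not a proof): for arbitrary k > 0 and
# alpha > 0, every term with index beyond N = (k/epsilon)**(1/alpha)
# satisfies k / n**alpha < epsilon.
k, alpha, epsilon = 5.0, 0.5, 0.01

# Solve k / N**alpha = epsilon for the threshold index N.
N = (k / epsilon) ** (1 / alpha)

for n in range(int(N) + 1, int(N) + 1000):
    assert k / n ** alpha < epsilon
```

Solving $k/N^\alpha = \epsilon$ gives $N = (k/\epsilon)^{1/\alpha}$, which is finite for every $\epsilon > 0$ precisely because $\alpha > 0$.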

Thanks in advance to whoever can help here!

  • 2
    Consider for example $\alpha=1/2$: for every $\epsilon>0$ there exists an $N$ such that $N>\frac1{\epsilon^2}$. It follows that $\sqrt N>\frac1\epsilon$, and by monotonicity of the square root, $\sqrt n>\frac1\epsilon$, i.e. $\frac1{\sqrt n}<\epsilon$, whenever $n>N$. You can extend the basic idea of this to any $\alpha>0$; what you need is the monotonicity of the function $x\mapsto x^{1/\alpha}$. – 2017-01-23
  • 0
    Thanks, so does that mean I am correct that $\alpha > 0$ is the required range? – 2017-01-23
  • 0
    The argument extends to show that for any $\alpha>0$ the sequence converges to $0$. For $\alpha<0$ it diverges (unless $k=0$). Can you see why it should be obvious in that case? And what happens if $\alpha=0$? – 2017-01-23
  • 0
    Yes, thanks. I just wanted someone to confirm it explicitly. The square-root example is exactly what I was thinking of, and I even plotted it in Maxima to confirm I wasn't off my head! $\alpha \le 0$ is trivial! – 2017-01-23
  • 0
    $\alpha=0$ also results in a convergent sequence :) – 2017-01-23
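The first comment's construction for $\alpha = 1/2$ can be checked numerically; a minimal Python sketch (with $k$ taken as $1$, as in the comment, and an arbitrary $\epsilon$):

```python
import math

# Numerically check the comment's recipe for alpha = 1/2 (with k = 1):
# pick any N > 1/epsilon**2; then sqrt(n) > sqrt(N) > 1/epsilon for
# every n > N, i.e. 1/sqrt(n) < epsilon.
epsilon = 0.005
N = math.ceil(1 / epsilon ** 2) + 1  # any integer N > 1/epsilon**2 works

for n in range(N + 1, N + 500):
    assert 1 / math.sqrt(n) < epsilon
```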

2 Answers

1

Suppose $\alpha > 0$. Note that $\frac{k}{n^\alpha} = \exp \left(\log \frac{k}{n^\alpha}\right) = \exp(\log k - \alpha \log n)$.

Now, if $n$ increases to infinity, then $\log n$ also increases to infinity. So the whole thing hinges on $\alpha$.

If $\alpha$ is positive, then $\log k - \alpha \log n$ keeps decreasing and goes to $-\infty$ (roughly; you need to make this rigorous), and $e^{-\infty} = 0$ (again, roughly), so the limit is zero.

On the other hand, if $\alpha$ is negative, then $\log k - \alpha \log n$ keeps increasing and goes to $+\infty$, and $e^{+\infty} = +\infty$, so the sequence diverges.

Of course, if $\alpha = 0$, then the limit is $k$.

This answer is not rigorous; rather, it aims to point out the power of rewriting the given expression in a form where it is almost immediately obvious that the values of $\alpha$ for which convergence occurs are exactly the non-negative ones.

(By "roughly" I mean that the statement conveys the right intuition and can be backed by a rigorous proof, but as written it is purely indicative and in fact incorrect as a formal mathematical statement.)
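The rewrite can also be checked numerically; a minimal Python sketch (the values of $k$ and $\alpha$ below are arbitrary illustrations, not from the question):

```python
import math

# Check the identity k / n**alpha == exp(log k - alpha * log n)
# numerically, and watch the terms shrink toward 0 as n grows.
k, alpha = 3.0, 0.7

prev = float("inf")
for n in (10, 100, 1000, 10**6):
    direct = k / n ** alpha
    via_exp = math.exp(math.log(k) - alpha * math.log(n))
    assert math.isclose(direct, via_exp, rel_tol=1e-12)
    assert direct < prev  # the terms are strictly decreasing
    prev = direct
```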

0

Let $f(x)=k/x^\alpha.$ Note that if we put $u=1/x,$ then as $x \to +\infty$ we have $u \to 0^+,$ and in terms of $u$ we have $f=ku^\alpha.$ Since for $\alpha>0$ the latter function is continuous from the right at $0,$ your claim about the limit of $f$ at infinity is correct.

Some of this conclusion depends on verifying that the limits may be substituted in as suggested, and on showing that $x^\alpha$ is continuous from the right at $0.$
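A minimal Python sketch of the substitution (the values of $k$ and $\alpha$ here are arbitrary positive choices, not from the question):

```python
import math

# Check the substitution u = 1/x from the answer: k / x**alpha equals
# k * u**alpha, and as x grows, u -> 0+ while the terms shrink toward
# k * 0**alpha = 0.
k, alpha = 2.0, 0.3

for x in (10.0, 1e4, 1e8, 1e12):
    u = 1.0 / x
    assert math.isclose(k / x ** alpha, k * u ** alpha, rel_tol=1e-12)

# The limiting value: u**alpha is continuous from the right at 0.
assert 0.0 ** alpha == 0.0
```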