
I got stuck on a problem that popped into my mind while learning limits. I am still a high school student.

Define $P(m)$ to be the statement: $\quad \lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{m})=0$

The statement holds for $m = 1$: $\quad \lim\limits_{n\to\infty}\frac{1}{n}=0$.

Assume that $P(k)$ holds for some $k$; that is, with $m = k$: $\quad \lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{k})=0$.

We prove $P(k + 1)$: $\quad \lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{k+1}) =\lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{k}+\frac{1}{n})$

$=\lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{k}) +\lim\limits_{n\to\infty}\frac{1}{n}$

$=0+0=0$.

By mathematical induction, the statement holds for all natural numbers $m$.

If we let $m=n$, then $\lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{n})=0 \tag{*}$.

However, $\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{n}=1 \implies \lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{n})=1 \tag{$\dagger$}$.

Then $(*) \, \& \, (\dagger)$ yield $1=0$?

Can anybody explain this? Thanks.

  • Let me take the opportunity to thank you for assuming that something is wrong with your argument, as opposed to the nutcase assumption that you've managed to shatter the very foundations of mathematics in half a page. – 2011-08-26

2 Answers


In the substitution $m = n$, the variable $n$ is free in the term $n$ being substituted in, but it becomes bound by the limit operator $\lim\limits_{n\to\infty}$ when it is placed inside

$\lim\limits_{n\to\infty}(\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{m})=0$

so the substitution is not logically valid.
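
For a programming analogy, here is a minimal Python sketch (the function `P` and the cutoff `N` are illustrative stand-ins, since a program cannot take a true limit): the limit variable behaves like a local variable bound inside a function, and substituting $m = n$ amounts to letting the argument track that bound variable.

```python
# P(m) behaves like a function whose body binds its own n, just as
# lim_{n -> oo} binds n in the limit expression. N is a large stand-in
# for "n -> infinity".
def P(m, N=10**8):
    n = N                                # n is bound *inside* P
    return sum(1 / n for _ in range(m))  # m copies of 1/n, i.e. m/n

# For each fixed m, the value is small once N is large (P(m) -> 0):
print(P(1), P(5), P(100))      # roughly 1e-08, 5e-08, 1e-06

# "Substituting m = n" ties the outer argument to the bound variable.
# The closest legal move is to pass the same number for both; then the
# sum is n/n = 1 no matter how large that number is:
print(P(10**6, N=10**6))       # ~1.0, up to floating-point rounding
```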

  • No, it's supposed to be an 'm'. The (invalid) substitution is replacing all free (which in this case is just all) instances of 'm' with 'n'. – 2011-08-26

The problem is that the statement you proved was for fixed $m$, and then you let it vary.

What follows is one way of looking at this problem. Rewrite the sum as $\underbrace{\frac{1}{n}+\frac{1}{n}+\cdots+\frac{1}{n}}_{m}=\frac{m}{n}.$

Then what you proved by induction is that, for any fixed $m$,

$\lim\limits_{n\rightarrow \infty}\frac{m}{n}=\left(\lim\limits_{n\rightarrow \infty}m\right)\cdot\left(\lim\limits_{n\rightarrow \infty}\frac{1}{n}\right)=m\cdot 0=0.$

This is fine, since we may split a limit of a product like this whenever both limits exist. However, in the second deduction you try to do the same thing:

$\lim\limits_{n\rightarrow \infty}\frac{n}{n}=\left(\lim\limits_{n\rightarrow \infty}n\right)\cdot\left(\lim\limits_{n\rightarrow \infty}\frac{1}{n}\right)=n\cdot 0=0.$

This no longer makes sense, because $n$ is not fixed, and the first limit does not exist. (We only have the multiplicative property when both limits exist.)
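
As a quick numerical sanity check, here is a minimal Python sketch (the values are illustrative only) comparing $m/n$ for a fixed $m$ with $n/n$:

```python
# m/n for a fixed m shrinks as n grows; n/n is pinned at 1 for every n.
m = 7  # any fixed natural number
for n in [10, 10**3, 10**6]:
    print(f"n={n}: m/n = {m/n:.1e}, n/n = {n/n}")
# Splitting lim n/n into (lim n) * (lim 1/n) fails because lim n does not
# exist, so the product rule for limits does not apply.
```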

Hope that helps,

  • Of course the last line is wrong; that is the point. What I am trying to get across is that treating $n$ as if it were a fixed integer $m$ is not allowed. One way to understand this is by looking at the limits as presented above. – 2011-08-26