4

We have the well-known formula

$$\frac{n (n + 1) (2 n + 1)}{6} = 1^2 + 2^2 + \cdots + n^2 .$$

If the difference between consecutive terms is made smaller, we obtain, for example,

$$\frac{n (n + 0.1) (2 n + 0.1)}{6 \cdot 0.1} = 0.1^2 + 0.2^2 + \cdots + n^2 .$$
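For instance, here is a quick numerical check of this identity (a minimal sketch in Python; the choice $n = 3$ is arbitrary):

```python
# Check the step-0.1 identity: n(n + 0.1)(2n + 0.1) / (6 * 0.1) = 0.1^2 + 0.2^2 + ... + n^2
n = 3
lhs = n * (n + 0.1) * (2 * n + 0.1) / (6 * 0.1)
rhs = sum((0.1 * i) ** 2 for i in range(1, 10 * n + 1))
print(lhs, rhs)  # both are about 94.55 (equal up to floating-point rounding)
```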

This is easy to check, as above. Now if the difference between consecutive terms is made as small as possible, we obtain

$$\frac{n (n + 0.0\ldots1) (2 n + 0.0\ldots1)}{6 \cdot 0.0\ldots1} = 0.0\ldots1^2 + 0.0\ldots2^2 + \cdots + n^2$$

So we can conclude that

$$\frac{2n^3}{6} = \frac{n^3}{3} = \frac{0.0\ldots1^2 + 0.0\ldots2^2 + \cdots + n^2}{0.0\ldots1}.$$

Is this conclusion correct?

  • 2
    We call the last expression the integral of $n^2$. (2011-12-19)
  • 0
    On the RHS of your last equation you should have multiplied by your small number rather than divided. (2011-12-19)
  • 0
    I don't understand the 2nd formula. Take $n=1$; are you claiming $(1)(1.1)(2.1)/(.6)=(.1)^2+(.2)^2+\cdots+1^2$? Is that true? (2011-12-19)
  • 1
    @Gerry Myerson It turns out it is: ${1\times 1.1\times 2.1\over 0.6}=3.85$, and so is $0.1^2+0.2^2+\cdots+1^2$. (2011-12-19)

1 Answer

6

Suppose we want to calculate the sum of squares with successive differences $\epsilon$, from $0$ to some fixed $n$ (we require $\frac{n}{\epsilon}\in\mathbb{N}$ for this particular calculation; for the general formulation of integrals and Riemann sums this is not required), that is

$$S_\epsilon = \sum_{i=0}^{\frac{n}{\epsilon}}(i\epsilon)^2$$

Letting $m = \frac{n}{\epsilon}$, this sum is equivalent to

$$\sum_{i=0}^{m}i^{2}\left(\frac{n}{m}\right)^2$$

which we can write as

$$=\left(\frac{n^2}{m^2}\right)\sum_{i=0}^{m}i^2=\left(\frac{n^2}{m^2}\right)\frac{m(m+1)(2m+1)}{6} $$

$$= \frac{\left(\frac{n}{m}\right)m\left(\frac{n}{m}\right)(m+1)\left(\frac{n}{m}\right)(2m+1)}{6\left(\frac{n}{m}\right)}=\frac{n(n+\epsilon)(2n+\epsilon)}{6\epsilon}$$
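As a sanity check, the closed form can be compared with the direct sum for concrete values (a minimal sketch in Python; $n$ and $\epsilon$ are chosen arbitrarily, subject to $\frac{n}{\epsilon}\in\mathbb{N}$):

```python
# Compare the direct sum S_eps with the closed form n(n + eps)(2n + eps) / (6 eps).
n, eps = 2.0, 0.01           # arbitrary, but n/eps must be a whole number
m = round(n / eps)           # m = n / eps
direct = sum((i * eps) ** 2 for i in range(m + 1))
closed = n * (n + eps) * (2 * n + eps) / (6 * eps)
print(direct, closed)        # both are about 268.67 (equal up to floating-point rounding)
```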

Taking the limit $\epsilon\rightarrow 0$ is equivalent to $m\rightarrow\infty$, and $S_{\epsilon\rightarrow 0}$ is easily seen to diverge to $+\infty$. However, $S_{\epsilon\rightarrow 0}\cdot \epsilon$ is convergent (it is easily evaluated by simply substituting $\epsilon = 0$, which we can do by the continuity of the expression) and is the quantity of interest here. In particular, we can write

$$S = S_{\epsilon\rightarrow 0}\cdot\epsilon=\lim_{m\rightarrow\infty}\ \sum_{i=0}^{m}\left(i\frac{n}{m}\right)^2\left(\frac{n}{m}\right)$$

We recognize this as the Riemann sum that defines the integral

$$\lim_{m\rightarrow\infty}\ \sum_{i=0}^{m}\left[f\left(x_0 + i\frac{n}{m}\right)\frac{n}{m}\right]=\int_{x_0}^{x_0 + n}f(x)\ dx$$

for $f(x) = x^2$ and $x_0 = 0$. (Up to the single extra term at $i = m$, which vanishes in the limit, this is the left Riemann sum.) By the Fundamental Theorem of Calculus,

$$\int_{0}^{n}x^2\ dx = \frac{x^3}{3}\bigg|_{0}^{n} = \frac{n^3}{3}$$

which is exactly the quantity you cite.
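For what it is worth, the convergence of $\epsilon\cdot S_\epsilon$ to $\frac{n^3}{3}$ is also easy to observe numerically (a rough sketch in Python; $n = 2$ is arbitrary):

```python
# Watch eps * S_eps approach n^3 / 3 = 8/3 as eps shrinks (n = 2 here).
n = 2.0
for eps in (0.1, 0.01, 0.001):
    m = round(n / eps)
    S = sum((i * eps) ** 2 for i in range(m + 1))
    print(eps, eps * S, n ** 3 / 3)  # eps * S tends toward 2.666...
```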

  • 0
    But if the OP knew calculus they probably would not have asked the question. (2011-12-19)
  • 0
    @Bill True enough. However, he is likely to encounter the subject soon, where this very problem might arise (I believe it is a common exercise when learning about Riemann sums), and so I write for future reference as well. This is why I laid everything out in a relatively fundamental manner. (2011-12-19)
  • 0
    +1. I would like to point out that the limit $\lim\limits_{\epsilon \to 0} \epsilon \cdot S_{\epsilon}$ is completely elementary and does not require going through Riemann sums. In fact, all we need is to plug $\epsilon = 0$ into the expression for $\epsilon \cdot S_{\epsilon}$ (since that function is continuous). (2011-12-19)
  • 0
    @Srivatsan You are of course right. But since the prior calculations are exactly the ones for computing the Riemann sum, it felt like a shame not to include it, especially when this question introduces the topic so well. (2011-12-19)
  • 0
    @EuYu I agree. I wrote my comment thinking that it might be useful to point out the easier way, *in addition to* the connection to Riemann integrals. :) (2011-12-19)