We have the well-known formula $\frac{(n + 1)\,n}{2} = 1 + 2 + \dots + n$.
If the difference between consecutive numbers is made smaller, we obtain, for example, $\frac{(n + 0.1)\,n}{2 \cdot 0.1} = 0.1 + 0.2 + \dots + n$.
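As a quick sanity check of the step-$0.1$ identity (taking $n = 2$ purely as an illustrative value), the sum on the right has $n/0.1 = 20$ terms, so
$$0.1 + 0.2 + \dots + 2 = \frac{20\,(0.1 + 2)}{2} = 21 = \frac{4.2}{0.2} = \frac{(2 + 0.1)\cdot 2}{2 \cdot 0.1}.$$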
Now, if the difference between consecutive numbers is the smallest possible, we obtain $\frac{(n + 0.0\dots1)\,n}{2 \cdot 0.0\dots1} = 0.0\dots1 + 0.0\dots2 + \dots + n$, so we can conclude that $\frac{n^2}{2} = \frac{0.0\dots1 + 0.0\dots2 + \dots + n}{0.0\dots1}$. Is this conclusion correct?
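For reference, if $h$ denotes the common difference between consecutive terms (so $h = 1$, $h = 0.1$, and $h = 0.0\dots1$ in the three cases above; $h$ is just a label introduced here), each identity is an instance of
$$\frac{(n + h)\,n}{2h} = h + 2h + \dots + n,$$
since the sum has $n/h$ terms with average value $\frac{h + n}{2}$.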