
A normal number is a number in which no digit string is favored to appear over any other of the same length. Does this definition imply that every whole number appears among its digits? Because the definition involves notions from probability, I was wondering whether it might happen that a certain number does not appear in the digits of a normal number without contradicting the definition.

1 Answer


Yes, it does. Working in base $B$, a string of $A$ digits must have natural density $ \frac {1}{B^A} $ by the definition of a normal number. If the string doesn't appear at all, then it has natural density 0, contradicting this.
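As a concrete check on this density claim, here is a minimal sketch (not from the original thread) that counts occurrences of a string in the first digits of the base-10 Champernowne constant 0.123456789101112..., a standard example of a normal number; the function names are my own:

```python
def champernowne_digits(n):
    """First n digits (after the decimal point) of the base-10
    Champernowne constant 0.123456789101112..., which is normal."""
    out, total, k = [], 0, 1
    while total < n:
        s = str(k)
        out.append(s)
        total += len(s)
        k += 1
    return "".join(out)[:n]

def empirical_density(s, n):
    """Fraction of the n digit positions starting an occurrence of s
    (overlapping occurrences counted), i.e. N(S)/N for finite N."""
    digits = champernowne_digits(n)
    hits = sum(1 for i in range(n - len(s) + 1)
               if digits[i:i + len(s)] == s)
    return hits / n

# For a string of A digits, this should tend to 10**-A as n grows;
# convergence for Champernowne's constant is slow, so expect the
# finite-n values to be only roughly 0.1 and 0.01 here.
print(empirical_density("7", 100000))
print(empirical_density("42", 100000))
```

At a finite cutoff the estimates are biased (leading digits of the concatenated integers are not uniform), but they already sit near the predicted $10^{-A}$.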

  • Looking at the first $N$ digits, let $N(S)$ be the number of times that the string $S$ appears. Then the definition of normal means that $\lim_{N \rightarrow \infty} \frac {N(S)}{N} = \frac {1}{B^A}$, which is the __natural density__. In fact, you can conclude that the string appears infinitely often, since otherwise the density would be 0. (2012-12-29)
  • So is it possible to produce a non-normal number in which all natural numbers appear anyway, or are these notions equivalent? (2012-12-29)
  • I believe so, but may be wrong. Take a normal number, and insert $ 2^{n-1}$ 0's after the $2^n$-th digit for $n\geq 1$. For example, take 0.123456789101112... and transform it into 0.12 **0** 34 **00** 5678 **0000** 91011121 **00000000** ... Then for any given string $S$ of length $A$ which isn't a string of 0's, $\lim_{N \rightarrow \infty} \frac {N(S)} {N} = \frac {1}{2 \times 10^A}$ (which requires a short argument), while the string 0 occurs with natural density $ \frac {1}{2} $. (2012-12-29)
  • Like the number 0.10203040506070809010011012..., the concatenation of the natural numbers with a 0 inserted between consecutive ones. What's the density of 0 there? Is 0 already denser, or is this still normal? (2012-12-29)
  • Hey Calvin, if you did that construction, wouldn't you end up cutting some of the numbers? 13, for example, got cut by the zeros you inserted. (2012-12-29)
  • I believe that in your construction, 0 still appears with density $ \frac {1}{10}$. Even though you add a lot of 0's at the start, you will eventually be adding very few 0's once the concatenated numbers have many digits, so this doesn't change the density. For example, it appears to have a density of (natural density + introduced density) $\frac {1}{10} + \frac {1}{2} $ initially, but when the numbers have 100 digits, it has density of about (natural density + introduced density) $\frac {1}{10} + \frac {1}{101}$. Taking the limit, it has density $\frac {1}{10}$. (2012-12-29)
  • Yes, but that's fine, because we know that 13 occurs with density $\frac {1}{100}$, and the chance that any given occurrence is cut is extremely low: since we're only cutting once every $2^n$ digits, it won't always be cut at 113, 213, 313, 413, 513, etc. So the number of times it appears will be off by a linear amount (in $n$), but we're dividing by an exponential amount ($2^n$ digits), so the density stays the same. In the first 1000000 digits, 13 appears 10000 times, and can be cut at most $20 \approx \log_2 1000000$ times. (2012-12-29)
  • Would this work equally well: what if I insert, after each number, a run of 0's matching its length in digits: 0.1020304050607080901000110012001300... Now no number is cut off. But would 0 then have density 1/2, or at least more than 1/10? (2012-12-29)
  • That should work, if all you want is for each string to appear once. But since you talked about all natural numbers, I was looking at the density of every string. For example, it is possible that the number of times 12 appears is greatly changed, since we're inserting 0's in many, many places, so the initial 12, or the 12 inside 2122, etc., would no longer contribute to the count, and this could make the natural density 0. My argument was that the difference is linear / exponential, which tends to 0, hence the density is $ \frac {1}{2 \times 10^A}$. (2012-12-29)
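The zero-insertion construction discussed in the comments can be checked numerically. Below is a minimal sketch (my own code, not from the thread), again using the Champernowne constant as the normal number and the insertion rule illustrated by 0.12 **0** 34 **00** 5678 **0000** ...; at a finite cutoff the zero fraction fluctuates around 1/2 depending on where truncation falls between powers of 2:

```python
def champernowne_digits(n):
    """First n digits of the base-10 Champernowne constant
    0.123456789101112..., used here as a stand-in normal number."""
    out, total, k = [], 0, 1
    while total < n:
        s = str(k)
        out.append(s)
        total += len(s)
        k += 1
    return "".join(out)[:n]

def insert_zero_runs(digits):
    """Insert a run of 2**(n-1) zeros right after the 2**n-th
    original digit, matching the construction in the comments."""
    out, n = [], 1
    for i, d in enumerate(digits, start=1):
        out.append(d)
        if i == 2 ** n:
            out.append("0" * 2 ** (n - 1))
            n += 1
    return "".join(out)

x = insert_zero_runs(champernowne_digits(100000))
f0 = x.count("0") / len(x)  # pushed toward 1/2 by the inserted runs
f7 = x.count("7") / len(x)  # roughly halved relative to the original
print(f0, f7)
```

The single digit 7 lands near $\frac{1}{2 \times 10} = 0.05$, while 0 dominates, consistent with the claimed densities: the result is certainly not normal, yet every natural number still appears in the undisturbed stretches.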