47

After learning about the binary number system (only 2 symbols, i.e. 0 and 1), I wondered: why did we adopt the decimal number system (10 symbols) at all?

I mean, if you think about it, isn't it rather inefficient compared to octal (8 symbols) or hexadecimal (16 symbols)?

  • 0
    It wouldn't be that difficult to imagine counting up to 10 and then imagining 2 more, or perhaps using our fist (thumb, index, middle, ring, pinky, close hand) × 2. 2017-12-28

8 Answers

34

Expanding on the comment by J.M., let me quote from the (highly recommended) book by Georges Ifrah, The Universal History of Numbers (Wiley, 2000, pp. 21-22):

Traces of the anthropomorphic origin of counting systems can be found in many languages. In the Ali language (Central Africa), for example, "five" and "ten" are respectively moro and mbouna: moro is actually the word for "hand" and mbouna is a contraction of moro ("five") and bouna, meaning "two" (thus "ten"="two hands").

It is therefore very probable that the Indo-European, Semitic and Mongolian words for the first ten numbers derive from expressions related to finger-counting. But this is an unverifiable hypothesis, since the original meanings of the names of the numbers have been lost.

Ifrah then goes on to explain that

...the hand makes the two complementary aspects of integers entirely intuitive. It serves as an instrument permitting natural movement between cardinal and ordinal numbering. If you need to show that a set contains three, four, seven or ten elements, you raise or bend simultaneously three, four, seven or ten fingers, using your hand as cardinal mapping. If you want to count out the same things, then you bend or raise three, four, seven or ten fingers in succession, using the hand as an ordinal counting tool.

  • 0
    But there isn't really an advantage to using a base 10 system over, say, base 16, right? Because, yes, up to 10 (decimal) you can count on your fingers, but that's not a property that is bound to the numeral system... 2017-01-27
28

I think the answer here might be that the guys who thought base 10 was a good idea had the largest sticks.

If one trusts Wikipedia, the Babylonians had a base 60 system, which can still be felt today in this "60 minutes in an hour" nonsense, and a (related) base 12 system was widely in use too. There are still unique words for "eleven" and "twelve", as well as expressions like "a dozen". After all, you can count to twelve using a single hand: use the thumb to tick off the three segments of each of the other four fingers.

Then there was the base 1 Latin tally system, and (Wikipedia again) a base 20 system for the Maya.

Something as easy as "base 10 is natural for humans" does not explain it all. =)

  • 0
    60 is a superior highly composite number, which is why it's so useful. 2019-05-20
13

Because it makes the metric system so much simpler :).

  • 0
    This should be flagged as nonsense, but it's just too funny. 2016-02-27
11

I don't believe you understand the notion of efficiency in terms of encoding. Informally speaking, you have to keep in mind that there are two factors involved: (i) the cost of having different symbols (in the case of base 10 there are 10 different symbols, in the case of base 16 there are 16 different symbols, etc.) and (ii) the length of the resulting string needed to encode a particular number.

When you consider both factors and apply some basic information theory, the answer may look a bit surprising: the most efficient encoding has base $e$ (yes, that very $e = 2.718\dots$). Since we'd rather have a natural number as the base, the best we can get is base 3, and the next best is base 2.
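
To see where $e$ comes from: under this model the cost of writing $N$ in base $b$ is roughly $b \cdot \log_b N = \frac{b}{\ln b}\ln N$, and $b/\ln b$ is minimized at $b = e$. A minimal Python sketch (my illustration, not part of the original answer) comparing integer bases:

    import math

    def cost(base, n=10**6):
        # Count the digit positions of n in the given base...
        digits = 0
        while n:
            n //= base
            digits += 1
        # ...and weight them by the size of the symbol alphabet.
        return base * digits

    for b in (2, 3, 4, 8, 10, 16):
        print(b, cost(b))          # base 3 wins with 39; bases 2 and 4 tie at 40

    # The continuous cost factor b / ln(b) is minimized exactly at b = e:
    for b in (2, 3, 10):
        print(b, b / math.log(b))  # 2.885..., 2.731..., 4.343...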

So why, then, do computers use base 2 (0 and 1) rather than base 3 (say, -1, 0, and 1)? The answer is that it is simpler to design circuits that distinguish between two (rather than three) states. (I do remember reading that some of the earliest computers did use base 3, but I can't recall all the details.)
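
As an aside, digits -1, 0, 1 give what's called balanced ternary, the representation the Soviet Setun machine used. A short self-contained Python sketch of my own, in case the encoding seems exotic:

    def to_balanced_ternary(n):
        """Digits of n in balanced ternary (-1, 0, 1), most significant first."""
        digits = []
        while n != 0:
            r = n % 3       # Python's % is nonnegative, so negative n works too
            n //= 3
            if r == 2:      # fold remainder 2 into digit -1 and carry 1
                r = -1
                n += 1
            digits.append(r)
        return digits[::-1] or [0]

    print(to_balanced_ternary(5))   # [1, -1, -1], since 9 - 3 - 1 = 5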

Now, with respect to octal and hex: those are simply convenient ways to write down binary strings. If you've done some machine-level debugging, you've probably had a chance to read what's known as a "hexadecimal dump" (the contents of memory). Surely it's easier to read than if it were written as a binary dump. But what's lurking underneath it is base 2.
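
The shorthand works because $16 = 2^4$ and $8 = 2^3$, so each hex digit stands for exactly four bits and each octal digit for exactly three. A quick illustration (mine, not the answerer's):

    value = 0b1101_0111_0010   # twelve bits, grouped in fours for readability
    print(f"{value:x}")        # d72:  one hex digit per 4-bit group
    print(f"{value:o}")        # 6562: one octal digit per 3-bit group
    print(f"{value:b}")        # 110101110010: the same number in base 2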

(The question of why we use base 10 has been answered elsewhere.)

  • 0
    @Marc - Have you got a link to a longer discussion of this? Both the use of $\log(b)$ instead of $b$ and the final result of all radices being equal seem reasonable, but it would be nice to see more on it. 2013-09-08
5

It is believed that the decimal system evolved mainly due to anthropomorphic reasons (5 digits on each hand) and is thought to be a simplification of the Babylonian sexagesimal (base 60) counting method.

To make this analogy precise, note that the normal hand has 4 fingers (excluding the thumb), each with 3 segments, along with 5 digits on the other hand to be used as segment pointers. This gives 3 × 4 × 5 = 60 unique configurations.
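
A throwaway enumeration (my addition, just to sanity-check the arithmetic):

    from itertools import product

    segments = range(3)   # 3 segments per finger
    fingers  = range(4)   # 4 fingers, thumb excluded
    pointers = range(5)   # 5 digits on the other hand act as pointers
    print(len(list(product(segments, fingers, pointers))))   # 60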

  • 1
    @J.M. Well, if you define a finger as having 3 joints, then a thumb is not a finger. In fact, it is more rational to assume that the thumb is something else than to say that a finger sometimes has 2 joints and sometimes 3. 2013-06-25
2

I am tempted to answer "for the same reason this forum is in English", i.e. human convention for effective communication and calculation. However, there is another anthropomorphic aspect to this, in that there are advantages to a high base (compact encoding of numbers) and to a low base (a smaller number of addition/multiplication facts to learn, and fewer number symbols to recall and write distinctly without confusion); the sketch below makes the tradeoff concrete.
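
One crude way to tabulate that tradeoff in Python (my own illustration; "facts to learn" is modeled, simplistically, as the size of the one-digit multiplication table):

    def digits_in_base(n, b):
        """Length of n written in base b."""
        count = 0
        while n:
            n //= b
            count += 1
        return count

    N = 10**6
    for b in (2, 8, 10, 16, 60):
        print(f"base {b:2d}: {digits_in_base(N, b):2d} digits for {N}, "
              f"{b * b:4d} multiplication facts to memorize")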

Binary and binary-related computations are used in computing because it was technologically easier to encode '0' and '1' than to work with a base higher than 2, and computing conventions were created when computing resources and speed had to be optimised. The available length of a string then restricts the size of the numbers which can be stored or manipulated. Many of these resource constraints no longer exist in the same way (my computer has more capacity than I generally need).

So I think there is some form of rough optimisation with base 10: given the recall and ability of human beings, this was a good compromise. And we do not always use it when there is an advantage to be had in using another. And note that the octal and hexadecimal representations within computing are the ones closest to base 10 ....

0

Because these ancient folks didn't fully foresee the glory of modern computer technology. Otherwise they would have chosen a base more compatible with computers' binary number system: 8!

Generations of computer science students would have had it so much easier, and everything would be much better:

A byte would have 10 bits (with 2^10 possible values), with the figures written in octal, of course

Computer technology would have evolved from

And we would not need funny things like mebibyte:

  • 1 kilobyte = 1000 bytes (not 2000 as it is now :)
  • 1 MB = 1,000,000 bytes
  • 1 GB = 1,000,000,000 bytes
  • 1 TB = 1,000,000,000,000 bytes

OK, for one terabyte you would only get 7% of the capacity compared to our current system, but who cares.
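
That figure checks out; a one-liner's worth of Python (my arithmetic, comparing an octal-prefix terabyte against the SI decimal one):

    octal_tb   = 8**12    # "1,000,000,000,000" bytes, with the string read in base 8
    decimal_tb = 10**12   # one terabyte under SI prefixes
    print(octal_tb / decimal_tb)   # 0.0687..., i.e. roughly 7%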

  • 0
    The point I wanted to make is: many programmers would have had a much easier job if they didn't have to convert decimal values into hexadecimal. 2015-11-17
0

The reason is history and tradition. The decimal system is a convention that was adopted long ago and is so widespread and entrenched that it would be enormously difficult to change it for any other system, no matter how advantageous that system might be. This is not the only example: we have the Gregorian calendar (rather crude) and the British imperial system of units, which one could argue is "unnatural". Attempts have been made to adopt better systems, but as far as I know they have failed on account of the effort such a change would take.

  • 0
    In Germany, I know there were several different measures in use, for example the length of the lower arm of the tailor doing the measuring of cloth. In Britain I assume everybody used *their* own foot as a rough measure, which was standardized at some point when precision became a necessity. Same for the inch (in Spanish "pulgada", i.e. "length of the thumb") and its relation to the foot. 2016-02-27