
What is the origin of the (nearly obsolete) term "binary decimal"?

At least two important publications in the 1930s used this oxymoron to mean what is now ordinarily called a "base-2 (or binary) numeral":

  • A. M. Turing, "On Computable Numbers, with an Application to the Entscheidungsproblem", Proceedings of the London Mathematical Society, ser. 2, vol. 42 (1): 230–265, 1937:

    The real number whose expression as a binary decimal is obtained by prefacing this sequence by a decimal point is called the number computed by the machine. [p. 232]

  • G. H. Hardy and E. M. Wright, An Introduction to the Theory of Numbers (1st ed., 1938), Oxford: Clarendon Press:

    Any positive integer up to $2^n-1$ inclusive can be expressed uniquely as a binary decimal of $n$ figures, i.e., as a sum $\sum_0^{n-1} a_s 2^s$, where every $a_s$ is $0$ or $1$. [p. 115]

NB-1: Turing's paper contains the earliest usage of the term "binary decimal" that I've managed to find. In this paper, the frequent use of the term "decimal" — without the qualifier — in reference to base-2 (binary) notation easily misleads a modern reader into thinking it refers to base-ten notation. In fact, the opening paragraph states "According to my definition, a number is computable if its decimal can be written down by a machine", but not until the third page does one find that this "decimal" means binary decimal. This suggests that in Turing's day, there must already have been some established usage of this oxymoron, and very significant ambiguity in the term "decimal".
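
For concreteness (my own illustration, not taken from either source): in Turing's sense, a machine that prints the sequence $0\,1\,0\,1\,0\,1\,\ldots$ computes the real number whose binary decimal is $.010101\ldots$, that is,
$$\sum_{k=1}^{\infty} 2^{-2k} = \frac{1}{4} + \frac{1}{16} + \frac{1}{64} + \cdots = \frac{1}{3};$$
in Hardy & Wright's finite sense, $1101$ is the binary decimal of four figures representing $13 = 1\cdot 2^{3} + 1\cdot 2^{2} + 0\cdot 2^{1} + 1\cdot 2^{0}$.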

NB-2: I wrote nearly obsolete, because this remains the terminology in the 2008 edition of Hardy & Wright — which also still contains the original footnote "We ignore the verbal contradiction involved in the use of 'decimal'; there is no other convenient word."

  • Comment: I don't see why "binary decimal" is so terrible. The sense in which one could call it a verbal contradiction is evident, but at the same time the meaning is perfectly clear. (2011-12-05)

1 Answer


There is no other convenient word. A numeral, in English, means a representation of an integer. A decimal, in English, means a number with some digits after the decimal point.

It is far easier to abstract away the ten-ness of "decimal" when necessary than to invent a hitherto unused specialist term and expect readers to understand it easily.

You might compare the fuss over whether it is ever possible to have more than two "alternatives", given that "alter" in Latin implies either-or-ness.