Using scientific notation:
$3.14 = 0.314 \times 10^1$
From Tanenbaum's *Structured Computer Organization*, section B.1:

> The range is effectively determined by the number of digits in the exponent, and the precision is determined by the number of digits in the fraction.
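To show how I currently read that sentence, here is a small sketch in Python (my own illustration; the choice of 3 fraction digits and a 2-digit exponent is an assumption for the example, not something from the book):

```python
# My own illustration (not from the book): a toy decimal format with
# 3 fraction digits and a 2-digit exponent, i.e. numbers of the form
# 0.ddd x 10^e with -99 <= e <= 99.
frac_digits = 3
exp_digits = 2

# "Range" would come from the exponent digits: the largest and
# smallest magnitudes the format can reach at all.
max_exp = 10**exp_digits - 1               # e can go up to 99
largest = 0.999 * 10**max_exp              # roughly 1e99
smallest_positive = 0.100 * 10**(-max_exp)

# "Precision" would come from the fraction digits: how many
# significant digits survive. pi = 3.14159... is forced down to
# 0.314 x 10^1 = 3.14, keeping only 3 significant digits.
pi_stored = 0.314 * 10**1

print(largest, smallest_positive, pi_stored)
```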
I know how this notation works, but I am asking about the meaning of the two words. Why does the book call them *range* and *precision*? What exactly do they mean?