
If I create an array of 10 random numbers in the range $[0, 2^{30}]$, how can I calculate the number of bits of memory it will consume?

Let's assume that each of the numbers has 10 digits. That totals 100 digits. Would it be 800 bits (8 bits per character)?

  • This question would probably be better suited for stackoverflow.com, but usually you declare an array of a certain datatype, so here you would probably have an array of 32-bit integers, in which case it would occupy 320 bits, unless your language dynamically allocates memory based on the items in your array, in which case this turns into an expected-value problem. 2012-10-26

1 Answer


If they are uniformly distributed over $[0, 2^{30})$, each one needs 30 bits, so you need 300 bits if you store them in binary. If you convert them to decimal, each needs at most 10 digits, since $2^{30} = 1{,}073{,}741{,}824$. Storing those digits in 8-bit ASCII then takes 800 bits. The big inefficiency is taking a decimal digit (of which there are only 10, so it carries only $\log_2 10 \approx 3.32$ bits of information) and using 8 bits (which can distinguish 256 values) to store it.
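
As a minimal sketch of the comparison above (not from the original post, and assuming the values are drawn uniformly from $[0, 2^{30})$), you can count both storage costs directly in Python:

```python
import random

# Draw 10 random numbers in [0, 2**30), as in the question.
nums = [random.randrange(2**30) for _ in range(10)]

# Fixed-width binary: 30 bits per value.
binary_bits = 10 * 30

# Decimal digits stored as 8-bit ASCII: one byte per digit.
ascii_bits = sum(len(str(n)) * 8 for n in nums)

print(binary_bits)  # 300
print(ascii_bits)   # at most 10 digits * 8 bits * 10 numbers = 800
```

The ASCII count can come out slightly below 800 when a drawn value happens to have fewer than 10 digits, which is why the answer's figures are upper bounds.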