If I create an array of 10 random numbers in the range [0, 2^30], how can I calculate the number of bits it will consume in memory?
Let's assume each number has at most 10 digits (2^30 = 1,073,741,824, which is 10 digits long), for a total of 100 digits. Would that be 800 bits (8 bits per character)?
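For context, here is a minimal Python sketch of the estimate I have in mind (this assumes each number is stored as a string of decimal characters, at 8 bits per character, which may not be how an array actually stores them):

```python
import random

# 10 random numbers in the range [0, 2^30]
nums = [random.randint(0, 2**30) for _ in range(10)]

# Count the decimal digits across all numbers
# (at most 10 digits each, so at most 100 total)
total_digits = sum(len(str(n)) for n in nums)

# My estimate: 8 bits per decimal character
estimate_bits = total_digits * 8
print(estimate_bits)  # at most 800 under the 10-digits-each assumption
```

Is counting characters like this the right way to reason about memory use, or should I be thinking in terms of how the integers are actually represented?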