Suppose you wanted to write the number 100000. If you type it in ASCII, this would take 6 characters (which is 6 bytes). However, if you represent it as unsigned binary, you can write it out using 4 bytes.
(from http://www.cs.umd.edu/class/sum2003/cmsc311/Notes/BitOp/asciiBin.html)
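For reference, here is how I checked the quoted byte counts in Python. The `struct` format `>I` (a fixed-width 32-bit unsigned integer) is my own guess at what the page means by "unsigned binary":

```python
import struct

n = 100_000

# ASCII/decimal text: one byte per character.
ascii_bytes = len(str(n))       # "100000" is 6 characters -> 6 bytes
print(ascii_bytes)              # 6

# Fixed-width unsigned binary, assuming a 32-bit unsigned int.
packed = struct.pack(">I", n)   # big-endian unsigned 32-bit
print(len(packed))              # 4
```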
My question: $\log_2 100{,}000 \approx 16.6$, so I need $\lceil \log_2 100{,}000 \rceil = 17$ bits to represent 100,000 in binary, which fits in 3 bytes. So why does the page say 4 bytes?
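Here is my reasoning as a quick sketch (the 3 bytes is just $\lceil 17/8 \rceil$; none of this comes from the linked page):

```python
import math

n = 100_000

print(math.log2(n))                   # ~16.61
print(n.bit_length())                 # 17 bits needed
print(math.ceil(n.bit_length() / 8))  # 3 bytes, by my count
```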