From what I understand of Shannon entropy, when dealing with a random variable we want to quantify, essentially, how close to uniformly distributed that random variable is.
In the case of an RV with two possible outcomes (a coin flip, for example), entropy has the nice property that it ranges over the interval $[0, 1]$.
However, if we deal with more than two outcomes (for example, a die-roll RV with outcomes from $1$ through $6$), then the range of our entropy function becomes $[0, \log_2 6]$, since $-\log_2\big(\frac{1}{6}\big) = \log_2 6$.
Would it make sense (and still be "correct") to adjust the base of the $\log$ to match the number of outcomes of the RV, in this case using $\log_6$, so as to keep the range of our entropy function as the interval $[0, 1]$?
Is this something that's commonly done, or are there reasons not to do it?
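To make the question concrete, here is a small sketch (the `entropy` helper is my own, not from any library) that computes entropy of a fair die in base 2 versus base 6, showing that the base-6 version lands exactly at the top of the $[0, 1]$ interval:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy of a discrete distribution, in the given log base."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

fair_die = [1 / 6] * 6

# In bits (base 2), the maximum for 6 outcomes is log2(6) ~ 2.585...
print(entropy(fair_die, base=2))

# Using base 6 rescales that same maximum to 1, since log_6(6) = 1.
print(entropy(fair_die, base=6))
```

Changing the base only multiplies the entropy by a constant ($\log_b x = \log_2 x / \log_2 b$), so the rescaled quantity preserves all the same orderings and properties.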