
A novice question. Say I have a string of length N and I want to store some information in it (e.g. my dog's name, my birthday); it doesn't matter what. What is the maximum amount of information I can encode in that string (codes are allowed to overlap, if that is possible)? Say the alphabet I am using has size 2 (0, 1). How can I calculate this? What does the calculation depend on? And what questions should I ask someone posting the same question, assuming I already know the answer?

Can anyone give any advice?

  • Hint: look at entropy, Shannon, and information theory. (2017-02-22)
  • Apart from unseen_rider's pointers, this may be helpful: https://en.wikipedia.org/wiki/Shannon_coding or https://en.wikipedia.org/wiki/Huffman_coding (2017-02-22)
  • Be aware that "information" is used to mean two different things. When one is interested in simply encoding distinguishable things (which seems to be your intent), one is essentially talking about the number of distinct encodings available, and "information capacity" might be a good term. In the context where the probability of events is significant, the [Shannon entropy](https://en.wikipedia.org/wiki/Entropy_(information_theory)) is what you'd consider. (2017-02-22)
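Both quantities mentioned in the comments can be computed directly. A minimal Python sketch (function names are my own, not from any of the linked references): the capacity of a length-N string over an alphabet of size k is N·log₂(k) bits, since there are kᴺ distinguishable strings, while Shannon entropy measures the average information per symbol for a given probability distribution.

```python
import math

def capacity_bits(n, alphabet_size=2):
    # There are alphabet_size ** n distinguishable strings of length n,
    # so the capacity in bits is n * log2(alphabet_size).
    return n * math.log2(alphabet_size)

def shannon_entropy(probs):
    # Average information per symbol, in bits, for a symbol
    # distribution given as a list of probabilities.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(capacity_bits(8))             # an 8-character binary string: 8.0 bits
print(shannon_entropy([0.5, 0.5]))  # a fair binary symbol: 1.0 bit
```

For a binary alphabet the two coincide only when both symbols are equally likely; a biased source (say probabilities 0.9 and 0.1) has entropy below 1 bit per symbol, which is what Shannon/Huffman coding exploits.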

0 Answers