The most intuitive method of representing an integer is in unary. For example, 10 can be represented as 0000000000, ----------, etc. This requires O(n) space, where n is the value of the integer.
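A minimal sketch of this (the function name to_unary is just my own illustration): the representation is simply n copies of one symbol, so its length grows linearly with n.

```python
def to_unary(n: int, symbol: str = "0") -> str:
    """Represent a non-negative integer as n repetitions of a single symbol."""
    return symbol * n

print(to_unary(10))       # 0000000000
print(len(to_unary(10)))  # 10 symbols -> O(n) space
```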
The most common method, positional notation, is slightly more complicated, and several different bases can be used. For example, 100 can be represented as 1100100 (binary), 144 (octal), 64 (hexadecimal), 100 (decimal), etc. This requires O(log n) space.
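To make the O(log n) claim concrete, here is a small sketch (to_base is again my own illustrative helper, assuming base between 2 and 16): repeated division by the base yields floor(log_b(n)) + 1 digits.

```python
import math

def to_base(n: int, base: int = 2) -> str:
    """Represent a non-negative integer in positional notation, 2 <= base <= 16."""
    if n == 0:
        return "0"
    digits = "0123456789abcdef"
    out = []
    while n:
        n, r = divmod(n, base)
        out.append(digits[r])
    return "".join(reversed(out))

for b in (2, 8, 16, 10):
    print(b, to_base(100, b))  # 1100100, 144, 64, 100

# Length is floor(log_b(n)) + 1 digits, i.e. O(log n):
print(len(to_base(100, 2)), math.floor(math.log(100, 2)) + 1)  # 7 7
```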
My question is: is there a next step in this 'sequence' that would allow us to store an integer in sub-logarithmic space? Any physically possible method counts, even if it isn't feasible in practice.
Any proof or disproof would be appreciated, as long as it isn't too complicated.