
Say we have an arithmetic progression in $\mathbb{Z}_n$ like $3, 6, 9, 12, \dots$ If you move a sliding window of at least 3 values over the progression, the 'density' of the progression's values within that window, compared to a window 'with no holes', is always the same: $\frac{1}{3}$. (I think the term I'm looking for is natural density, but this is a new concept for me, so I'm not sure I'm using it 100% correctly.)

Arithmetic progressions also have the nice property that they can be described in a fixed amount of space: you only need to know the starting number and the common difference to derive the rest of the numbers in the progression.

What types of monotonically increasing progressions are known that can be described in fixed space yet have "varying" density? By "fixed space" I mean to rule out a hard-coded list, focusing instead on sequences that can be generated from some finite set of starting numbers. By "varying" I mean:

  1. A sliding window over the values should not see the same density every time; the density seen should both increase and decrease, so the density should not be monotonically increasing or decreasing even though the progression is. For example, the triangular numbers would not work, because they only ever become further apart.
  2. I should be able to choose a window size vastly greater than the amount of space used to describe the progression and still see variation (so no cycling through a fixed list of differences).

Apologies if I have not formalized this enough :) I think I'm stumbling on something that must be studied already but I don't know the terminology.

Edit: I was trying to keep things succinct, but more constraints are needed. My ultimate goal is to use these sequences on a computer, so sequences that are easier to compute are preferred. I'd also like to be able to compute the $N$th element directly, without calculating every prior element.

  • 0
    Let $A\subset\mathbb N$ be the set of all numbers with a leading 8 or 9.2012-12-16
  • 0
    Note that the only way for an oscillating pattern to be discernible and not be damped out regardless of the window size is if the pattern gradually slows down (picture $\cos(\log x)$), as in my example.2012-12-16
  • 0
    @MarioCarneiro: Couldn't you get away with just saying all numbers with a leading 8?2012-12-16
  • 0
    Can you clarify what you mean by a "fixed space"? Are you thinking in terms of the space a Turing machine would need to compute the number, or the number of symbols it takes me to write down the function, or the number of operations necessary to calculate the number? (Give an example of something which does *not* fall under your criterion.)2012-12-16
  • 0
    Honestly, any set of leading digits would do. Maybe "any even leading digit" would look better.2012-12-16
  • 0
    @MarioCarneiro: Basically I'm trying to stick to sequences that can be generated from a bounded number of starting bits. Put another way, I'm trying to find sequences that have something like a closed form representation I can work with algebraically and reason about, rather than somebody just dropping a manually figured out list of numbers lacking any generality.2012-12-16
  • 0
    @MarioCarneiro: My ultimate goal does involve using this in a computer program, so leading digits are a bit problematic to extract, unless it's the binary leading digit. But that's always 1 if you consider the 'leading digit' to be the most significant set bit, and if you instead take the leftmost bit in the word, it just splits the space evenly in half: the numbers below $2^{n-1}$ and the numbers at or above it. Sorry, I realize I'm springing more details than are in my question; I was trying to capture the essence and keep it succinct, and obviously didn't succeed :P2012-12-16
  • 1
    That is fine, the process works in any base. How about "the second-leading digit is 1" for numbers greater than 2? (By leading digit in this context I do mean the one which is always 1 in base 2.) Also, you should edit the OP to reflect the constraints, so others don't have to read the comments.2012-12-16
  • 0
    @MarioCarneiro: That may work. Makes me wonder if you can just say: all the numbers where some subset of the bits is fixed, e.g. all numbers where the 3rd bit is 1, the 5th bit is 1, and the 7th bit is 1.2012-12-16
  • 1
    Sorry, but no. Any constraint on the *last few digits* is equivalent to $n\bmod 2^k\in A$ for some $k$ and some $A\subseteq\{0,1,2,\dots,2^k-1\}$, and so it will always have the same "density" with window size $m\cdot2^k$ for $m\geq 1$. The thing which makes my trick work is that as the number gets bigger, the "period of oscillation" grows along with it, so it will eventually be longer than any fixed window.2012-12-16
  • 0
    @MarioCarneiro: I've added a constraint that I think gets at what I'm going for: getting the $N$th element easily. I don't totally follow your explanation of why a set of fixed bits doesn't work. Are you saying that my specific example of constraining the 3rd/5th/7th bits wouldn't work, or just that it's not always true that any subset of fixed bits will work, because if you only fix a chunk of adjacent digits at the beginning/end, you only get multiples or all numbers below a threshold?2012-12-16
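The "leading 8 or 9" set from the comments can be sanity-checked numerically. A minimal Python sketch (the window size of 100 and the sample starting points are arbitrary choices, not from the thread):

```python
def leading_digit(n: int) -> int:
    """Most significant decimal digit of n."""
    while n >= 10:
        n //= 10
    return n

def window_density(members, start: int, size: int) -> float:
    """Fraction of integers in [start, start + size) that are in the set."""
    return sum(1 for n in range(start, start + size) if members(n)) / size

in_set = lambda n: leading_digit(n) in (8, 9)

# The density over a sliding window of 100 oscillates: near 800..999 the
# window is full, near 100..199 it is empty, and the "period" of this
# oscillation grows with the magnitude of the numbers, matching the
# cos(log x) intuition from the comments.
for start in (100, 500, 800, 1000, 5000, 8000):
    print(start, window_density(in_set, start, 100))
```

Because the oscillation slows down as the numbers grow, no fixed window size ever averages it away, which is exactly the property the comment thread is pointing at.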

1 Answer

1

If you put any constraint on any set of fixed bits (as in fixed with respect to the radix point), the constraint can be converted to the form "the least-significant $m$ digits form a number in this list...". Let me back up this claim with a proof. Say the constraint is $P(n):=[n_{x_1}=b_1\wedge n_{x_2}=b_2\wedge\dots\wedge n_{x_k}=b_k]$, where $n_i$ is the $i$-th least-significant bit, the $\{x_i\}_1^k$ form an increasing sequence, and $b_i\in\{0,1\}$ are fixed constants for each $i$.

Then the most-significant constraint is on the $x_k$-th bit, so let $m=x_k+1$ and $A=\{n\in\{0,1,\dots,2^m-1\}:P(n)\}$. If $P(n)$ holds, adding $2^m$ to $n$ does not change any bits below the $m$-th position, so $P(n+2^m)$ holds as well, and indeed so does $P(n+2^ma)$ for all $a\in\mathbb Z$. Moreover, the same is true for numbers that do not satisfy the constraint, so that $P(n+2^ma)\iff P(n)$ for all $a\in\mathbb Z$. Therefore, $$P(n)\iff n\bmod2^m\in A.$$

Thus if the window has size $2^m$, the mapping $n\mapsto n\bmod2^m$ takes any set $\{c,\dots,c+2^m-1\}$ to $\{0,\dots,2^m-1\}$, so the "window density" is $|A|/2^m$ regardless of $c$. The same holds whenever the window size is a multiple of $2^m$, so this fails your "cycling through a fixed set of differences" test.
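The period-$2^m$ argument can be checked directly. A small Python sketch, using the 3rd/5th/7th-bit example from the comments (bit indices counted from the least significant bit, as in the proof):

```python
def P(n: int) -> bool:
    """Constraint: bits 3, 5 and 7 (from the least significant bit) are all 1."""
    return all((n >> i) & 1 for i in (3, 5, 7))

m = 8              # x_k = 7, so m = x_k + 1
period = 1 << m    # 2^m = 256
A = [n for n in range(period) if P(n)]  # residues mod 2^m satisfying P

def window_density(start: int, size: int) -> float:
    """Fraction of n in [start, start + size) with P(n)."""
    return sum(1 for n in range(start, start + size) if P(n)) / size

# A window of size 2^m covers every residue mod 2^m exactly once, so its
# density is always |A| / 2^m = 32/256 = 1/8, no matter where it starts.
for start in (0, 17, 1000, 123456):
    assert window_density(start, period) == len(A) / period
```

Three of the eight low bits are pinned, so $|A|=2^5=32$ and every window of size 256 (or any multiple of it) sees density exactly $1/8$, which is the "long but repeating pattern" the answer describes.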

  • 0
    I'm probably being dense, but I don't see how your first sentence could be true. If the number is 32 bits, for example, fixing bits 3, 5, and 7 to particular values doesn't put any constraint on the "leading digits", that is, the leading bits 8-31. Unless by first digits you mean the least significant bits, but it doesn't impose any constraint on bits 0-2 either, so I'm still not sure what you mean. Is $k$ indexing the total number of bits, or only the bits that we've locked?2012-12-16
  • 0
    Sorry, by "first" I meant it in little-endian sense, i.e. "least significant". I've edited it to reflect that.2012-12-16
  • 0
    Your edit helps a bit, thanks. So it sounds like you're saying that if you take the most significant locked bit, $x_k$, you can keep adding $x_k + 1$ and get numbers satisfying the constraint that are evenly spaced. But isn't that neglecting the unfixed bits below $x_k$? Why don't we get variable density from those? I'm actually unsure whether your conclusion answers the question or says it can't be answered. My test is meant to eliminate possibilities, so failing the test would mean a proof that fixing bits works, but in the comments on the question you said fixing them doesn't?2012-12-16
  • 0
    To take a more manageable example, if I say the 2nd bit is 1 and leave the other two free, this is equivalent to $n\bmod8\in\{4,5,6,7\}$. The way the "unconstrained" numbers are accounted for is that I am listing *all* solutions, ranging over all unspecified bits. You do get variable density, but my argument says that for large enough windows, you lose all of that variation, since it is a long but repeating pattern, and you are averaging over the whole thing. (And my proof is answering the question, in the negative.)2012-12-16
  • 0
    Actually my confusion might stem from what you mean by $x_k$ and $x_k + 1$. I interpret $x_k$ to be the 0 or 1 in the $k$th bit, where $k$ is the offset from the radix point. Did you mean $x_{k+1}$ instead of $x_k + 1$? Because $x_k$ would just be 1 or 0, which when 1 is added would be two. Or maybe by addition you meant string concatenation? I suck at notation :(2012-12-16
  • 0
    Oooooh, that makes much more sense.2012-12-16
  • 1
    If I say "the 2nd bit is 1", then $P(n):=[n_2=1]$, so $k=1$, $x_1=2$, and $b_1=1$. If I said "3rd is 0 and 5th is 1", then $k=2$, $x_1=3$, $x_2=5$, $b_1=0$, and $b_2=1$. Does this help?2012-12-16
  • 0
    Yes, much. Thanks for your patience explaining :)2012-12-16
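The equivalence used in the example above ("the 2nd bit is 1" versus $n\bmod 8\in\{4,5,6,7\}$) can be verified exhaustively over many periods. A quick Python check (the range of 1024 is an arbitrary choice, not from the thread):

```python
# "The 2nd bit is 1" (bit index 2, place value 4) depends only on n mod 8,
# and the residues mod 8 with that bit set are exactly 4, 5, 6 and 7.
for n in range(1024):
    assert ((n >> 2) & 1 == 1) == (n % 8 in {4, 5, 6, 7})
print("equivalence holds for all n < 1024")
```

Since both sides depend only on $n\bmod 8$, checking one full period would already suffice; the longer range just makes the periodicity visible.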