If we suppose that we have a polynomial $q(x)$ of the following form:

$q(x) = \sum_{i=0}^{N} c_i x^i \text{ where } c_i \in \{0, 1\}$

In other words, given a polynomial with binary coefficients (each $c_i$ is 0 or 1), what is the probability that $q(x) \equiv 0 \pmod p$, where $p$ is prime? There is a twist: we set $x$ equal to $1$. That's the probability I'm after. It should be pretty simple.

Now, what I'm really after is a proof that as $N$ tends to infinity, this probability tends to $1/p$ when we set $x=1$. I'd be delighted if someone could prove this. However, I'd still like to know the probability for any $N$. Additionally, I'm also interested in the cases where we set $x$ equal to a natural number less than $p$.
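For the $x=1$ case, $q(1)=\sum_{i=0}^{N} c_i$, so the question reduces to the distribution of a binomial sum mod $p$. A minimal sketch of the exact computation (the function name `prob_zero_mod_p` is mine), tracking the residue distribution of the running sum with a dynamic program:

```python
from fractions import Fraction

def prob_zero_mod_p(N, p):
    """Exact probability that sum_{i=0}^{N} c_i ≡ 0 (mod p),
    where each c_i is independently 0 or 1 with probability 1/2."""
    # dist[r] = probability that the running sum is ≡ r (mod p)
    dist = [Fraction(0)] * p
    dist[0] = Fraction(1)
    for _ in range(N + 1):  # N+1 coefficients: c_0, ..., c_N
        # With probability 1/2 the next coefficient is 0 (residue unchanged),
        # with probability 1/2 it is 1 (residue shifts by one).
        dist = [(dist[r] + dist[(r - 1) % p]) / 2 for r in range(p)]
    return dist[0]

print(prob_zero_mod_p(5, 3))           # exact value for small N
print(float(prob_zero_mod_p(200, 3)))  # approaches 1/3 for large N
```

Numerically the value settles toward $1/p$ very quickly as $N$ grows, consistent with the limit conjectured above.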

  • Do you intend $x$ to be a uniformly distributed integer over $[0,p-1]$? (2011-10-05)
  • @Sasha: Sorry, I meant a simpler problem. I let $x=1$. Please see the new question. Sorry again for the mistakes. (2011-10-05)
  • If you let $x=1$, then you're just looking at $\sum_{i=0}^N c_i$... are you supposing the coefficients $c_i$ to be random? It's not clear in your post where the randomness comes in. (2011-10-05)
  • @Bruno Joyal: Essentially, yes. You can assume the $c_i$'s are drawn at random from the set $\{0, 1\}$. Then I'm looking into the limit as $N$ approaches infinity. However, I realized that I also need to know about the cases where $x$ is set equal to a natural number less than $p$. (2011-10-05)
  • Intuition suggests that as $N$ tends to infinity, the probability that the polynomial is congruent to $0$ is just $1/p$. I just can't seem to come up with a good way to prove it. (2011-10-05)
  • You need some assumptions on the distributions of the $c_i$. A sufficient but not necessary condition is that the $c_i$ are i.i.d. with $0 < P(c_i = 1) < 1$. (2011-10-05)
  • That's fine. Further, you can assume $P(c_i=1)=1/2$; they should all be uniformly distributed. (2011-10-05)

1 Answer


This is a Markov chain problem. The residue mod $p$ can be any one of $p$ values; each time we increment $N$, we multiply the vector of residue probabilities by a doubly stochastic matrix with entries $0$ and $1/2$, which depends on the values of $N$ and $x$. This set of matrices repeats with period equal to the multiplicative order of $x$ in $(\mathbb{Z}/p\mathbb{Z})^*$. The product of the matrices over a single period is also doubly stochastic, and therefore has maximal eigenvalue $1$ with eigenvector $(1,1,\ldots,1)$. Since each factor has positive diagonal entries and the shifts reach every residue, this product is primitive, so all other eigenvalues are strictly smaller in modulus. In the limit as $N$ gets large, only the maximal eigenvalue matters, and the probability vector of residues converges to a constant multiple of that eigenvector. Normalizing, we get that $q(x)$ takes each value in $\{0,\ldots,p-1\}$ with probability $1/p$ in the limit.
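The convergence described above can be checked numerically by iterating the chain directly: at step $i$, with probability $1/2$ we add $x^i \bmod p$ to the running residue. A sketch (the function name `residue_distribution` and the choice $p=7$, $x=3$ are mine for illustration):

```python
import numpy as np

def residue_distribution(N, p, x):
    """Distribution of q(x) mod p over random 0/1 coefficients c_0..c_N:
    at step i, with probability 1/2 we add x^i mod p to the residue."""
    dist = np.zeros(p)
    dist[0] = 1.0
    power = 1  # x^i mod p
    for i in range(N + 1):
        new = np.zeros(p)
        for r in range(p):
            new[r] += dist[r] / 2                # c_i = 0: residue unchanged
            new[(r + power) % p] += dist[r] / 2  # c_i = 1: shift by x^i
        dist = new
        power = (power * x) % p
    return dist

d = residue_distribution(300, 7, 3)
print(d)  # every entry is close to 1/7
```

Each step applies exactly the doubly stochastic matrix from the argument above, and the output distribution is numerically indistinguishable from uniform well before $N = 300$.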

  • N.B. This matrix is just $\frac12 I + \frac12 S^{x^N}$, with $S$ being the cyclic-shift matrix. (2011-10-05)