
Let $p$ be a prime. If $1_A(x)$ denotes the indicator function of the set $A\subset\mathbb{Z}/p\mathbb{Z}$ and $$\hat{1}_A(t):=\frac{1}{p}\sum_{n=1}^p 1_A(n)e^{2\pi i \frac{nt}{p}}$$ denotes the Fourier transform of $1_A$, then what can be said about $$\mathbb{E}\left( \sup_{t\neq 0} |\hat{1}_A(t)|\right),$$ where $A$ is a uniformly random subset of $\mathbb{Z}/p\mathbb{Z}$? Do we have an upper bound of the form $O\left(\frac{\log p}{\sqrt{p}}\right)$ as $p$ goes to infinity?

Alternate wording: For each subset $A$ of the $p$-th roots of unity, let $Sum(A)$ denote the sum of the elements of $A$, and let $A^t:=\{a^t:\ a\in A\}$ for integers $0 < t < p$. What can be said about $\mathbb{E}\left(\sup_{0<t<p} |Sum(A^t)|\right)$ when $A$ is chosen uniformly at random?

Remarks: This question originated from an optional homework problem in my Arithmetic Combinatorics class. I tried some fairly long and drawn-out things that did not work. Also, note that since $p$ is prime, taking the $t$-th power for $0 < t < p$ simply permutes the $p$-th roots of unity.
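For a quick numerical sanity check (just a sketch, not part of the argument; the prime $p$, the seed, and the number of samples below are arbitrary choices), one can estimate the expectation by sampling random subsets and computing all Fourier coefficients at once with an FFT:

```
import numpy as np

def sup_fourier_coeff(p, rng):
    """Sample a uniformly random subset A of Z/pZ and return sup_{t != 0} |hat{1}_A(t)|."""
    indicator = rng.integers(0, 2, size=p)   # each residue included independently with probability 1/2
    # np.fft.fft uses the kernel e^{-2 pi i n t / p}, while the question uses e^{+2 pi i n t / p};
    # since the indicator is real-valued, the two conventions give the same absolute values.
    coeffs = np.fft.fft(indicator) / p
    return np.abs(coeffs[1:]).max()          # exclude t = 0

p = 10007                                    # an arbitrary prime
rng = np.random.default_rng(0)
samples = [sup_fourier_coeff(p, rng) for _ in range(200)]
print("empirical mean of sup_{t != 0} |hat{1}_A(t)| :", np.mean(samples))
print("log(p) / sqrt(p)                             :", np.log(p) / np.sqrt(p))
```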

  • This might be a silly question, but how do you take the sup of a set of complex numbers? (2012-01-19)
  • Just an aside regarding terminology: it might be good to be cautious with the terms "characteristic function" and "Fourier transform" here. In probability, the term "[characteristic function](http://en.wikipedia.org/wiki/Characteristic_function_%28probability_theory%29)" is, interestingly enough, a Fourier transform, while the term "[indicator function](http://en.wikipedia.org/wiki/Indicator_function)" is used for functions like $1_A$ that are usually referred to as "characteristic functions" in other branches. (2012-01-20)
  • @JoelCohen: Sorry! My mistake, I forgot some critical absolute value bars. (2012-01-20)
  • Disregard my comment last night, I was thinking of a similar but different problem. (2012-01-20)

1 Answer


A simple probabilistic approach works. In fact, I get a bound of $O(\sqrt{p \log p})$ on the expectation of $\sup_{t\neq 0} |Sum(A^t)|$. Since $\hat{1}_A(t) = \frac{1}{p}Sum(A^t)$ (identifying $A$ with its set of exponents), this gives $\mathbb{E}\left(\sup_{t\neq 0}|\hat{1}_A(t)|\right) = O\left(\sqrt{\frac{\log p}{p}}\right)$, which is even a little better than the $O\left(\frac{\log p}{\sqrt{p}}\right)$ asked for.

Step 1: In this step, take $t=1$ and bound $|Sum(A)|$, where $A$ is a uniformly random subset of the roots of unity $\{ 1, \zeta, \ldots, \zeta^{p-1} \}$ with $\zeta = e^{2\pi i/p}$. Let $X_i$ be the complex random variable recording the contribution of $\zeta^i$ to the sum; i.e., $X_i = \zeta^{i}$ if $\zeta^i \in A$ and $X_i = 0$ otherwise, so that $Sum(A) = \sum_{i=0}^{p-1} X_i$.
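One small observation, used implicitly below: each root is included independently with probability $1/2$, and the $p$-th roots of unity sum to zero, so $$\mathbb{E}[X_i]=\tfrac{1}{2}\zeta^{i} \quad\text{and}\quad \mathbb{E}[Sum(A)]=\tfrac{1}{2}\sum_{i=0}^{p-1}\zeta^{i}=0;$$ in other words, $Sum(A)$ is already centered.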

Since the $X_i$'s are independent, we may bound their sum using the Chernoff–Hoeffding bound. There is a slight annoyance that our random variables are complex-valued; one simple fix is to bound the real and imaginary parts separately and then combine them using the triangle inequality. Doing this calculation, we find that the probability that $|Sum(A)| = |Sum(A) - \mathbb{E}[Sum(A)]|$ exceeds $2c \sqrt{p \log p}$ is at most (up to an absolute constant factor, which does not matter here) $\exp \left(- \frac{2 c^2 p \log p}{4p} \right) = p^{-c^2/2}$.
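To spell out the Hoeffding step with the same (generous) constants: each $\mathrm{Re}\, X_i$ is an independent random variable taking values in an interval of length at most $2$, and $\mathbb{E}[\mathrm{Re}\,Sum(A)] = 0$, so Hoeffding's inequality gives $$\Pr\left[\left|\mathrm{Re}\,Sum(A)\right| \geq c\sqrt{p\log p}\right] \leq 2\exp\left(-\frac{2c^2 p\log p}{4p}\right) = 2p^{-c^2/2},$$ and likewise for the imaginary part. Outside these two events, the triangle inequality gives $|Sum(A)| \leq |\mathrm{Re}\,Sum(A)| + |\mathrm{Im}\,Sum(A)| \leq 2c\sqrt{p\log p}$.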

Step 2: For any fixed $t \neq 0$, the set $A^t$ is again a uniformly random subset of the roots of unity, since the $t$-power map simply permutes the roots. So the above analysis holds with $A$ replaced by $A^t$. Then, by the union bound, the probability that $\sup_t |Sum(A^t)| \geq 2c \sqrt{p \log p}$ is at most $(p-1) \cdot p^{-c^2/2} = o(1/p)$, choosing $c$ large enough (any fixed $c > 2$ will do).

This tail inequality can easily be converted to a bound on the expectation, as follows.
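Concretely (one standard way to do the conversion, using the trivial bound $\sup_t |Sum(A^t)| \leq |A| \leq p$): $$\mathbb{E}\left[\sup_{t} |Sum(A^t)|\right] \leq 2c\sqrt{p \log p} + p \cdot \Pr\left[\sup_{t} |Sum(A^t)| \geq 2c\sqrt{p \log p}\right] \leq 2c\sqrt{p \log p} + p \cdot o(1/p) = O(\sqrt{p \log p}).$$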