In the paper
Bach, F. R., & Jordan, M. I. (2002). Kernel Independent Component Analysis. Journal of Machine Learning Research, 3(1), 1–48. doi:10.1162/153244303768966085
I stumbled upon the following claim involving a correlation measure the authors define, the $\mathcal F$-correlation of two univariate random variables $x_1,x_2$ relative to a vector space $\mathcal F$ of functions from $\mathbb R$ to $\mathbb R$,
$$ \rho_{\mathcal F} = \sup_{f_1,f_2\in\mathcal F} \text{corr}\left(f_1(x_1),f_2(x_2)\right) = \sup_{f_1,f_2\in\mathcal F} \frac{\text{cov}\left(f_1(x_1),f_2(x_2)\right)}{\text{var}\left(f_1(x_1)\right)^{1/2}\,\text{var}\left(f_2(x_2)\right)^{1/2}}. $$
The authors state that if $x_1,x_2$ are independent, then $\rho_{\mathcal F}(x_1,x_2)=0$; they also claim that the converse ($\rho_{\mathcal F}=0~\implies$ $x_1,x_2$ are independent) holds when $\mathcal F$ is large enough.
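For context, the forward direction seems clear to me (I sketch it here mainly to show where I stand; please correct me if I am oversimplifying): if $x_1,x_2$ are independent, then $f_1(x_1)$ and $f_2(x_2)$ are independent for any $f_1,f_2\in\mathcal F$, so
$$ \text{cov}\left(f_1(x_1),f_2(x_2)\right) = \mathbb E\left[f_1(x_1)f_2(x_2)\right] - \mathbb E\left[f_1(x_1)\right]\mathbb E\left[f_2(x_2)\right] = 0, $$
hence every correlation in the supremum vanishes and $\rho_{\mathcal F}=0$. It is the converse that I am stuck on.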
My question: As an example, they say it is well known that if $\mathcal F$ contains the Fourier basis (i.e. the functions $f_\omega(x) = \exp(i\omega x)$ with $\omega \in \mathbb R$), then $\rho_{\mathcal F}=0~\implies$ $x_1\bot\!\!\!\bot x_2$. My problem is that I do not see why this is obviously true, and I have also failed to prove it. Unfortunately, the paper gives no reference or proof for this claim. When I tried to prove it myself, I could not find a good starting point: my first thought was that the proof could go via properties of the characteristic function, but I did not get far with that.
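To make my attempt concrete (this is only a sketch, and the notation $\varphi$ for characteristic functions is mine, not the paper's): taking $f_\omega(x) = \exp(i\omega x)$ and $f_\nu(x) = \exp(i\nu x)$, and ignoring for the moment that these are complex-valued while $\mathcal F$ consists of real-valued functions, the numerator of the correlation becomes
$$ \text{cov}\left(f_\omega(x_1),f_\nu(x_2)\right) = \mathbb E\left[e^{i(\omega x_1+\nu x_2)}\right] - \mathbb E\left[e^{i\omega x_1}\right]\mathbb E\left[e^{i\nu x_2}\right] = \varphi_{(x_1,x_2)}(\omega,\nu) - \varphi_{x_1}(\omega)\,\varphi_{x_2}(\nu). $$
If these covariances vanished for all $\omega,\nu\in\mathbb R$, the joint characteristic function would factor into the product of the marginals, and the uniqueness theorem for characteristic functions would then give independence. What I fail to see is how to get from $\rho_{\mathcal F}=0$ to exactly this statement.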
I am explicitly interested in the claim for the Fourier basis, not so much in the more general claim of Bach and Jordan. If anyone could show me how to prove it (or point me to a reference), I would be grateful.