
Here is a problem I am working on; I'm hoping to get some guidance from the experts here:

Let $C=[C_1,C_2,\ldots,C_N]$ and $S=[S_1,S_2,\ldots,S_N]$ be arrays of length $N$ whose elements are discrete i.i.d. random variables, uniformly distributed on $\{-1,+1\}$ (i.e., each element is $\pm 1$ with equal probability $p=1/2$).

Consider the sum
\begin{equation*}
A=\sum_{l=1}^N \sum_{m=1}^N \sum_{n=1}^N \left( C_lC_mC_n+S_lS_mC_n-C_lS_mS_n+S_lC_mS_n \right).
\end{equation*}
Let's assume $N=16$ (or higher). I'm trying to find the probability distribution/density of $A$, which is essentially a sum of $4N^3$ terms, each term being a triple product of discrete i.i.d. uniform rvs. It can be shown that these product terms ($C_lC_mC_n$, $S_lS_mC_n$, $C_lS_mS_n$, $S_lC_mS_n$) are also uniformly distributed on $\{-1,+1\}$ with $p=1/2$. Furthermore, it can be shown that they are uncorrelated (e.g. $E[(C_lC_mC_n)(S_lS_mC_n)]=0$, since they are zero mean ($E[C_i]=E[S_i]=0$) and $C_n^2=1$).

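To get a feel for the distribution, here is a minimal Monte Carlo sketch (Python/NumPy; the seed and the trial count are arbitrary choices). It uses the fact that each triple sum of a product factorizes into a product of single sums, e.g. $\sum_{l,m,n} C_lC_mC_n = \left(\sum_l C_l\right)^3$, and compares the empirical variance of $A$ with the $4N^3$ conjectured in question 3 below.

    import numpy as np

    rng = np.random.default_rng(0)   # arbitrary seed
    N, trials = 16, 200_000          # trial count is an arbitrary choice

    # i.i.d. +/-1 entries; one row per Monte Carlo trial
    C = rng.choice([-1, 1], size=(trials, N))
    S = rng.choice([-1, 1], size=(trials, N))

    c, s = C.sum(axis=1), S.sum(axis=1)
    # Each triple sum factorizes, e.g. sum_{l,m,n} C_l C_m C_n = (sum_l C_l)^3,
    # so term by term the definition of A becomes:
    A = c**3 + s**2 * c - c * s**2 + s * c * s

    print("empirical mean of A :", A.mean())
    print("empirical var of A  :", A.var())
    print("conjectured 4*N^3   :", 4 * N**3)
    # np.histogram(A) gives a rough empirical picture of the distribution.
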
Edit: As pointed out in the comments below, the product terms are pairwise independent; however, they are not all jointly (mutually) independent. Therefore the question remains unsolved for now.

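The pairwise-versus-joint independence claim can be checked by brute force: for three distinct indices $l,m,n$ there are only $2^6=64$ equally likely sign patterns of $(C_l,C_m,C_n,S_l,S_m,S_n)$. A minimal enumeration sketch (Python/NumPy):

    import itertools
    import numpy as np

    # All 2^6 equally likely sign patterns of (C_l, C_m, C_n, S_l, S_m, S_n)
    # for three distinct indices l, m, n.
    patterns = np.array(list(itertools.product([-1, 1], repeat=6)))
    Cl, Cm, Cn, Sl, Sm, Sn = patterns.T
    T = np.stack([Cl*Cm*Cn, Sl*Sm*Cn, Cl*Sm*Sn, Sl*Cm*Sn])  # the four product terms

    # Pairwise independence: each pair (T_i, T_j) hits every one of the four
    # sign combinations in exactly 16 of the 64 patterns.
    for i in range(4):
        for j in range(i + 1, 4):
            counts = [int(np.sum((T[i] == a) & (T[j] == b)))
                      for a in (-1, 1) for b in (-1, 1)]
            print(i, j, counts)

    # Not jointly independent: the product of all four terms is identically +1
    # (every factor appears squared), so e.g. T_1 = T_2 = T_3 = -1, T_4 = +1 is
    # impossible, while joint independence would give it probability 1/16.
    print("values of T_1*T_2*T_3*T_4:", sorted(set(T.prod(axis=0))))
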
So here are the questions:

  1. What is the probability density/distribution of $A$? How can the pdf of $A$ be determined knowing that the product terms are identically distributed, zero mean, uncorrelated and pairwise independent but not jointly (mutually) independent?

  2. If an exact solution is not tractable/feasible, are there approximations that can be used?

  3. What is the variance of $A$? Since the product terms are uncorrelated and each has variance 1, the variance of $A$ seems simply to be the sum of the variances of all the terms (i.e. the number of terms, $4N^3$).

  4. While the central limit theorem (CLT) applies to i.i.d. rvs, in this case the terms are uncorrelated, identically distributed, and pairwise independent but not jointly (mutually) independent. Is there a special case of the CLT that can be applied here?

  • It seems $c_lc_mc_n$ and $s_ls_ms_n$ are also independent because $E[c_lc_mc_n\,s_ls_ms_n]=E[c_lc_mc_n]E[s_ls_ms_n]$, since $E[c_lc_mc_n]=E[s_ls_ms_n]=0$? (2012-09-13)
  • That's true, but I'm not sure if that would help with the solution, since the terms in the sum ($c_lc_mc_n$, $s_ls_mc_n$, $c_ls_ms_n$, $s_lc_ms_n$) haven't been shown to be independent. (2012-09-13)
  • They all have zero mean and correlation zero, right? (2012-09-13)
  • OK, I see what you're saying: the terms being independent really simplifies it, and the CLT can be applied without any issues. Thanks! (2012-09-13)
  • Isn't that first summand $(\sum C_i)^3$, etc.? If so, divide by $n^{3/2}$ and the limiting distribution is that of $C^3 + S^2C$, with $S$, $C$ independent normal (see the simulation sketch after these comments). (2012-09-14)
  • This question still remains open: even though, as Seyhmus has pointed out, the product terms ($C_lC_mC_n$, $S_lS_mC_n$, $C_lS_mS_n$, $S_lC_mS_n$) from the sum above are pairwise independent, they are not jointly (mutually) independent. If rvs $X_1,\dots,X_n$ are jointly (mutually) independent, then $E[\prod_{i=1}^nX_i]=\prod_{i=1}^nE[X_i]$. In the case above, $E[C_lC_mC_n \cdot S_lS_mC_n \cdot C_lS_mS_n \cdot S_lC_mS_n]=1$ since $C_i^2=S_i^2=1$, while the product of the expectations is 0. (2012-09-14)
  • So, based on the above, how can the pdf of $A$ be determined knowing that the product terms are identically distributed, zero mean, uncorrelated and pairwise independent but not jointly (mutually) independent? (2012-09-14)
  • Mike, thanks for the comment. I wasn't quite able to follow how it may help simplify the problem or lead to a solution, but please feel free to elaborate further or provide a reference. (2012-09-14)
  • Upon further study, it turns out that $E(C_lC_mC_n\,S_lS_mC_n)=E(C_lC_mC_n)E(S_lS_mC_n)=0$ is not sufficient to show independence. See the "Note on Converse to Corollary" here: http://www.proofwiki.org/wiki/Condition_for_Independence_from_Product_of_Expectations (2012-10-01)
  • Well, it turns out that two uncorrelated Bernoulli random variables are independent. The proof is here: http://math.stackexchange.com/questions/205597/how-to-show-that-these-random-variables-are-pairwise-independent (2012-10-02)
  • Also, it has been proven by Hong (1995) and others that if $X_1, X_2,\dots$ are jointly symmetric, pairwise independent and identically distributed with a finite second moment, then the central limit theorem holds for $X_1, X_2,\dots$. Reference: Hong, D.H., "A remark on the C.L.T. for sums of pairwise i.i.d. random variables," Math. Japon. 42, 87-89, 1995. (2012-10-02)
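
As a sanity check on Mike's comment above about the limiting distribution (and relevant to questions 1, 2 and 4), here is a minimal simulation sketch (Python/NumPy; the trial count and the values of $N$ are arbitrary choices) comparing a few quantiles of $A/N^{3/2}$ with those of $C^3+S^2C$ for independent standard normal $C$, $S$:

    import numpy as np

    rng = np.random.default_rng(1)   # arbitrary seed
    trials = 50_000                  # arbitrary Monte Carlo sample size

    def sample_scaled_A(N):
        """Draw A / N^(3/2) by Monte Carlo, using the factorized triple sums."""
        C = rng.choice([-1, 1], size=(trials, N))
        S = rng.choice([-1, 1], size=(trials, N))
        c, s = C.sum(axis=1), S.sum(axis=1)
        A = c**3 + s**2 * c - c * s**2 + s * c * s
        return A / N**1.5

    # The limit suggested in the comment: C^3 + S^2*C, with C, S independent standard normals
    Cg, Sg = rng.standard_normal(trials), rng.standard_normal(trials)
    limit = Cg**3 + Sg**2 * Cg

    qs = [0.05, 0.25, 0.5, 0.75, 0.95]
    print("limit   :", np.round(np.quantile(limit, qs), 2))
    for N in (16, 64, 256):
        print(f"N = {N:3d} :", np.round(np.quantile(sample_scaled_A(N), qs), 2))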

0 Answers