
I am familiar with the usual CLTs, but I was wondering whether there is a corresponding CLT for the following sort of sum of random variables:

Let $\{u_t\}_{t\in\mathbb{N}}\sim$ iid $(0,\sigma^2)$. Observe that for all $j=1,\dots,n-1$ the random variable $u_n$ is independent of $u_j$, and thus: \begin{equation*} \mathbb{E}\left[ u_nu_j\right] =\mathbb{E}\left[ u_n\right] \mathbb{E}\left[ u_j\right] =0 \qquad \text{ and } \qquad \text{Var}(u_nu_j) =\mathbb{E}\left[ u_n^2u_j^2\right] = \mathbb{E}\left[ u_n^2 \right]\mathbb{E}\left[u_j^2\right] = \sigma^4. \end{equation*} So, for each $n\in\mathbb{N}$, the random variables $\{u_{n+1}u_j\}_{j=1}^n$ are identically distributed with mean $0$ and variance $\sigma^4$, but are dependent across $j$. Does the sum \begin{equation*} \sum_{j=1}^{n} u_{n+1}u_j \quad \overset{\mathcal{D}}\longrightarrow \quad ??? \end{equation*}

i.e., after some appropriate normalisation/rescaling, does it converge to something meaningful (if it converges at all) as $n\rightarrow\infty$?

Thanks in advance for any help!

2 Answers


A normalization is required. Let $X$ be a random variable independent of the sequence $\left(u_i\right)_{i\in\mathbb N}$ and having the same distribution as $u_0$ (enlarging the probability space if needed). Then for any positive integer $n$, the random variables $Y_n$ and $Z_n$ defined by $$Y_n=u_{n+1}\frac{1}{\sqrt{n}}\sum_{i=1}^n u_i \mbox{ and } Z_n=X\frac{1}{\sqrt{n}}\sum_{i=1}^n u_i$$ have the same distribution. The sequence $\left(Z_n\right)_{n\geqslant 1}$ converges in distribution to $X\cdot N$, where $N$ is a zero-mean Gaussian random variable with variance $\sigma^2$, independent of $X$.
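A quick Monte Carlo sketch of this answer's claim, under the illustrative assumption that the $u_i$ are Rademacher ($\pm 1$ with equal probability, so $\sigma^2=1$). In that special case $X\cdot N$ is simply $N(0,1)$, since $|X|=1$ and the product is symmetric, so the empirical mean and variance of $Y_n$ should be close to $0$ and $1$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: u_i iid Rademacher (+1/-1), so sigma^2 = 1.
# Then Y_n = u_{n+1} * (1/sqrt(n)) * sum_{i=1}^n u_i should be close in
# distribution to X * N with X Rademacher independent of N ~ N(0, 1);
# since |X| = 1 and the product is symmetric, that limit is just N(0, 1).
n, reps = 500, 20000
u = rng.choice([-1.0, 1.0], size=(reps, n + 1))
Y = u[:, n] * u[:, :n].sum(axis=1) / np.sqrt(n)

# Empirical mean and variance of Y_n, to compare with the N(0, 1) limit
print(Y.mean(), Y.var())
```

This only illustrates the distributional claim for one convenient choice of the $u_i$; for a general parent distribution the limit $X\cdot N$ is a genuine (non-Gaussian) scale mixture.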

  • Hi Davide, thank you for this reply. I definitely agree with it, and I gather that your answer implies \begin{equation*} Y_n \overset{\mathcal{D}}\longrightarrow X\cdot N. \end{equation*} Do you think, though, there is any way of getting a more "exact/known" distribution for this limit? Or do you think that, with the given information, this is the best asymptotic distribution one may obtain? (2017-01-10)
  • The reason I ask is that this "problem" is but a small part of a larger problem I am trying to solve (and I eventually need to obtain an (ideally) pivotal asymptotic distribution of an estimator, which would rely on not having any unknown parameters in it)! (2017-01-10)
  • You can get the characteristic function of the limiting distribution, given by $t\mapsto \mathbb E\left[\exp\left(-t^2\sigma^2u_0^2/2\right)\right]$. (2017-01-11)
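A numerical sanity check of this characteristic-function formula is possible, under the illustrative assumption that the $u_i$ are iid uniform on $[-\sqrt{3},\sqrt{3}]$ (so $\sigma^2=1$): the empirical characteristic function of $Y_n$ at a fixed $t$ should approach $\mathbb E\left[\exp(-t^2 u_0^2/2)\right]$, here estimated by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative assumption: u_i iid uniform on [-sqrt(3), sqrt(3)], so sigma^2 = 1.
a = np.sqrt(3.0)
n, reps = 500, 20000
u = rng.uniform(-a, a, size=(reps, n + 1))
Y = u[:, n] * u[:, :n].sum(axis=1) / np.sqrt(n)

t = 1.0
# Empirical characteristic function of Y_n at t (imaginary part vanishes by symmetry)
emp_cf = np.cos(t * Y).mean()
# Monte Carlo estimate of the claimed limit E[exp(-t^2 * sigma^2 * u_0^2 / 2)]
u0 = rng.uniform(-a, a, size=reps)
lim_cf = np.exp(-t**2 * u0**2 / 2.0).mean()
print(emp_cf, lim_cf)
```

The two printed values should agree to within Monte Carlo error, consistent with the limit being the scale mixture $u_0\cdot N$.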

If you rescale with $\sqrt{n},$ you get $u_{n+1}\frac{1}{\sqrt{n}}\sum_{j=1}^nu_j.$

$u_{n+1}$ is drawn from its own distribution, and $\frac{1}{\sqrt{n}}\sum u_j$ converges in distribution to an independent normal, so the limit is whatever the product of an independent normal and the parent distribution turns out to be.

  • Unfortunately it's not quite that simple... $u_{t+1}$ is only iid $(0,\sigma^2)$; we have no normality assumption on it! (2017-01-10)
  • @user405613 Oh OK, misread. Then it's whatever the distribution is times a normal. (2017-01-10)
  • Thanks for this reply. Yes, I agree. (2017-01-10)