
The classical CLT states that \begin{equation} \sqrt{n} (\bar{X} - \mathbb{E}(X)) \stackrel{d}{\rightarrow}\mathcal{N}(0,\textrm{Var}(X)) \end{equation} for an i.i.d. sequence $X_1, \dots, X_n$ with a finite second moment.
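As a quick sanity check of this statement, here is a minimal Monte Carlo sketch (my own illustrative setup, not part of the question): for i.i.d. Exponential(1) draws, $\mathbb{E}(X) = 1$ and $\textrm{Var}(X) = 1$, so $\sqrt{n}(\bar{X} - 1)$ should be approximately $\mathcal{N}(0, 1)$ for large $n$.

```python
import numpy as np

# Monte Carlo check of the classical CLT (illustrative sketch):
# X_i ~ Exponential(1), so E(X) = 1 and Var(X) = 1, and
# sqrt(n) * (X_bar - 1) should be approximately N(0, 1).
rng = np.random.default_rng(0)
n, reps = 5_000, 10_000
samples = rng.exponential(scale=1.0, size=(reps, n))
z = np.sqrt(n) * (samples.mean(axis=1) - 1.0)

print(z.mean())  # close to 0
print(z.var())   # close to Var(X) = 1
```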

But what about functions of $X$ estimated with $n$ i.i.d. samples, denoted by $f_n(X)$? That is, suppose we can estimate a function $f$ in the sense that $f_n(X) \stackrel{p}{\rightarrow} f(X)$, where $f$ is the true underlying function. For example, in non-linear regression, where $\widehat{E}(Y|X)=f_n(X)$.

Is there a CLT of the following form? \begin{equation} \sqrt{n} (\bar{f}_n(X) - \mathbb{E}(f(X))) \stackrel{d}{\rightarrow}\mathcal{N}(0,\textrm{Var}(f(X))) \end{equation}

How would we go about proving it?

1 Answer


Assume that $X_1,\dots, X_n$ satisfy the requirements for the CLT, i.e., $$ n^{1/2}(\bar{X}_n - \mu)\xrightarrow{D}N(0, \sigma^2). $$ Now consider a function $g$ that is differentiable in a neighborhood of $\mu$ with $g'$ continuous at $\mu$. Taking a first-order Taylor expansion of $g$ at $\mu$ we get $$ g(\bar{X}_n) = g(\mu)+g'(\mu)(\bar{X}_n-\mu)+R_n, $$ where the remainder satisfies $R_n = o_p(|\bar{X}_n-\mu|) = o_p(n^{-1/2})$, since $\bar{X}_n \xrightarrow{p} \mu$. Rearranging the equation, $$ g(\bar{X}_n) - g(\mu) = g'(\mu)(\bar{X}_n-\mu)+o_p(n^{-1/2}), $$ and multiplying by $n^{1/2}$ we have $$ n^{1/2}(g(\bar{X}_n) - g(\mu)) = g'(\mu)\,n^{1/2}(\bar{X}_n-\mu)+o_p(1). $$ Using Slutsky's theorem the RHS converges in distribution to $$ g'(\mu)\,N(0,\sigma^2) + 0, $$ thus $$ n^{1/2}(g(\bar{X}_n) - g(\mu)) \xrightarrow{D}N\left(0, \sigma^2 [g'(\mu)]^2 \right). $$ This result is called the "Delta Method".
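The delta method is easy to verify numerically. A minimal sketch, with a setup I chose for illustration: $X_i \sim$ Exponential(1), so $\mu = 1$ and $\sigma^2 = 1$, and $g(x) = x^2$, so $g'(\mu) = 2$ and the asymptotic variance is $\sigma^2 [g'(\mu)]^2 = 4$.

```python
import numpy as np

# Monte Carlo sketch of the delta method (assumed setup, not from the answer):
# X_i ~ Exponential(1), so mu = 1, sigma^2 = 1.  With g(x) = x^2 we have
# g'(mu) = 2, so sqrt(n) * (g(X_bar) - g(mu)) should be approximately
# N(0, sigma^2 * g'(mu)^2) = N(0, 4).
rng = np.random.default_rng(1)
n, reps = 5_000, 10_000
x_bar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (x_bar**2 - 1.0**2)

print(z.var())  # close to sigma^2 * g'(mu)^2 = 4
```

Note that without the square on $g'(\mu)$ the simulated variance would be compared against 2 rather than 4, which the simulation clearly rules out.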

Additionally, by the continuous mapping theorem, if $X_n \to X$ then $g(X_n) \to g(X)$ for any $g$ that is continuous a.e. (with respect to the law of $X$). This holds for convergence in probability, in distribution, and almost surely.
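The continuous mapping theorem can also be illustrated by simulation. A small sketch under an assumed setup of my own: $X_n = X + Z/n$ with standard normal noise $Z$, so $X_n \to X$ in probability, and for the continuous map $g(x) = e^x$ the probability $P(|g(X_n) - g(X)| > \varepsilon)$ should shrink as $n$ grows.

```python
import numpy as np

# Sketch of the continuous mapping theorem for convergence in probability
# (illustrative setup): X_n = X + Z/n -> X in probability, and with the
# continuous map g(x) = exp(x) the exceedance probability
# P(|g(X_n) - g(X)| > eps) should decrease toward 0 as n grows.
rng = np.random.default_rng(2)
reps, eps = 100_000, 0.01
x = rng.normal(size=reps)

fracs = []
for n in (10, 100, 1000):
    x_n = x + rng.normal(size=reps) / n
    fracs.append(np.mean(np.abs(np.exp(x_n) - np.exp(x)) > eps))

print(fracs)  # decreasing toward 0
```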