
Here is the Delta Method as I understand it: suppose $\sqrt{n}(X_n - \mu) \overset{d}{\to}\mathcal{N}(0, \sigma^2)$. Let $g$ be differentiable and $g^{\prime}(\mu) \neq 0$. Then $$\sqrt{n}[g(X_n)-g(\mu)]\overset{d}{\to}\mathcal{N}(0, \sigma^2[g^{\prime}(\mu)]^2)\text{.}$$

This is a follow-up to "Uniform distribution order statistic converging to negative exponential distribution". I have already proven the following:

Suppose $Y$ is an exponential random variable with mean $\theta$, and assume $X_1, \dots, X_n$ are independent Uniform$(0, \theta)$ random variables, where $\theta > 0$ is a constant. Let $X_{(n)}$ be the largest order statistic.

Show that $$n[X_{(n)}-\theta]\overset{d}{\to}(-Y)$$ as $n \to \infty$.

The next problem states

Let $Y_0$ be an Exponential$(1)$ random variable. Then, show $$n[\ln(X_{(n)})-\ln(\theta)]\overset{d}{\to}-Y_0$$

I can guess how one would do this: set $g(x) = \ln(x)$, so $g^{\prime}(\theta) = \dfrac{1}{\theta}$, and then we would get

$$n[\ln(X_{(n)})-\ln(\theta)]\overset{d}{\to}\left(-\dfrac{1}{\theta}Y\right) \overset{d}{=}-Y_0\text{.} $$ But as you can see above, the Delta Method as stated applies only to normal limits. Is there a "generalized" Delta Method that I'm not aware of? I.e., suppose $\sqrt{n}(X_n - \mu) \overset{d}{\to}Y$. Let $g$ be differentiable with $g^{\prime}(\mu) \neq 0$. Do we then have $$\sqrt{n}[g(X_n)-g(\mu)]\overset{d}{\to}g^{\prime}(\mu)\cdot Y\text{?}$$ It also bothers me that the problem uses the rate $n$ rather than $\sqrt{n}$. What am I missing?

The solution says to use the Delta Method (without explaining why) and does not elaborate beyond stating that the quantity converges in distribution to $(-1/\theta)Y$.
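Before the answers: both limit statements can be sanity-checked with a quick Monte Carlo simulation. This is only an illustrative sketch, not part of the problem; the values of `theta`, `n`, and `reps` are arbitrary choices of mine.

```python
import math
import random

random.seed(0)

theta = 2.0   # assumed value for the simulation
n = 1000      # uniforms per replication
reps = 3000   # Monte Carlo replications

raw, logged = [], []
for _ in range(reps):
    # largest order statistic of n iid Uniform(0, theta) draws
    x_max = max(random.uniform(0, theta) for _ in range(n))
    raw.append(n * (x_max - theta))                          # should resemble -Y,  Y ~ Exp(mean theta)
    logged.append(n * (math.log(x_max) - math.log(theta)))   # should resemble -Y0, Y0 ~ Exp(1)

mean_raw = sum(raw) / reps      # -Y has mean -theta
mean_log = sum(logged) / reps   # -Y0 has mean -1
print(round(mean_raw, 2), round(mean_log, 2))
```

With these settings the two sample means land close to $-\theta = -2$ and $-1$ respectively, consistent with the claimed limits.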

2 Answers


Recall the derivation of the Delta Method, i.e., expand $g(X_n)$ around $X_n=\theta$: $$ g(X_n) = g(\theta) + g'(\theta)(X_n - \theta)+o_p(|X_n - \theta|). $$ Rearranging and multiplying by $n$, you get $$ n(g(X_n)-g(\theta)) = g'(\theta)\,n(X_n - \theta)+n\,o_p(|X_n - \theta|), $$ where the remainder $n\,o_p(|X_n - \theta|)$ is $o_p(1)$ because $n(X_n - \theta)=O_p(1)$ by assumption. Thus, assuming that $n(X_n - \theta)\xrightarrow{d}Y$ and applying Slutsky's theorem to the right-hand side, you get that $$ n(g(X_n)-g(\theta)) \xrightarrow{d} g'(\theta)Y. $$ Note that nothing in this argument uses normality of the limit, and the rate $n$ plays exactly the same role as $\sqrt{n}$.

  • So really, the general result is this: suppose $h(n)$ is some function of $n$ alone and assume $g$ is differentiable at $\theta$. Then if $h(n) \cdot (X_n - \theta)\overset{d}{\to} Y$, we have $h(n) \cdot [g(X_n) - g(\theta)] \overset{d}{\to} g^{\prime}(\theta) \cdot Y$. Is this correct? And why do we require $g^{\prime}(\theta) \neq 0$? (2017-01-07)
  • 1. IMHO, yes. 2. If $g'(\theta)=0$, then the first-order Taylor approximation of $g(X_n)$ is just the constant $g(\theta)$, so you would need higher-order terms of the expansion in order to derive the limiting distribution of $h(n)\,(g(X_n) - g(\theta))$. (2017-01-07)

Claim:

Given:

$$n[X_{(n)}-\theta]\overset{d}{\to}(-Y)$$

show that:

$$n[\ln(X_{(n)})-\ln(\theta)]\overset{d}{\to}-Y_0$$

Solution:

$$n[\ln(X_{(n)})-\ln(\theta)]=n\log\left(1+\frac{n[X_{(n)}-\theta]}{n\theta}\right)$$

Now, $n[X_{(n)}-\theta]\overset{d}{=}(-Y)+o_p(1)$, and $\log(1+x)=x+o(x)$, so:

$$n[\ln(X_{(n)})-\ln(\theta)]=n\log\left(1+\frac{(-Y)+o_p(1)}{n\theta}\right)=\frac{(-Y)}{\theta}+o_p(1)\overset{d}{=}(-Y_0)+o_p(1)$$

Remarks:

  1. Some of the details are obscured a bit here, but this gives a decent heuristic picture of how asymptotic distributions change under transformations. Note that most derivations of the Delta Method also involve a Taylor expansion at some stage.
  2. Actually, in this case, finding the exact distribution of $n[\ln(X_{(n)})-\ln(\theta)]$ wouldn't be too awful, though it might lead to a slightly more lengthy limit calculation.
  3. An even less rigorous, but perhaps more intuitive approach:

$$X_{(n)}\approx \theta - \frac{1}{n}Y\overset{d}{=}\theta - \frac{\theta}{n}Y_0$$

$$\implies \log X_{(n)} \overset{d}{\approx} \log\left(\theta - \frac{\theta}{n}Y_0\right)=\log\theta - \log\left(\frac{1}{1-\frac{1}{n}Y_0}\right)\approx \log\theta - \frac{1}{n}Y_0$$

using $\log\frac{1}{1-x}=x+o(x)$. The result comes from rearranging the above.
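Following up on remark 2: since $X_{(n)}$ has CDF $(x/\theta)^n$ on $(0,\theta)$, a quick computation gives the exact distribution in closed form. For $t \le 0$,

$$P\big(n[\ln(X_{(n)})-\ln(\theta)] \le t\big) = P\big(X_{(n)} \le \theta e^{t/n}\big) = \big(e^{t/n}\big)^n = e^{t},$$

which is exactly $P(-Y_0 \le t) = P(Y_0 \ge -t) = e^{t}$. So for the log transformation no limit calculation is even needed: the distribution of $n[\ln(X_{(n)})-\ln(\theta)]$ coincides with that of $-Y_0$ for every $n$.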