Let $X_1, \dots, X_n$ be i.i.d. copies of $X$, where $X$ is binary with $p = P\{X=1\}$, $p \in (0,1)$. Let $\alpha = \frac{p}{1-p}$. Define $\bar{\alpha} = \frac{\bar{X}}{1-\bar{X}}$; show that $\bar{\alpha}$ is unbiased and consistent for $\alpha$.

To prove unbiasedness I show $E[\bar{X}] = E[X] = p$, since $X$ is binary. Then, since the numerator converges to $p$ in probability and the denominator to $1-p$, I conclude by the continuous mapping theorem that $\bar{\alpha}$ is unbiased.

To prove consistency, note that $\bar{X} \rightarrow p = E[X]$ in probability by the weak law of large numbers, and likewise $1-\bar{X} \rightarrow 1-p$. By the continuous mapping theorem, $\bar{\alpha}$ is consistent for $\alpha$.
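The consistency claim can be checked numerically. Below is a minimal Monte Carlo sketch; the value of $p$, the sample sizes, and the number of replications are hypothetical choices, not part of the problem statement.

```python
import random

# Monte Carlo sketch of consistency: the mean absolute error of
# alpha_bar = X_bar / (1 - X_bar) around alpha = p/(1-p) should
# shrink as n grows. (p, sample sizes, and replication count are
# hypothetical choices for illustration.)
random.seed(0)
p = 0.3
alpha = p / (1 - p)  # true odds

def alpha_bar(n):
    """One draw of the estimator X_bar / (1 - X_bar)."""
    x_bar = sum(random.random() < p for _ in range(n)) / n
    return x_bar / (1 - x_bar)  # blows up on the event [X_bar = 1]

errs = {}
for n in (20, 200, 2000):
    draws = [alpha_bar(n) for _ in range(1000)]
    errs[n] = sum(abs(a - alpha) for a in draws) / len(draws)
    print(n, round(errs[n], 4))
```

Note that this sketch only illustrates convergence in probability; as the comments below the question point out, it says nothing about unbiasedness.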

I feel like there are flaws in my arguments somewhere.

  • 4
    The problem is that $\bar\alpha$ is not even defined on the whole probability space (as soon as $p\ne0$), since $[\bar X=1]$ has positive probability $p^n$. In particular $E(\bar\alpha)=+\infty$. (And you might want to review the statement of the continuous mapping theorem.) (2012-11-05)
  • 0
    So if, for random variables $X$ and $Y$, we have $X\rightarrow a$, $Y\rightarrow b$, and $g$ continuous at $(a,b)$, then $g(X, Y) \rightarrow g(a,b)$. That means I should try to show individually that $1-\bar{X} \rightarrow 1-p$? (2012-11-05)
  • 0
    I've tried to work through the problem again and can't see why $E[\bar{\alpha}] = \infty$. (2012-11-05)
  • 0
    Indeed $\bar\alpha_n\to\alpha$ almost surely when $n\to\infty$, hence the sequence of estimators $(\bar\alpha_n)$ is consistent. But, for every $n$, as I said, $E(\bar\alpha_n)\ne\alpha$, hence none of the random variables $\bar\alpha_n$ is an *unbiased* estimator. (2012-11-05)
  • 1
    Because $\bar\alpha=+\infty$ on $A=[\bar X=1]$ and $P(A)\ne0$. (2012-11-05)
  • 0
    But $P\{X=1\} = p < 1$ strictly. (2012-11-05)
  • 1
    And? Note that $[\bar X=1]=[X_1=X_2=\cdots=X_n=1]$. (2012-11-05)
  • 1
    $\Pr(\bar{X}=1)=p^n \gt 0$, so $\Pr(\bar{\alpha}=\infty)\gt 0$, so $E[\bar{\alpha}]=\infty$. (2012-11-05)
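The point made in the comments, that $P(\bar X = 1) = p^n > 0$ and hence $E[\bar\alpha] = +\infty$, can be verified by brute force over all $2^n$ outcomes. A small sketch, with hypothetical values of $n$ and $p$:

```python
from itertools import product

# Exhaustive check (n and p are hypothetical illustration values):
# sum the probability of every {0,1}^n outcome. On the all-ones
# outcome, X_bar = 1 and alpha_bar = X_bar/(1 - X_bar) = +inf, so
# the expectation of alpha_bar is infinite.
n, p = 3, 0.5
prob_all_ones = 0.0
expectation = 0.0
for xs in product([0, 1], repeat=n):
    prob = 1.0
    for x in xs:
        prob *= p if x == 1 else (1 - p)
    x_bar = sum(xs) / n
    if x_bar == 1:
        prob_all_ones += prob          # this event has probability p**n
        expectation = float("inf")     # alpha_bar = +inf here
    else:
        expectation += prob * x_bar / (1 - x_bar)

print(prob_all_ones)  # 0.125, i.e. p**n
print(expectation)    # inf
```

The exhaustive sum makes the comment concrete: a single outcome of positive probability where the estimator is infinite already forces $E[\bar\alpha] = +\infty$.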

0 Answers