
I need to calculate the Cramér-Rao lower bound on the variance of unbiased estimators of the parameter $\theta$ of the distribution (a Cauchy distribution with location $\theta$) $$f(x)=\frac{1}{\pi(1+(x-\theta)^2)}$$

So far I have computed $$4\,\text{E}\left[\frac{X-\theta}{1+(X-\theta)^2}\right]^2.$$ How do I proceed from here? Can somebody help?

2 Answers


It is easier to work with a single observation $X_1$ to find the information function $I(\theta)$.

This is because, for an i.i.d. sample of size $n$, the information is additive, giving the alternative expression $\displaystyle I(\theta)=n\,\text{E}_{\theta}\left[\frac{\partial}{\partial\theta}\ln f_{\theta}(X_1)\right]^2$.

For $x_1\in\mathbb R$ and $\theta\in\mathbb R$, one has the density of $X_1$

\begin{align}&f_{\theta}(x_1)=\frac{1}{\pi((x_1-\theta)^2+1)}\\&\implies \ln f_{\theta}(x_1)=-\ln\pi-\ln\left(1+(x_1-\theta)^2\right)\\&\implies \frac{\partial}{\partial\theta}\ln f_{\theta}(x_1)=\frac{2(x_1-\theta)}{1+(x_1-\theta)^2}\\&\implies \left[\frac{\partial}{\partial\theta}\ln f_{\theta}(x_1)\right]^2=4\left[\frac{x_1-\theta}{1+(x_1-\theta)^2}\right]^2\\&\implies \text{E}_{\theta}\left[\frac{\partial}{\partial\theta}\ln f_{\theta}(X_1)\right]^2=4\,\text{E}_{\theta}\left[\frac{X_1-\theta}{1+(X_1-\theta)^2}\right]^2\tag{1}\end{align}

Now, substituting $t=x-\theta$ (the integrand is even in $t$), then $u=t^2$, and recognizing a Beta integral: \begin{align}\text{E}\left[\frac{X_1-\theta}{1+(X_1-\theta)^2}\right]^2&=\frac{1}{\pi}\int_{\mathbb R}\left[\frac{x-\theta}{1+(x-\theta)^2}\right]^2\frac{1}{1+(x-\theta)^2}\,\mathrm{d}x\\&=\frac{1}{\pi}\int_{\mathbb R}\frac{(x-\theta)^2}{(1+(x-\theta)^2)^3}\,\mathrm{d}x\\&=\frac{2}{\pi}\int_0^\infty\frac{t^2}{(1+t^2)^3}\,\mathrm{d}t\\&=\frac{1}{\pi}\int_0^\infty\frac{\sqrt u}{(1+u)^3}\,\mathrm{d}u\\&=\frac{1}{\pi}B\left(\frac{3}{2},\frac{3}{2}\right)=\frac{1}{8}\end{align}
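As a sanity check on the Beta-function step, the value $1/8$ can be reproduced by numerical integration. A minimal sketch using only the Python standard library (the truncation interval $[-200,200]$ and the number of subintervals are arbitrary choices, justified by the $t^{-4}$ decay of the integrand):

```python
import math

def integrand(t: float) -> float:
    # t^2 / (1 + t^2)^3 -- the centered integrand with t = x - theta
    return t * t / (1.0 + t * t) ** 3

def simpson(f, a, b, n=200_000):
    # Composite Simpson's rule on [a, b] with n (even) subintervals.
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3.0

# The integrand decays like t^-4, so truncating to [-200, 200] is harmless.
value = simpson(integrand, -200.0, 200.0) / math.pi
print(value)  # close to 1/8 = 0.125
```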

That is, from $(1)$, we have $\displaystyle\text{E}_{\theta}\left[\frac{\partial}{\partial\theta}\ln f_{\theta}(X_1)\right]^2=4\times\frac{1}{8}=\frac{1}{2}\quad\forall\,\theta$.

So finally we get $\displaystyle I(\theta)=\frac{n}{2}\quad\forall\,\theta$.

The Cramér-Rao lower bound for unbiased estimators of $\theta$ is then $1/I(\theta)=2/n\quad\forall\,\theta$.
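To see the bound in action, one can compare it against a concrete estimator. The sketch below (hypothetical setup: $\theta=0$, $n=51$, $4000$ Monte Carlo trials, sampling by inverting the Cauchy CDF) estimates the variance of the sample median; its asymptotic variance is $\pi^2/(4n)\approx 2.47/n$, which sits above the bound $2/n$:

```python
import math
import random
import statistics

random.seed(0)
theta, n, trials = 0.0, 51, 4000  # hypothetical choices for illustration

def cauchy(theta: float) -> float:
    # Inverse-CDF sampling: F^{-1}(u) = theta + tan(pi * (u - 1/2))
    return theta + math.tan(math.pi * (random.random() - 0.5))

medians = []
for _ in range(trials):
    sample = sorted(cauchy(theta) for _ in range(n))
    medians.append(sample[n // 2])  # odd n, so this is the exact sample median

var_median = statistics.pvariance(medians)
crlb = 2.0 / n
print(var_median, crlb)  # the median's variance exceeds the bound 2/n
```

The gap is expected: the sample median is not efficient for the Cauchy location; the MLE attains the bound only asymptotically.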


First of all, you should notice that there is no sufficient estimator for the location parameter $\theta$ (the center of the bell). Let's see this. The likelihood for the Cauchy distribution is $$L(x;\theta) = \prod _i^n \frac{1}{\pi\left [ 1+(x_i-\theta)^2 \right ]}, $$ and its logarithm is $$\ln L(x;\theta) = -n \ln \pi -\sum_i^n\ln\left [ 1+(x_i-\theta)^2 \right ].$$ The estimator maximizes the likelihood, and if a sufficient estimator exists, the derivative of the log-likelihood can be factorized, i.e.: $$\frac{\partial \ln L(x;\theta)}{\partial \theta} = A(\theta)\left[t(x)-h(\theta)-b(\theta)\right], $$ where $A(\theta)$ is a function of the parameter alone, $t(x)$ is a function of the data alone, $h(\theta)$ is what you want to estimate, and $b(\theta)$ is a possible bias.

If you carry out the differentiation, you will see that the score of the Cauchy distribution cannot be factorized in this way. Nevertheless, Cramér-Rao still gives you a bound for the variance, namely $$ \operatorname{Var}(t) \geq \frac{\left(\frac{\partial}{\partial \theta}(h+b)\right)^2}{\operatorname E\left[\left(\frac{\partial}{\partial \theta}\ln L\right)^2\right]},$$ where equality holds only if $\frac{\partial \ln L}{\partial \theta}$ can be factorized as above.

The most you can do is compute the bound itself; the likelihood equation for $\theta$ has no closed-form solution, so the maximum-likelihood estimator must be found numerically.
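To illustrate why there is no closed-form solution, the sketch below (with a small hypothetical sample chosen for effect) counts the stationary points of the Cauchy log-likelihood by scanning the score for sign changes. With widely spread data the likelihood equation has several roots, including multiple local maxima, so the MLE has to be located numerically:

```python
data = [-5.0, 0.0, 5.0]  # hypothetical, widely spread sample

def score(theta: float) -> float:
    # d/dtheta of the Cauchy log-likelihood: sum of 2(x - theta)/(1 + (x - theta)^2)
    return sum(2.0 * (x - theta) / (1.0 + (x - theta) ** 2) for x in data)

# Count zeros of the score on a fine grid over [-10, 10]:
# each sign change (or exact zero) marks a stationary point of ln L.
grid = [-10.0 + 20.0 * i / 100_000 for i in range(100_001)]
roots = sum(
    1 for a, b in zip(grid, grid[1:]) if score(a) == 0 or score(a) * score(b) < 0
)
print(roots)  # multiple stationary points: several local maxima and minima
```

For this sample the score vanishes five times (three local maxima, two local minima), so even deciding which root is the global maximizer requires comparing likelihood values.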