In general, can one say that for a random variable $X$:
$E[\frac{1}{X}] = \frac{1}{E[X]}$ ?
I've worked out a few examples where this holds, but I'm not sure how generally it is true...
It is very rarely true. Let's do a random example. Let $X$ be uniform on $[1,3]$. Then $E(X)=2$. But $E\left(\frac{1}{X}\right)=\int_1^3 \frac{1}{x}\cdot\frac{1}{2}\,dx=\frac{\log 3}{2}\ne \frac{1}{2}.$
For a simpler example, let $X=1$ with probability $1/2$, and let $X=3$ with probability $1/2$. Then $E(X)=2$.
But $E(1/X)=(1/2)(1)+(1/2)(1/3)=2/3$.
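Both examples can be checked with a quick computation; here is a sketch in Python (using `Fraction` to keep the discrete example exact):

```python
import math
from fractions import Fraction

# Continuous example: X uniform on [1, 3].
# E(X) = 2, while E(1/X) = (1/2) * ln(3) ~ 0.549, not 1/2.
e_x = 2.0
e_inv_x = math.log(3) / 2
print(e_inv_x, 1 / e_x)  # ~0.549 vs 0.5

# Discrete example: X = 1 or X = 3, each with probability 1/2.
values = [Fraction(1), Fraction(3)]
probs = [Fraction(1, 2), Fraction(1, 2)]
e_x_d = sum(p * v for p, v in zip(probs, values))    # E(X)   = 2
e_inv_d = sum(p / v for p, v in zip(probs, values))  # E(1/X) = 2/3
print(e_x_d, e_inv_d, 1 / e_x_d)                     # 2, 2/3, 1/2
```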
Jensen's inequality for functions of random variables states that $\mathbf{E} \varphi(X) \geq \varphi(\mathbf{E}X)$ for convex $\varphi$ and $\mathbf{E} \varphi(X) \leq \varphi(\mathbf{E}X)$ for concave $\varphi$. Here $\varphi(x) = \frac{1}{x}$ is convex for $x > 0$, so if $X > 0$ the first inequality holds.
For such a case, it is a good idea to study Jensen's inequality.
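As a quick sanity check of Jensen's direction, this sketch compares $E[\varphi(X)]$ with $\varphi(E[X])$ for a few convex choices of $\varphi$ on a small positive-valued distribution (the particular distribution is just for illustration):

```python
import math

def expectation(f, dist):
    # dist: list of (value, probability) pairs for a finite random variable
    return sum(p * f(x) for x, p in dist)

# X takes the values 1 and 3 with probability 1/2 each; the support is
# positive, so 1/x is convex on it (a toy distribution, just for illustration)
dist = [(1.0, 0.5), (3.0, 0.5)]
mean = expectation(lambda x: x, dist)  # E[X] = 2

for name, f in [("1/x", lambda x: 1 / x),
                ("x^2", lambda x: x * x),
                ("e^x", math.exp)]:
    lhs = expectation(f, dist)  # E[phi(X)]
    rhs = f(mean)               # phi(E[X])
    print(f"{name}: E[phi(X)] = {lhs:.4f} >= phi(E[X]) = {rhs:.4f}: {lhs >= rhs}")
```

For $\varphi(x) = 1/x$ this prints $2/3 \geq 1/2$, which is exactly the discrete counterexample from the other answer.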
Here is another counterexample, in addition to the one given by André Nicolas. Let $X$ be normally distributed with mean $\mu$ and variance one. Then $E[X]=\mu$, but $E[\frac{1}{X}]$ is not merely different from $1/\mu$: it does not exist at all, since the density of $X$ is positive near $0$ while $\frac{1}{|x|}$ is not integrable there.
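The divergence can be seen numerically as well. Taking $\mu = 0$ for concreteness, the sketch below estimates $\int_\delta^1 \frac{1}{x}\varphi(x)\,dx$ (with $\varphi$ the standard normal density) for shrinking $\delta$; the value grows without bound, roughly like $\varphi(0)\log(1/\delta)$, so the expectation integral cannot converge:

```python
import math

def phi(x):
    # density of the standard normal (the mu = 0 case, for concreteness)
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def tail_integral(delta, n=20000):
    # midpoint Riemann sum of (1/x) * phi(x) over [delta, 1],
    # on a geometric grid so the spike near zero is well resolved
    ratio = (1 / delta) ** (1.0 / n)
    total, a = 0.0, delta
    for _ in range(n):
        b = a * ratio
        m = math.sqrt(a * b)  # geometric midpoint of the cell
        total += phi(m) / m * (b - a)
        a = b
    return total

# Each time delta shrinks by a factor of 100, the integral grows by
# roughly phi(0) * log(100) ~ 1.84 -- there is no finite limit.
results = [tail_integral(d) for d in (1e-2, 1e-4, 1e-6)]
print(results)
```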
$y = \frac{1}{x}$ is a convex function for $x > 0$. So in general, if $X > 0$, Jensen's inequality gives
$$E(1/X) \geq \frac{1}{E(X)},$$
with equality only when $X$ is (almost surely) constant.
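One family where both sides have closed forms is the lognormal: if $\log X \sim N(\mu, \sigma^2)$, then $E[X] = e^{\mu + \sigma^2/2}$ and $E[1/X] = e^{-\mu + \sigma^2/2}$ (since $1/X$ is again lognormal), so $E[1/X]\,E[X] = e^{\sigma^2} \geq 1$, with equality exactly when $\sigma = 0$, i.e. when $X$ is constant. A small check:

```python
import math

def lognormal_moments(mu, sigma):
    # If log X ~ N(mu, sigma^2), then X and 1/X are both lognormal:
    # E[X] = exp(mu + sigma^2 / 2), E[1/X] = exp(-mu + sigma^2 / 2)
    return math.exp(mu + sigma**2 / 2), math.exp(-mu + sigma**2 / 2)

for sigma in (0.0, 0.5, 1.0, 2.0):
    e_x, e_inv = lognormal_moments(0.0, sigma)
    # Jensen gap: E[1/X] * E[X] = exp(sigma^2) >= 1, equality iff sigma = 0
    print(sigma, e_inv, 1 / e_x, e_inv >= 1 / e_x)
```

The gap $e^{\sigma^2}$ makes the point quantitatively: the more spread out $X$ is, the worse $1/E(X)$ is as a stand-in for $E(1/X)$.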