
This is basically a textbook question; however, I want to make sure about every derivation step.

$X$ and $Z$ are random variables.

$Z$ follows a Bernoulli distribution. The probability mass function of $Z$ is $h(z) = x\,\delta(1) + (1-x)\,\delta(0)$,

where $\delta(\cdot)$ is the delta (indicator) function, meaning $\delta(a)=1$ if $z=a$, and $\delta(a)=0$ otherwise.

The probability density function of $X$ is $f_X$.

Then we have $E[Z] = \int x f_X(x)\,dx$.

I was wondering if the derivation is: $E[Z] = E[1\times x + 0\times (1-x)] = E\left[1\times \int x f_X(x)\,dx\right] = \int x f_X(x)\,dx$

If it is not, would you please show me the derivation steps? Thanks!

Looking forward to your reply.

  • @BogdanLataianu: Yes, I did. (2012-02-05)

1 Answer


As suggested by leonbloy, I am converting my comment to an answer.

$X$ is a random variable that necessarily takes on values in $[0,1]$.

Conditioned on $X=x$, $0 \leq x \leq 1$, $Z$ is a Bernoulli random variable with parameter $x$. Thus, conditioned on $X=x$, the conditional expected value of $Z$ is $$E[Z \mid X=x] = 1\times x + 0\times (1-x) = x.$$

The conditional expected value of $Z$ depends on the value taken on by $X$; that is, it is a function of the random variable $X$, and this random variable is denoted $E[Z \mid X]$. In this instance, it is obvious that $E[Z \mid X]=X$ itself.

Now, the law of iterated expectations gives us $E[Z]=E[E[Z \mid X]]$, where it should be noted that the outer expectation is the expectation of a function of $X$. Thus, assuming that $X$ is a continuous random variable, we have $$E[Z]=E[E[Z \mid X]]=E[X]=\int_{-\infty}^\infty x f_X(x)\,\mathrm dx = \int_0^1 x f_X(x)\,\mathrm dx,$$ since $f_X(x) = 0$ for $x < 0$ or $x > 1$.
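The identity $E[Z]=E[X]$ can also be checked numerically. Here is a minimal Monte Carlo sketch; the choice $X \sim \mathrm{Uniform}(0,1)$ is my own illustrative assumption (any distribution supported on $[0,1]$ would do), under which the exact answer is $E[X] = 1/2$:

```python
import random

def simulate_E_Z(n=100_000, seed=0):
    """Monte Carlo estimate of E[Z], where Z | X=x ~ Bernoulli(x).

    X is taken to be Uniform(0,1) purely for illustration, so the
    law of iterated expectations predicts E[Z] = E[X] = 1/2.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(n):
        x = rng.random()                  # draw X ~ Uniform(0,1)
        z = 1 if rng.random() < x else 0  # draw Z | X=x ~ Bernoulli(x)
        total += z
    return total / n

print(simulate_E_Z())  # close to 0.5
```

With $n = 10^5$ samples, the standard error is roughly $0.5/\sqrt{n} \approx 0.0016$, so the estimate lands well within $0.01$ of $1/2$.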

  • Thanks for the response. I am marking this as the answer; however, I would like to revise it a bit: $E[Z]=E[E[Z \mid X]]=\int E[Z \mid X=x]f_X(x)\,dx=\int x f_X(x)\,dx$. Then the solution looks more direct and does not depend on the observation $E[Z \mid X]=X$. (2012-02-05)