
It's basically a textbook question, but I want to make sure about every derivation step.

X, Z are random variables.

$Z$ follows a Bernoulli distribution (conditionally on $X=x$). The probability mass function of $Z$ given $X=x$ is $h(z) = x\delta_1(z) + (1-x)\delta_0(z)$,

where $\delta_a(\cdot)$ is the point mass at $a$, meaning $\delta_a(z)=1$ if $z=a$ and $\delta_a(z)=0$ otherwise.

The probability density function of $X$ is $f_X$.

Then we have $E[Z] = \int x f_X(x)\,\mathrm dx$.

I was wondering if the derivation is $E[Z] = E[1\times X + 0\times (1-X)] = E[X] = \int x f_X(x)\,\mathrm dx$.

If it is not, would you please show me the derivation steps? Thanks!

Looking forward to your reply.

  • 2
    Conditioned on $X=x$, $Z$ is a Bernoulli random variable with parameter $x$, and thus $E[Z|X=x] = x$. Thus, $$E[Z]=E[E[Z|X]]=E[X]=\int_0^1 xf_X(x)\mathrm dx.$$ You need to make sure that $X$ takes on values in $[0,1]$ with probability $1$, else all the above will make no sense because you cannot have a Bernoulli random variable with parameter $1.2$, say. (2012-02-05)
  • 0
    You can start by writing $h(z)=\dots$ explicitly as a function. Also, $E[X] = \int xf_X(x)\,\mathrm dx$. (2012-02-05)
  • 0
    @DilipSarwate: that comment should be an answer. (2012-02-05)
  • 0
    I don't quite understand the formula for the density $h$ of $Z$. Maybe the OP or someone could explain? (2012-02-05)
  • 0
    @Tim I think it's like this: $\delta_a(z)=1$ if $z=a$ and $\delta_a(z)=0$ if $z\neq a$, and then define $h(z)=x\delta_1(z)+(1-x)\delta_0(z)$. So $\delta_a$ is a function that depends on $a$. (2012-02-05)
  • 0
    @BogdanLataianu: Thanks! So the probability measure of $Z$ is actually a random one, as a mapping of $X$? (2012-02-05)
  • 0
    @Tim $E(Z|X)$ is a random variable and a mapping of $X$, not the p.m.f. of $Z|X=x$. Note that $h(z)$ above is the p.m.f. of $Z|X=x$. Did you write $h(z)$ explicitly? I don't want to spoil the hint for Shuai. (2012-02-05)
  • 0
    @Shuai: You wrote $\delta(\cdot)=1$ if $z=\cdot$. You must have meant $\delta_a(z)=1$ if $z=a$, and also $\delta_a(z)=0$ otherwise, rather than $\delta(1)=0$ otherwise. (2012-02-05)
  • 0
    @Tim Unless you meant that the collection of p.m.f.'s of $Z|X=x$ is a random variable, which is an interesting question and I think it is true. But perhaps a sidetrack. (2012-02-05)
  • 0
    @BogdanLataianu: Yes, I did. (2012-02-05)

1 Answer


As suggested by leonbloy, I am converting my comment to an answer.

$X$ is a random variable that necessarily takes on values in $[0,1]$.

Conditioned on $X=x$, $0 \leq x \leq 1$, $Z$ is a Bernoulli random variable with parameter $x$. Thus, conditioned on $X=x$, the conditional expected value of $Z$ is $$E[Z|X=x] = 1\times x + 0\times (1-x) = x.$$

The conditional expected value of $Z$ depends on the value taken on by $X$; that is, it is a function of the random variable $X$, and this random variable is denoted $E[Z|X]$. In this instance, it is obvious that $E[Z|X]=X$ itself.

Now, the law of iterated expectations gives us that $E[Z]=E[E[Z|X]]$, where it should be noted that the outer expectation is the expectation of a function of $X$. Thus, assuming that $X$ is a continuous random variable, we have $$E[Z]=E[E[Z|X]]=E[X]=\int_{-\infty}^\infty xf_X(x)\mathrm dx = \int_0^1 xf_X(x)\mathrm dx$$ since $f_X(x) = 0$ for $x < 0$ or $x > 1$.
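As a quick sanity check on the identity $E[Z]=E[E[Z|X]]=E[X]$, here is a small Monte Carlo sketch. The choice $X \sim \mathrm{Uniform}(0,1)$ is an assumption made purely for illustration (the thread never specifies the distribution of $X$, only that it lives in $[0,1]$); under it, $E[X] = 1/2$, so both sample means should land near $0.5$.

```python
import random

# Monte Carlo check of E[Z] = E[E[Z|X]] = E[X].
# Assumption (for illustration only): X ~ Uniform(0, 1), so E[X] = 1/2.
random.seed(0)

n = 200_000
z_sum = 0.0
x_sum = 0.0
for _ in range(n):
    x = random.random()                  # draw X
    z = 1 if random.random() < x else 0  # draw Z | X=x ~ Bernoulli(x)
    z_sum += z
    x_sum += x

# Both estimates should be close to each other (and to 0.5 here).
print(z_sum / n, x_sum / n)
```

The two printed averages agree up to Monte Carlo noise, matching $E[Z]=E[X]$; replacing the uniform draw with any other distribution supported on $[0,1]$ should leave the agreement intact.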

  • 0
    Thanks for the response. I am marking this as the answer, but I would like to revise it a bit: $$E[Z]=E[E[Z|X]]=\int E[Z|X=x]f_X(x)\,\mathrm dx=\int xf_X(x)\,\mathrm dx.$$ Written this way, the solution does not depend on the observation that $E[Z|X]=X$. (2012-02-05)