7

While reading about the average-case analysis of randomized quicksort, I came across the linearity of expectation applied to indicator random variables. I know what an indicator random variable and an expectation are. What does linearity of expectation mean?

  • 0
@Geek: well, my comment was not about the context of your question - it was about the (preferred) way one asks questions on MSE: **the question has to be well-thought.** So, people who answer your question can rely upon this fact. If you get the answers to the original question, it is not expected that you will update your question *asking for more* (well, of course you can update it if asked to clarify the meaning of the original question). Please, take care of this in the future. In the meantime, I updated my answer where I mentioned how independence plays a role in products. (2012-08-21)

2 Answers

6

Let $\xi_1,\xi_2:\Omega\to\mathbb R$ be two random variables on the same probability space $(\Omega,\mathscr F,\mathsf P)$. The expectation of either is defined by $$\mathsf E\xi_i:= \int_\Omega \xi_i(\omega)\,\mathsf P(\mathrm d\omega).$$ The linearity of the expectation means that for any constants $\alpha_1,\alpha_2\in\Bbb R$ it holds that $$\mathsf E[\alpha_1\xi_1+\alpha_2\xi_2] = \alpha_1\mathsf E\xi_1+\alpha_2 \mathsf E\xi_2,$$ which follows directly from the linearity of the Lebesgue integral in the definition of the expectation. Hence, the functional $\mathsf E$ defined over the space of random variables on the probability space $(\Omega,\mathscr F,\mathsf P)$ is linear.
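As a minimal illustration (not from the original answer), here is a hypothetical Monte Carlo sketch in Python. The variables `x` and `y` below are deliberately *dependent*, yet the simulated averages still satisfy $E[2X+3Y]=2E[X]+3E[Y]$, which is the whole point: linearity needs no independence assumption.

```python
import random

# Hypothetical Monte Carlo check of linearity of expectation.
# X = number of heads in 3 coin flips, Y = indicator of the first flip.
# X and Y are dependent, yet E[2X + 3Y] = 2E[X] + 3E[Y] still holds.
random.seed(0)
N = 200_000

sum_x = sum_y = sum_combo = 0.0
for _ in range(N):
    flips = [random.randint(0, 1) for _ in range(3)]
    x = sum(flips)        # number of heads in 3 flips, E[X] = 1.5
    y = flips[0]          # indicator of the first flip, E[Y] = 0.5
    sum_x += x
    sum_y += y
    sum_combo += 2 * x + 3 * y

print("E[2X+3Y]      ~", sum_combo / N)                    # ~ 2*1.5 + 3*0.5 = 4.5
print("2E[X] + 3E[Y] ~", 2 * sum_x / N + 3 * sum_y / N)    # ~ 4.5 as well
```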

For products, on the other hand, independence does matter: if $\xi_1,\xi_2,\dots,\xi_n$ are random variables on the same probability space as above and they are mutually independent, then $$\mathsf E\left[ \prod_{i=1}^n\xi_i\right] = \prod_{i=1}^n\mathsf E\xi_i.$$
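A companion sketch (again an illustration under my own assumptions, not part of the answer) contrasts the two cases: for independent uniforms $E[XY]=E[X]E[Y]$, while for the fully dependent pair $(X,X)$ the product rule fails, since $E[X^2]=\tfrac13\neq\tfrac14=E[X]^2$.

```python
import random

# Sketch: the product rule E[XY] = E[X]E[Y] needs independence,
# whereas linearity of expectation does not.
random.seed(1)
N = 200_000

sum_x = sum_y = sum_xy = sum_xx = 0.0
for _ in range(N):
    x = random.random()   # Uniform(0, 1)
    y = random.random()   # independent Uniform(0, 1)
    sum_x += x
    sum_y += y
    sum_xy += x * y       # product of independent variables
    sum_xx += x * x       # x paired with itself: fully dependent

print("E[XY]  ~", sum_xy / N, " vs E[X]E[Y] ~", (sum_x / N) * (sum_y / N))  # both ~ 0.25
print("E[X*X] ~", sum_xx / N, " vs E[X]E[X] ~", (sum_x / N) ** 2)           # ~ 1/3 vs ~ 1/4
```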

  • 0
Well, you may consider it as a technical condition saying that the notion of the expectation (of a sum/product of several random variables) and independence of random variables are well-defined. (2012-08-21)
4

The expectation is a linear operator. This means it satisfies the linearity property of an operator, which is defined as

$f(ax_1+bx_2)=af(x_1)+bf(x_2).$

As an example with the expectation, by linearity

$E\left[\frac{1}{N}\sum_i X_i\right]=\frac{1}{N}\sum_i E[X_i].$ If, in addition, $E[X_i]=\mu$ for all $i$, then you get

$\frac{1}{N}\sum_i E[X_i]=\frac{1}{N}\cdot N\mu=\mu.$
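A quick numerical sketch of this sample-mean example (my own illustration; the distribution and parameter values are arbitrary choices): for i.i.d. $X_i$ with $E[X_i]=\mu$, the average of many simulated sample means lands near $\mu$, regardless of $N$.

```python
import random

# Sketch: for i.i.d. X_i with E[X_i] = mu, linearity gives
# E[(1/N) * sum_i X_i] = mu, whatever the sample size N is.
random.seed(2)
mu = 3.0
N = 50               # observations per sample mean
trials = 100_000     # Monte Carlo repetitions

total = 0.0
for _ in range(trials):
    sample = [random.gauss(mu, 1.0) for _ in range(N)]
    total += sum(sample) / N   # one sample mean

print("average of sample means ~", total / trials)   # ~ mu = 3.0
```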