
Say I have two random variables X and Y from the same class of distributions, but with different means and variances (X and Y are parameterized differently). Suppose the variances converge to zero as a function of n, while the means do not depend on n. Can it be formally proven, without giving the actual pdfs of X and Y, that their overlap area (defined as the integral over the entire domain of min(f,g), where f, g are the respective pdfs) converges to zero as n goes to infinity? Perhaps this is too obvious...?

  • 1
    Yes, use the Chebyshev Inequality (Wikipedia). Probably should have called the random variables $X_n$, $Y_n$.2012-02-09
  • 0
    What do you mean by "their overlap area"?2012-02-09

2 Answers

2

The answer is yes. Assume the means satisfy $\mu_X < \mu_Y$, and let $c = (\mu_X + \mu_Y)/2$ be the midpoint. The "overlap area" is

$$\int_{-\infty}^{\infty} \min(f_X(x),f_Y(x)) dx = \int_{-\infty}^c \cdots dx + \int_{c}^{\infty} \cdots dx$$

The second term satisfies:

$$\int_c^{\infty} \min(f_X(x),f_Y(x))\, dx \le \int_c^{\infty} f_X(x)\, dx = P(X \ge c) \le P\left(|X - \mu_X| \ge \epsilon\right)\le \frac{\sigma_X^2}{\epsilon^2}$$

where $\epsilon = c - \mu_X = (\mu_Y - \mu_X)/2 > 0$ (note that $X \ge c$ implies $|X - \mu_X| \ge \epsilon$), and the last step is Chebyshev's inequality. Because the variance $\sigma_X^2$ tends to zero while $\epsilon$ stays fixed, this term tends to zero; the first term is bounded the same way using $f_Y$ and $\sigma_Y^2$. Hence $\int_{-\infty}^{\infty} \min(f_X(x),f_Y(x))\, dx \to 0$.
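The question deliberately avoids fixing a particular family of distributions, but as a quick numerical sanity check (not part of the proof), one can take X and Y normal with means 0 and 1 and variance $1/n$, and approximate the overlap integral with a Riemann sum; the overlap shrinks rapidly as n grows:

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

def overlap_area(mu_x, mu_y, sigma_x, sigma_y):
    """Riemann-sum approximation of the integral of min(f_X, f_Y) over the real line."""
    x = np.linspace(min(mu_x, mu_y) - 10.0, max(mu_x, mu_y) + 10.0, 400_001)
    dx = x[1] - x[0]
    f = normal_pdf(x, mu_x, sigma_x)
    g = normal_pdf(x, mu_y, sigma_y)
    return float(np.minimum(f, g).sum() * dx)

# Means fixed at 0 and 1; variance 1/n shrinks with n, as in the question.
for n in (1, 10, 100, 1000):
    s = 1.0 / np.sqrt(n)
    print(n, overlap_area(0.0, 1.0, s, s))
```

The printed overlap decreases toward zero, matching the Chebyshev bound above (which decays like $\sigma^2/\epsilon^2$, though for Gaussians the true decay is much faster).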

0

As long as the means are different, the overlap goes to zero as the variances go to zero.

  • 0
    This is indeed the conclusion the OP is interested in and, presumably, cannot reach...2012-02-19
  • 0
    Maybe one could appeal to an argument based on measure theory, such that any point other than the two means has measure zero attached to it.2012-02-21
  • 0
    If you have a solution, please post it here. Otherwise, I suggest not presenting a mere rephrasing of the question as a solution.2012-02-21