
Find the probability that $x^2 - 2ax + b$ has complex roots if the coefficients $a$ and $b$ are independent random variables with the common density

  1. uniform, that is $1/h$, and
  2. exponential, that is $\alpha e^{-\alpha x}$.

This comes down to finding $P(a^2 \lt b)$, since the discriminant $4a^2 - 4b$ is negative exactly when $a^2 \lt b$. But since $a$ and $b$ are both random variables, would it be $P(a^2 \lt b) = P(a \lt k)\,P(b \lt k^2)$ for some fixed $k$? That doesn't seem right.
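The product form can't be right, because the event $\{a^2 \lt b\}$ couples the two variables; the standard route is to condition on $a$ and use independence: $P(a^2 \lt b) = \int P(b \gt t^2)\, f_a(t)\, dt$. Here is a minimal Monte Carlo sketch in Python, assuming the uniform case means $a, b$ i.i.d. on $[0, h]$ (per the comments below) and the exponential case means rate $\alpha$. The closed forms in the code comments follow from that conditioning integral and are included only as a cross-check.

```python
import numpy as np
from math import sqrt, pi, exp, erfc

rng = np.random.default_rng(0)
n = 10**6

# Case 1: a, b i.i.d. uniform on [0, h]  (assumed range).
h = 2.0
a = rng.uniform(0, h, n)
b = rng.uniform(0, h, n)
mc_unif = np.mean(a**2 < b)
# Conditioning on a: P(a^2 < b) = (1/h^2) * integral of (h - t^2) over
# t in [0, min(h, sqrt(h))], which evaluates to 2/(3*sqrt(h)) for h >= 1
# and 1 - h/3 for h < 1.
exact_unif = 2 / (3 * sqrt(h)) if h >= 1 else 1 - h / 3
print(f"uniform:     MC {mc_unif:.4f}  exact {exact_unif:.4f}")

# Case 2: a, b i.i.d. exponential with rate alpha (scale 1/alpha).
alpha = 1.0
a = rng.exponential(1 / alpha, n)
b = rng.exponential(1 / alpha, n)
mc_exp = np.mean(a**2 < b)
# P(b > t^2) = exp(-alpha*t^2), so P(a^2 < b) = integral over [0, inf) of
# alpha * exp(-alpha*t - alpha*t^2) dt; completing the square gives
# sqrt(pi*alpha)/2 * exp(alpha/4) * erfc(sqrt(alpha)/2).
exact_exp = sqrt(pi * alpha) / 2 * exp(alpha / 4) * erfc(sqrt(alpha) / 2)
print(f"exponential: MC {mc_exp:.4f}  exact {exact_exp:.4f}")
```

With $h = 2$ and $\alpha = 1$ the simulated frequencies should land within Monte Carlo error of roughly $0.4714$ and $0.5456$ respectively, which supports the conditioning approach over the product guess.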

  • 2
    I've tried to fix your TeX. Please check whether I have introduced any mistakes. Moreover, you probably mean non-real roots, otherwise it would be trivially $1$ :) (2011-04-18)
  • 1
    For 1, is the range of $a, b$ equal to $[0,h]$ or $[-\frac{h}{2},\frac{h}{2}]$, or something else? (2011-04-18)
  • 1
    I guess $a$ and $b$ are independent? (2011-04-18)
  • 0
    The range for 1) is $[0,h]$. And yes, $a$ and $b$ are independent. (2011-04-18)
  • 0
    The $a$ in the quadratic is probably not the same $a$ as in $ae^{-ax}$, so you should change the latter $a$ to something else. As it stands, sampling $a$ from the density $ae^{-ax}$ doesn't make much sense in this context. (2011-04-19)
  • 0
    Nitpick: the probability is 1. The polynomial always has complex roots, as the reals are a subset of the complex numbers. Perhaps the question should say "non-real" roots. (2011-04-19)

3 Answers