4

Suppose $F$ (continuous) is the CDF of a non-negative random variable, and $c_n$, $d_n$ are two positive sequences going to zero as $n \to \infty$, such that $\frac{c_n}{d_n} \to 1$. Can it be said that $\frac{F(c_n)}{F(d_n)}$ also goes to $1$? Are there any additional assumptions which would make this statement true?

  • 0
What do you think? If you had to guess, what would you say? (2017-01-25)
  • 0
It seems to me that the continuity of the CDF would ensure this; I am just not sure how I would prove it. (2017-01-25)
  • 0
I must say I would expect so as well. When I have problems like this one, there are two things I always consider: 1. a proof by contradiction; 2. writing down the definition of continuity and fiddling with it. Maybe you can get something out of this. (2017-01-25)
  • 0
So far I've only been able to prove it in the case where $F$ is differentiable at $0$ and $F'(0) > 0$. (2017-01-25)

1 Answer

0

It seems that there is no general way of answering this question without further information. For instance, with $F(x) = e^{-1/x}$ for $x > 0$, $c_n = \frac{1}{n}$ and $d_n = \frac{1}{n+\sqrt{n}}$, we have $\frac{c_n}{d_n} \to 1$ but $\frac{F(c_n)}{F(d_n)} = e^{\sqrt{n}} \to \infty$. One assumption which does make the statement true (which I mentioned in reply to RSerrao's comment) is that $F$ is differentiable at $0$ with $F'(0) > 0$. In that case we can write $\frac{F(c_n)}{F(d_n)}$ as $\frac{F(c_n)}{c_n} \cdot \frac{d_n}{F(d_n)} \cdot \frac{c_n}{d_n}$ and take limits.
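A minimal sketch of that limit computation: since $F$ is continuous and the random variable is non-negative, $F(0) = P(X \le 0) = 0$, so $\frac{F(c_n)}{c_n}$ is exactly the difference quotient of $F$ at $0$. Then

$$\frac{F(c_n)}{F(d_n)} \;=\; \frac{F(c_n)}{c_n} \cdot \frac{d_n}{F(d_n)} \cdot \frac{c_n}{d_n} \;\longrightarrow\; F'(0) \cdot \frac{1}{F'(0)} \cdot 1 \;=\; 1 \qquad (n \to \infty),$$

where $\frac{F(c_n)}{c_n} = \frac{F(c_n) - F(0)}{c_n - 0} \to F'(0)$ (and similarly for $d_n$), and the assumption $F'(0) > 0$ is what guarantees that the middle factor converges to the finite limit $\frac{1}{F'(0)}$.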