
I don't know whether "determine" is the right word, but let me try to explain what I need to understand. :) So, we know that if a function satisfies these conditions:

  • Monotonically non-decreasing in each of its variables
  • Right-continuous in each of its variables,

$$ 0 \le F(x_1,\ldots,x_n) \le 1, $$ $$ \lim_{x_1,\ldots,x_n\to\infty} F(x_1,\ldots,x_n)=1, $$ $$ \lim_{x_i\to-\infty} F(x_1,\ldots,x_n) = 0 \text{ for each } i, $$ then the function is (or at least can serve as) a cumulative distribution function.
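As a concrete one-dimensional instance of these conditions, here is a minimal Python sketch (my own illustration, not from the thread) of the step-function CDF of a Bernoulli$(\frac{1}{2})$ variable, which is non-decreasing, right-continuous, and has the required limits:

```python
def F(x):
    # CDF of a Bernoulli(1/2) variable: non-decreasing, right-continuous,
    # with F(x) -> 0 as x -> -inf and F(x) -> 1 as x -> +inf.
    if x < 0:
        return 0.0
    elif x < 1:
        return 0.5
    return 1.0

print([F(x) for x in (-10, -0.5, 0, 0.5, 1, 10)])
# [0.0, 0.0, 0.5, 0.5, 1.0, 1.0]
```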

By this logic, does the cumulative distribution function determine the random variable? How can I prove it mathematically? I believe it is true and understand it in my own way, but not mathematically.

Maybe we can start from the fact that the cumulative distribution function determines the probability distribution and vice versa. But how can I prove mathematically that the probability distribution determines the random variable?

Thanks for your explanation; I am really grateful. :)

  • Not all functions satisfying the conditions you have stated are necessarily cumulative distribution functions (CDFs). You also need (for the case $n=2$) that for all $a < b$ and $c < d$, $$P\{a < X \leq b,\, c < Y \leq d\} = F(b,d) - F(a,d) - F(b,c) + F(a,c) \geq 0,$$ and similarly for larger $n$. For example, $$F(x,y)= \begin{cases} 1, & x \geq 1 ~\text{or}~ y\geq 1,\\0, &\text{otherwise,}\end{cases}$$ is non-decreasing, right-continuous, etc., but is not a valid CDF. Also, the CDF does _not_ determine a random variable: note, for example, that $X\sim\text{Bernoulli}(0.5)$ and $1-X$ have the same CDF. (Both points are checked numerically in the sketch after these comments.) (2012-11-03)
  • Thank you for the answer. I don't quite understand yet, but I am trying to. Can you explain in more detail, with examples and expressions, why the CDF does not determine the random variable? Thank you very much. (2012-11-03)
  • The CDF determines what _kind_ of random variable you have, but not _which_ random variable you have. _All_ random variables with CDF $$F(x)=\begin{cases}0,&x< 0,\\\frac{1}{2},&0\leq x<1,\\1,&x\geq 1,\end{cases}$$ are _called_ Bernoulli random variables with parameter $\frac{1}{2}$. If $X\sim\text{Bernoulli}(\frac{1}{2})$, then $Y=1-X$ is also a Bernoulli random variable with parameter $\frac{1}{2}$, and $P\{X=Y\}=0$. If $X=1$ iff the first toss of a fair coin was H and $Y=1$ iff the second toss was H, then $X$ and $Y$ are _independent_ Bernoulli random variables and $P\{X=Y\}=\frac{1}{2}$. (See the simulation in the sketch after these comments.) (2012-11-03)
  • To add to my comment above: $X$ and $Y$ are the same _kind_ of random variable in my examples, but they are not the same _variable_ ($X$ is _not_ the same as $Y$; if they were the "same" _variable_, it would be the case that $P\{X=Y\}=1$). The technical term for this notion of being the same is equality _almost surely_ (abbreviated a.s.), and one would say that $X=Y$ a.s. Also, in my two examples, $X$ and $Y$ have different relationships: in the first, $Y$ is a function of $X$, while in the second, $X$ and $Y$ are mutually independent. (2012-11-03)
  • I understand now, thank you. Please write an answer and I will accept it, since you were the first. (2012-11-05)
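To make both points in the comments above concrete, here is a quick numerical sketch in Python (my own illustration; names and values are assumptions, not part of the thread). The first part evaluates the rectangle sum for the candidate $F(x,y)$ above and finds it negative; the second simulates $X\sim\text{Bernoulli}(\frac{1}{2})$ alongside $Y=1-X$ and an independent copy, showing that variables with the same CDF can have $P\{X=Y\}=0$ or $P\{X=Y\}=\frac{1}{2}$:

```python
import numpy as np

# Part 1: the candidate joint "CDF" from the first comment fails the
# rectangle inequality, so it cannot be a valid CDF.
def F(x, y):
    # Non-decreasing and right-continuous in each variable, yet not a CDF.
    return 1.0 if (x >= 1 or y >= 1) else 0.0

a, b, c, d = 0, 1, 0, 1
rect = F(b, d) - F(a, d) - F(b, c) + F(a, c)
print(rect)  # -1, but it should equal P{a < X <= b, c < Y <= d} >= 0

# Part 2: same CDF, different random variables.
rng = np.random.default_rng(0)
n = 100_000
X = rng.integers(0, 2, size=n)   # X ~ Bernoulli(1/2)
Y = 1 - X                        # same CDF as X, yet never equal to X
print(np.mean(X == Y))           # 0.0
X2 = rng.integers(0, 2, size=n)  # an independent Bernoulli(1/2)
print(np.mean(X == X2))          # ~0.5
```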

1 Answer


In general the CDF does not even determine the density function, let alone the random variable. Consider for instance the uniform distributions over $[a,b]$ and over $(a,b)$: the densities differ at the endpoints, but it is straightforward to check that the CDFs are identical.
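A quick sanity check (my own sketch, under the assumption $a=0$, $b=1$; not part of the original answer): sampling from the open interval and comparing the empirical CDF against the closed-interval formula shows the two agree, since the endpoints carry probability $0$:

```python
import numpy as np

a, b = 0.0, 1.0

def cdf_closed(x):
    # CDF of the uniform distribution on the closed interval [a, b].
    return np.clip((x - a) / (b - a), 0.0, 1.0)

rng = np.random.default_rng(1)
samples = rng.uniform(a, b, 100_000)  # effectively draws from (a, b)
xs = np.linspace(-0.5, 1.5, 9)
empirical = np.array([(samples <= x).mean() for x in xs])
print(np.max(np.abs(empirical - cdf_closed(xs))))
# small (on the order of 1/sqrt(n)): the two CDFs coincide
```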

  • And similarly a uniform distribution over the irrationals in $(a,b)$. But they are identical distributions except on a set of probability $0$. (2012-11-03)
  • Thank you, I understand now; this is a good example too. :) (2012-11-05)