
Suppose we have a set $M$ of random variables on a probability space. We define boundedness of $M$ as follows:

$M$ is bounded if $\sup_{X\in M}P(|X|>N) \to 0$ as $ N \to \infty$.

This definition means that the measure of the set where the elements of $M$ are large is small, and in fact tends to zero. I have some questions about this definition and some further conclusions.

Now suppose we have an unbounded set. My questions are:

  1. Unbounded would mean that for every $N>0$ there exist an $\epsilon > 0$ and an $X \in M$ such that $P(|X| > N) \ge \epsilon$. Is this conclusion right?
  2. If there is a sequence $(X_n)$ of random variables, unbounded and positive, then there is a subsequence $(X_{n_k})$ and a $\lambda > 0$ such that $P(|X_{n_k}| > k) \ge \lambda$ for every $k \in \mathbb{N}$.

My observations so far on 2. After Srivatsan's comment (see below), we have: there exists $\epsilon > 0$ such that for all $N>0$ there exists an $X_n$ such that $P(|X_n| > N) \ge \epsilon$. Put $N=1$; hence there is an $X_{n_1}$ such that $P(|X_{n_1}| > 1) \ge \epsilon$. Now put $N=2$; hence there is an $X_i$ such that $P(|X_i| > 2) \ge \epsilon$. Now the problem is: why do I know that $i > n_1$? Otherwise it isn't a subsequence.

Thanks for your help

hulik

  • I deleted question 3; there was a mistake. I also updated my thoughts about 2. I would appreciate it very much if someone could help me. 2011-12-19

1 Answer


(1) is not correct. Let $Y$ be a single random variable such that, for every $N$, we have $P(|Y| > N) > 0$; for example, a normal random variable. (I would call such a random variable "unbounded", but that conflicts with your terminology.) Then let $M = \{Y\}$. $M$ is bounded in your sense (exercise: verify this, using continuity of probability), but it is still true that for every $N$ there exist $X \in M$ (namely $X = Y$) and $\epsilon > 0$ (namely $\epsilon = \frac{1}{2}P(|Y|>N)$) such that $P(|X| > N) > \epsilon$.

Srivatsan's comment above gives a corrected statement; I just wanted to show explicitly that the statement in the original question (for every $N>0$ there exists an $\epsilon >0$ and $X \in M$ such that $P(|X|>N) \ge \epsilon$) is not correct.

Regarding (2): You know that there is an $\epsilon$ such that for any $N$ there is an $X_n$ with $P(|X_n|>N) > \epsilon$. In fact, for any $N$ there are infinitely many such $X_n$; thus you can always choose one that occurs later in the sequence than all the ones chosen so far. To see that there must be infinitely many such $X_n$, suppose there were only finitely many and show that we would then have to have $\sup_n P(|X_n|>N) \to 0$ as $N \to \infty$. (Warmup: what if there were only one such $X_n$? Now what if there were only two?)
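The inductive choice described above can be written out explicitly. Here is a sketch; the index sets $S_N$ are auxiliary notation introduced for bookkeeping, not part of the original statement:

```latex
% There is \epsilon > 0 such that for every N the index set
%   S_N = \{\, n : P(|X_n| > N) \ge \epsilon \,\}
% is infinite.
\begin{align*}
  &\text{Choose } n_1 \in S_1. \\
  &\text{Given } n_1 < \dots < n_{k-1}, \text{ choose } n_k \in S_k
    \text{ with } n_k > n_{k-1}
    \quad (\text{possible, since } S_k \text{ is infinite}). \\
  &\text{Then } P(|X_{n_k}| > k) \ge \epsilon \text{ for every } k,
    \text{ so take } \lambda = \epsilon.
\end{align*}
```

This resolves the worry in the question about $i > n_1$: because each $S_k$ is infinite, one can always pick the next index strictly larger than the previous one.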

Hint: Fix a random variable $X$. This is a measurable function $X : \Omega \to \mathbb{R}$; for every $\omega \in \Omega$, $|X(\omega)|$ is some real number, and so there is an integer (depending on $\omega$) which is greater than it. It follows that $\bigcap_{N \in \mathbb{N}} \{|X| \ge N\} = \emptyset$. Now "continuity from above" (which follows from countable additivity) implies that $\lim_{N \to \infty} P(|X| \ge N) = 0$. This is the key step you need.
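Written out symbolically, the hint amounts to the following chain (a sketch of the standard continuity-from-above argument):

```latex
\begin{align*}
  A_N &:= \{\, |X| \ge N \,\}, \qquad
    A_1 \supseteq A_2 \supseteq \cdots, \qquad
    \bigcap_{N \in \mathbb{N}} A_N = \emptyset, \\
  P(|X| \ge N) &= P(A_N)
    \;\xrightarrow[N \to \infty]{}\;
    P\Bigl(\bigcap_{N \in \mathbb{N}} A_N\Bigr) = P(\emptyset) = 0.
\end{align*}
```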

  • @Nate Eldredge: Thanks for your help! But there are still some questions: The case of just one such $X$ is clear. But how can I prove the general case? Suppose there are $k$ such $X_i$; then I know that $P(|X_i|>N) \to 0$ as $N \to \infty$ for each of them, as in the case of just one such $X$, right? Therefore the sup over all of them also tends to $0$. Is this the whole argument for the general case, or am I wrong? How else should I prove it? 2011-12-23