
I was reading the solution to this problem and noticed it used $X = \sum_{i=1}^n X_i$ and $Y = \sum_{i=1}^n Y_i$. I think I understand all other parts except this one. Would you please explain why $X = \sum_{i=1}^n X_i$ and $Y = \sum_{i=1}^n Y_i$?

*(image of the original problem and its posted solution)*

2 Answers


$X_k$ and $Y_k$ are always either $0$ or $1$. $X_k=1$ precisely when the $k$-th die shows a $1$, and $Y_k=1$ precisely when the $k$-th die shows a $2$. When you sum the $X_k$’s, you add $1$ for each die that shows a $1$ and $0$ for each die that shows something else, so the total is simply the number of dice showing a $1$. Similarly, when you sum the $Y_k$’s, you add $1$ for each die that shows a $2$ and $0$ for each die that shows something else, so the total is simply the number of dice showing a $2$.

For example, suppose that $n=8$, and the $8$ dice show $1,3,1,2,6,4,2,2$; then $X_1=X_3=1$, and all of the other $X_k$ are $0$, while $Y_4=Y_7=Y_8=1$, and the other five $Y_k$ are all $0$. Thus,

$$\begin{align*} X&=X_1+X_2+X_3+X_4+X_5+X_6+X_7+X_8\\ &=1+0+1+0+0+0+0+0\\ &=2\;, \end{align*}$$

which is indeed the number of dice showing a $1$, and

$$\begin{align*} Y&=Y_1+Y_2+Y_3+Y_4+Y_5+Y_6+Y_7+Y_8\\ &=0+0+0+1+0+0+1+1\\ &=3\;, \end{align*}$$

the number of dice showing a $2$.
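The worked example above can be checked mechanically; here is a minimal Python sketch that builds the indicators $X_k$ and $Y_k$ from the eight rolls and sums them (the variable names are illustrative, not from the answer):

```python
# The eight rolls from the example above.
rolls = [1, 3, 1, 2, 6, 4, 2, 2]

# X_k = 1 if the k-th die shows a 1, else 0; Y_k likewise for a 2.
X_indicators = [1 if r == 1 else 0 for r in rolls]
Y_indicators = [1 if r == 2 else 0 for r in rolls]

X = sum(X_indicators)  # number of dice showing a 1
Y = sum(Y_indicators)  # number of dice showing a 2

print(X, Y)  # 2 3, matching the sums worked out above
```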

  • 0
I've been told that capital $X$ and $Y$ are random variables; I never knew you could obtain the actual value of $X$ and $Y$ by summing the $X_k$ and $Y_k$. This summation method to obtain $X$ and $Y$ would not work if $X$ and $Y$ were, say, binomially distributed, correct?2012-11-21
  • 0
    @user133466: The distributions of $X$ and $Y$ are completely determined by those of the $X_k$ and the $Y_k$, respectively; if they’re binomial, they’re binomial, and if not, not. The random variables $X$ and $Y$ are **defined** to be these sums of other random variables. You can always define random variables in such fashion.2012-11-21
  • 0
Say we have $X \sim \operatorname{Bin}(n,p)$; what would big $X$ be equal to in this case?2012-11-21
  • 0
    @user133466: I don’t understand the question. As I said, the distribution of $X=\sum_{k=1}^nX_k$ depends entirely on the distributions of the random variables $X_k$, and you’ve not said anything about them.2012-11-21
  • 0
Does that mean that if $X_1,X_2,\dots,X_n$ are independent Poisson, $X$ would still equal $\sum_{i=1}^n X_i$?2012-11-21
  • 1
    @user133466: We **define** $X$ to be $\sum_{i=1}^nX_i$, so of course it’s equal to $\sum_{i=1}^nX_i$. I’m really not sure what you’re trying to ask. If the random variables $X_i$ are $1$ when something happens and $0$ when it doesn’t, then $X$ will count the number of successes, regardless of the distribution(s) of the $X_i$’s.2012-11-21
  • 0
    let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/6487/discussion-between-user133466-and-brian-m-scott)2012-11-21

Here $X$ is the number of $1$'s when you roll a fair die $n$ times.

Recall that the random variables $X_1, X_2, X_3, \dots, X_n$ were defined as follows. $X_1=1$ if we get a $1$ on the first roll, and $X_1=0$ otherwise.

$X_2=1$ if we get a $1$ on the second roll, and $X_2=0$ otherwise.

$X_3=1$ if we get a $1$ on the third roll, and $X_3=0$ otherwise.

In general, $X_i=1$ if we get a $1$ on the $i$-th roll, and $X_i=0$ otherwise.

Then $$X_1+X_2+X_3+\cdots+X_n$$ is the total number of times you got a $1$, that is, the number of $1$'s that you got. But $X$ is by definition the number of $1$'s you got.

Every time you get a $1$ on the die you write down a $1$, every time you don't get a $1$, you write down a $0$. So the sum of the numbers you write down is just the total number of $1$'s that you rolled.

Exactly the same idea works for $Y$. You write down a $1$ if you roll a $2$, and write down a $0$ if you don't roll a $2$. The sum of all the numbers you wrote down is the total number of $2$'s that you rolled.

This trick of using what are called indicator random variables $X_i$ is useful in many calculations.
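The "write down a $1$ or a $0$" description above can be sketched as a short simulation; this is an illustrative Python check (not part of the answer) that the indicator sum agrees with counting the $1$'s directly:

```python
import random

# Simulate n rolls of a fair die.
random.seed(0)
n = 10_000
rolls = [random.randint(1, 6) for _ in range(n)]

# X_i = 1 if roll i shows a 1, else 0 -- the "number you write down".
X_i = [1 if r == 1 else 0 for r in rolls]
X = sum(X_i)  # X = X_1 + X_2 + ... + X_n

# The sum of the indicators is exactly the count of 1's rolled.
assert X == rolls.count(1)

# Since E(X_i) = 1/6, linearity of expectation gives E(X) = n/6,
# so the observed count should be close to n/6 for large n.
print(X, n / 6)
```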

  • 0
I've been told that capital $X$ and $Y$ are random variables; I never knew you could obtain the actual value of $X$ and $Y$ by summing the $X_k$ and $Y_k$. This summation method to obtain $X$ and $Y$ would not work if $X$ and $Y$ were, say, binomially distributed, correct?2012-11-21
  • 0
    Actually, $X$ and $Y$ **are** binomially distributed random variables. Each time, for $X$, the probability of "success" is $1/6$. Every binomially distributed random variable is the sum of indicator random variables. For example, toss a coin $10$ times in a row. Let $X_i=1$ if on the $i$-th toss you get a head, and $X_i=0$ if you get a tail. Then $X_1+X_2+\cdots+X_{10}$ is the total number of heads. This as you know has binomial distribution.2012-11-21
  • 0
Looks like I need to look into indicator random variables; the explanation on [Wikipedia](http://en.wikipedia.org/wiki/Indicator_function) was not easy to understand. Do you know of a better explanation of the usage of indicator random variables?2012-11-21
  • 0
    No, sorry. They are used in many ways, sometimes very complicated. The simplest is for finding the mean and variance of the binomial. Since $X=X_1+\cdots+X_n$, we have $E(X)=E(X_1+\cdots+X_n)$. But expectation of a sum is sum of the expectations, and $E(X_i)$ is **very** easy to find. A not much harder argument gives you the variance of the binomial.2012-11-21
  • 0
I understand $E(X)=E(X_1+\cdots+X_n)$, but I did not know that $X = X_1 + \cdots + X_n$. Does that mean that if $X_1,X_2,\dots,X_n$ are independent Poisson, $X$ would still equal $\sum_{i=1}^n X_i$?2012-11-21
  • 0
I don't see how $E(X) = E(X_1+\cdots+X_n)$ is related to $X=X_1+\cdots+X_n$ ...2012-11-21
  • 0
    In general, for **any** random variables $U$ and $V$, we have $E(U+V)=E(U)+E(V)$.2012-11-21
  • 0
    let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/6486/discussion-between-user133466-and-andre-nicolas)2012-11-21
  • 0
    @user133466: Sorry, after one unpleasant experience, and one barely tolerable one, I avoid chat.2012-11-21
  • 0
    still, thank you for your help!!2012-11-21
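The mean-of-the-binomial argument sketched in the comments above can be written out in a few lines (the final numeric value assumes the fair-die setting of the question, where "success" is rolling a $1$):

```latex
% Each indicator satisfies E(X_i) = P(X_i = 1) = p, and expectation
% is additive, so
\begin{align*}
E(X) &= E(X_1 + X_2 + \cdots + X_n)\\
     &= E(X_1) + E(X_2) + \cdots + E(X_n)\\
     &= np\;.
\end{align*}
% For a fair die with "success = rolling a 1", p = 1/6, so E(X) = n/6.
```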