
Toss a coin $n$ times. Let $X_n = \#\text{heads} - \#\text{tails}$. If the probability of heads is $p$, find the distribution and expectation of $X_n$.

I know how to find the distribution if I toss the coin two, three, or four times. For example, for two tosses: \begin{array}{l|c} \text{Outcome} & X = \#\text{heads} - \#\text{tails}\\\hline HH & 2 \\ HT & 0 \\ TH & 0 \\ TT & -2 \\ \end{array}

$$F(x)= P(X\leq x) = \begin{cases} 0, & \text{if } x < -2, \\ \frac{1}{4}, & \text{if } -2 \leq x < 0, \\ \frac{3}{4}, & \text{if } 0 \leq x < 2, \\ 1, & \text{if } x \geq 2. \end{cases}$$
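As a sanity check of the two-toss table above, the CDF can be verified by enumerating the four outcomes in Python (a minimal sketch; note it assumes a fair coin, $p=\frac12$, so that all outcomes are equally likely):

```python
from itertools import product
from fractions import Fraction

# Enumerate all 2^2 outcomes; each is equally likely only when p = 1/2.
outcomes = list(product("HT", repeat=2))
x_values = [seq.count("H") - seq.count("T") for seq in outcomes]

def F(x):
    """CDF P(X <= x) under the fair-coin assumption."""
    return Fraction(sum(1 for v in x_values if v <= x), len(x_values))

print(F(-3), F(-2), F(0), F(2))  # 0 1/4 3/4 1
```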

How would I find the distribution for $n$ tosses?

I know that the expectation of $X$ is:

$$E\{X\} = \sum_{i=1}^{\infty} x_i P\{\Lambda_i\}$$

How would I tackle this?

  • Why $\frac14$ and $\frac34$? (2017-02-27)
  • $\frac{1}{4}$ since $x<0$, which is only TT. $\frac{3}{4}$ since $x<2$, which is HT, TH, and TT. (2017-02-27)
  • No, for example TT has probability $(1-p)^2$. (2017-02-27)
  • I got the values because of $F(x)=P(X \leq x) \rightarrow P(X\leq 0)= \frac{1}{4}$. (2017-02-27)
  • Again, $P(X<0)=(1-p)^2$, not $\frac14$. (2017-02-27)
  • Oh okay, I got the information from another user and that was how I was getting it. (2017-02-27)

3 Answers


Note that, denoting by $h$ the number of heads and by $t$ the number of tails, you have $h-t=h-(n-h)=2h-n$. This means that the resulting probabilities can be easily obtained from the binomial distribution giving the probability of $h$ heads out of $n$ tosses, ${n\choose h}p^h(1-p)^{n-h}$, "doubled and shifted by $n$". E.g. the expectation of the difference between heads and tails is $2pn-n=n(2p-1)$.
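A quick numeric check of that expectation (a minimal sketch; the values $n=10$, $p=0.3$ are arbitrary illustrative choices):

```python
from math import comb

def expected_difference(n, p):
    """E[#heads - #tails], summed over the binomial pmf of h heads."""
    return sum((2 * h - n) * comb(n, h) * p**h * (1 - p)**(n - h)
               for h in range(n + 1))

n, p = 10, 0.3
print(expected_difference(n, p))  # agrees with n*(2p - 1) = -4 up to rounding
```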


Let another variable $Y_n$ be $\frac{X_n+n}{2}$.

Then when heads is tossed, $Y$ goes up by $1$, while if tails is tossed, $Y$ "goes up" by $0$. So $Y_n$ is the sum of $n$ independent binary trials, with "success" probability $p$.

So $Y_n$ follows the binomial distribution, which is closely related to the binomial expansion of $\big(p+(1-p)\big)^n$: $$ P(Y_n=k) = \binom{n}{k}p^k(1-p)^{n-k} $$

From that, $$ P(X_n=k) = \binom{n}{\frac{k+n}{2}}p^\frac{n+k}{2}(1-p)^{\frac{n-k}{2}} $$ for $|k|\leq n$ with $n+k$ divisible by $2$, and of course $0$ otherwise.
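A small sketch checking that this pmf is a valid distribution (it sums to $1$ and vanishes off the parity-respecting support); the helper name `pmf_X` and the values $n=5$, $p=0.4$ are my own choices:

```python
from math import comb

def pmf_X(n, k, p):
    """P(X_n = k) with X_n = #heads - #tails after n tosses."""
    if abs(k) > n or (n + k) % 2 != 0:
        return 0.0
    h = (n + k) // 2  # number of heads giving difference k
    return comb(n, h) * p**h * (1 - p)**(n - h)

n, p = 5, 0.4
print(sum(pmf_X(n, k, p) for k in range(-n, n + 1)))  # ≈ 1.0
print(pmf_X(n, 0, p))  # 0.0, since n + k = 5 is odd
```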


It may be more useful to characterize $X_n$ in terms of more familiar probability distributions; i.e., if $X_n$ counts the number of heads minus the number of tails in $n$ independent Bernoulli trials, then this is the same as twice the number of heads minus the total number of trials $n$. So define $Y_n \sim \operatorname{Binomial}(n,p)$ as the random number of heads obtained, hence $$X_n = 2Y_n - n.$$ Now it should be straightforward to calculate the expectation.

As for specifying the probability mass function, we have $$\Pr[X_n = 2y - n] = \Pr[Y = y] = \binom{n}{y} p^y (1-p)^{n-y}, \quad y = 0, 1, 2, \ldots, n.$$ Inverting this gives $$\Pr[X_n = x] = \begin{cases}\binom{n}{(n+x)/2} p^{(n+x)/2} (1-p)^{(n-x)/2}, & x \in S \\ 0, & x \not\in S, \end{cases}$$ where $S = \{-n, -n+2, \ldots, n-2, n\}.$
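To illustrate the transformation $X_n = 2Y_n - n$ concretely (a sketch; $n=6$, $p=0.7$ are arbitrary choices), one can map the binomial pmf of $Y_n$ onto the support $S$ and recover the expectation $n(2p-1)$:

```python
from math import comb

def pmf_from_binomial(n, p):
    """Push the Binomial(n, p) pmf of Y forward through X = 2Y - n."""
    return {2 * y - n: comb(n, y) * p**y * (1 - p)**(n - y)
            for y in range(n + 1)}

n, p = 6, 0.7
pmf = pmf_from_binomial(n, p)
print(sorted(pmf))                         # the support S: -6, -4, ..., 6
print(sum(x * q for x, q in pmf.items()))  # ≈ n*(2p - 1) = 2.4
```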