  1. David repeatedly flips a fair coin. Find the expected value of the total number of heads he will flip before flipping two consecutive tails.

I know that the expected value is the sum of each value multiplied by its probability, and that the probabilities sum to $1$; however, I find this hard to approach, so can anyone help me?

  • 0
    One approach is to try setting up a Markov chain and deriving a set of equations to solve for the waiting time from the starting state. (2017-02-13)
  • 0
    Is casework another approach? This problem is designed for 10th grade or below, so I think that chain thing is not valid. (2017-02-13)
  • 0
    Do you know about conditional expectations? You should be able to get an elegant solution by conditioning on the number of flips given whether or not you flip a tails directly after flipping the first tails. This essentially uses the fact that the system probabilistically renews each time a heads is flipped. (2017-02-13)

4 Answers

0

Let $T$ be the number of consecutive tails flipped in the past two flips.

$T$ can take three values, $T \in \{ 0 , 1 , 2 \}$, with $T = 2$ ending the game.

We define $X$ to be the number of heads, and define its expected value, conditioned on the current state $T$, recursively as follows:

$\mathbb{E}(X|0) = \frac{1}{2} \cdot \{1 + \mathbb{E}(X|0) \} + \frac{1}{2} \cdot \{0 + \mathbb{E}(X|1)\} $

$\mathbb{E}(X|1) = \frac{1}{2} \cdot \{0\} + \frac{1}{2}\cdot \{1 + \mathbb{E}(X|0) \} $

Solving, we find $\mathbb{E}(X|0) = 3$: the first equation rearranges to $\mathbb{E}(X|0) = 1 + \mathbb{E}(X|1)$, and substituting the second equation gives $\mathbb{E}(X|0) = 1 + \frac{1}{2} + \frac{1}{2}\mathbb{E}(X|0)$, hence $\mathbb{E}(X|0) = 3$.

Hence, there are 3 expected heads before two consecutive tails.

Detailed explanation: We begin in state $0$, which means we have no tails in the last two flips. With probability $\frac{1}{2}$ we flip tails and move to state $1$, adding essentially nothing to our expected value; with probability $\frac{1}{2}$ we flip heads and stay in the same state, but add $1$ to our expected value.

Similarly, when we reach state $1$, we can flip tails with probability $\frac{1}{2}$ and the game ends (adding $0$ to our expected value), or with probability $\frac{1}{2}$ we can flip heads and return to state $0$, since we need to start the tail streak all over again, adding $1$ to the expected number of heads in the process.
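As a sanity check (not part of the original argument), the closed-form solution of the two-state recursion can be compared against a quick simulation; the `tail_run` variable below plays the role of the answer's state $T$:

```python
import random

# Solving the answer's system by hand:
#   E0 = 1/2*(1 + E0) + 1/2*E1   and   E1 = 1/2*0 + 1/2*(1 + E0)
# gives E0 = 1 + E1 and E1 = 1/2 + E0/2, hence E0 = 3 exactly.
E0 = 3.0

def heads_before_double_tails():
    """Simulate one game; return the number of heads flipped before TT."""
    heads, tail_run = 0, 0
    while tail_run < 2:
        if random.random() < 0.5:   # heads: streak of tails resets
            heads += 1
            tail_run = 0
        else:                       # tails: streak grows
            tail_run += 1
    return heads

random.seed(1)
trials = 200_000
estimate = sum(heads_before_double_tails() for _ in range(trials)) / trials
print(estimate)   # close to 3
```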

3

Here is a combinatorial approach with an answer based upon generating functions. We develop a generating function \begin{align*} G(z)=\sum_{n=2}^\infty g_n z^n \end{align*} with $g_n$ counting the number of valid words of length $n$. These are binary words from a two character alphabet $V=\{T,H\}$ which have precisely one run of tails of length $2$ at the end of the word.

We can use $G(z)$ to calculate the expectation value $E(Z)$ since \begin{align*} E(Z)=\sum_{n=2}^\infty n \frac{g_n}{2^n} =\frac{1}{2}\cdot\left.\frac{d}{dz}\left(G(z)\right)\right|_{z=\frac{1}{2}}\tag{1} \end{align*}

We observe

  • The generating function $G(z)$ starts with $g_2z^2=z^2$ since the shortest valid word is $\color{blue}{TT}$.

  • There is only one valid word with length $3$, namely $H\color{blue}{TT}$.

  • Each valid word with length $n\geq 3$ ends with $H\color{blue}{TT}$.

The idea is to build valid words of length $n\geq 3$ by creating words of length $n-3$ with no two consecutive tails and appending $H\color{blue}{TT}$. In order to do so we start with a generating function for words over the two character alphabet $V=\{T,H\}$ which counts words with no consecutive equal characters at all.

These words are called Smirnov or Carlitz words. See example III.24 Smirnov words from Analytic Combinatorics by Philippe Flajolet and Robert Sedgewick for more information. (You might also find this answer helpful.)

The generating function $A(z)$ counting Smirnov words over a two character alphabet is according to the reference \begin{align*} A(z)=\left(1-\frac{2z}{1+z}\right)^{-1} \end{align*}

The coefficient of $z^n$ in $A(z)$ gives the number of Smirnov words of length $n$, i.e. the number of words with no two consecutive equal characters.

Since there is no restriction on the runs of heads, we can replace each character $H$ in a Smirnov word by one or more $H$s, which amounts to the substitution \begin{align*} z\longrightarrow z+z^2+z^3+\cdots=\color{blue}{\frac{z}{1-z}} \end{align*} in the corresponding generating function $A(z)$.

Based upon $A(z)$ we obtain this way a generating function $B(z)$ with \begin{align*} B(z)&=\left(1-\frac{z}{1+z}-\frac{\color{blue}{\frac{z}{1-z}}}{1+\color{blue}{\frac{z}{1-z}}}\right)^{-1}\\ &=\left(1-\frac{z}{1+z}-z\right)^{-1}\\ \end{align*}

The coefficient of $z^n$ in $B(z)$ gives the number of words of length $n$ with no two consecutive tails.

Since valid words of length $n\geq 3$ end in $H\color{blue}{TT}$, we account for this by multiplying $B(z)$ by $z^3$. The special case $TT$, the only valid word which does not end in $H\color{blue}{TT}$, must be added separately, and we obtain the generating function $G(z)$ counting all valid words as \begin{align*} G(z)&=z^2+z^3B(z)\\ &=z^2+z^3\left(1-\frac{z}{1+z}-z\right)^{-1}\\ &=\frac{z^2}{1-z-z^2}\tag{2}\\ &=z^2+z^3+2z^4+3z^5+5z^6+\cdots \end{align*} Note that (2) is essentially the generating function of the Fibonacci numbers. The coefficients give the number of valid words of length $n$ \begin{align*} [z^2]G(z)=1:&\quad TT\\ [z^3]G(z)=1:&\quad HTT\\ [z^4]G(z)=2:&\quad HHTT, THTT\\ [z^5]G(z)=3:&\quad HHHTT, HTHTT, THHTT\\ [z^6]G(z)=5:&\quad HHHHTT, THHHTT, HTHHTT, HHTHTT, THTHTT \end{align*}
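These small coefficients can be checked by brute force; the snippet below (an illustration, not part of the derivation) enumerates all words over $\{H, T\}$ whose only run of two tails sits at the very end:

```python
from itertools import product

def is_valid(word):
    """A word is valid iff it ends in 'TT' and that is its only 'TT' run."""
    # Checking word[:-1] also rules out a final run of three or more tails.
    return word.endswith("TT") and "TT" not in word[:-1]

counts = []
for n in range(2, 7):
    counts.append(sum(is_valid("".join(w)) for w in product("HT", repeat=n)))
print(counts)   # [1, 1, 2, 3, 5] -- the Fibonacci coefficients of G(z)
```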

Finally, we obtain from $G(z)$ according to (1) the expectation value \begin{align*} \color{blue}{E(Z)}&=\frac{1}{2}\cdot\left.\frac{d}{dz}\left(G(z)\right)\right|_{z=\frac{1}{2}}\\ &=\frac{1}{2}\cdot\left.\frac{d}{dz}\left(\frac{z^2}{1-z-z^2}\right)\right|_{z=\frac{1}{2}}\\ &=\frac{1}{2}\cdot\left.\frac{z(2-z)}{(1-z-z^2)^2}\right|_{z=\frac{1}{2}}\\ &=\color{blue}{6} \end{align*}
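Formula (1) can also be checked numerically (a quick cross-check, not part of the answer): the coefficients of $G(z)=z^2/(1-z-z^2)$ satisfy the Fibonacci recurrence $g_n = g_{n-1} + g_{n-2}$, so the partial sums of $\sum n\, g_n/2^n$ should approach $6$:

```python
from fractions import Fraction

# Coefficients of G(z) = z^2/(1 - z - z^2): g_2 = g_3 = 1, then Fibonacci.
g_prev, g_curr = 1, 1                              # g_2, g_3
total = 2 * Fraction(1, 4) + 3 * Fraction(1, 8)    # n = 2 and n = 3 terms
for n in range(4, 200):
    g_prev, g_curr = g_curr, g_prev + g_curr       # g_n = g_{n-1} + g_{n-2}
    total += n * Fraction(g_curr, 2**n)
print(float(total))   # approaches 6
```

The series converges because $g_n$ grows like $\varphi^n$ and $\varphi/2 < 1$.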

  • 0
    Given that the OP said this problem was for 10th graders, this approach seems to be overkill for their purposes. With that said, this is an absolutely wonderful answer and I'm sure other readers (like myself) will find it very informative! (2017-02-19)
  • 1
    @David: Thanks, and I agree it is overkill. In fact I had two aspects in mind for 10th graders: to show that there can be very different approaches to solving a problem, and so that at least the term generating function might be remembered as useful in the future. (2017-02-19)
2

Excuse my brevity; I'm on my phone. Let $p$ be the probability of flipping tails. Let $N$ be the number of flips until you stop. Let $Y_1$ be the number of flips until we get our first tails. Note: we know the distribution of $Y_1$ to be geometric. Let $Y_2$ be the number of flips until the second tails. Clearly, for any realization where $Y_2=Y_1+1$ we have that $N=Y_2$. Now, by conditioning we have that $$\begin{align}\mathsf E(N) &= {\mathsf E(N\mid Y_2=Y_1+1)~\mathsf P(Y_2=Y_1+1) + \mathsf E(N\mid Y_2>Y_1+1)~\mathsf P(Y_2>Y_1+1)} \\ & = \mathsf E(Y_1+1)~p+(\mathsf E(Y_1+1)+\mathsf E(N))~(1-p)\end{align}$$ Then just solve the above equation; the only unknown is $\mathsf E(N)$. The second equality uses the fact that if $Y_2>Y_1+1$ then the $(Y_1+1)$st flip is a heads, so the process "resets" and the expected number of additional flips is as if you had never flipped the coin before, i.e., $\mathsf E(N)$.
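Carrying out the one-line solve (a sketch, assuming a fair coin so the relevant probability is $1/2$, and the geometric waiting time has mean $1/p$):

```python
# Rearranging E(N) = A*p + (A + E(N))*(1 - p) with A = E(Y1 + 1)
# gives E(N)*p = A, i.e. E(N) = A / p.
p = 0.5
A = 1 / p + 1    # geometric mean 1/p, plus one extra flip
EN = A / p
print(EN)        # 6.0 flips on average until two consecutive tails
```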

0

Condition on the number of heads, $n$: each of the $n$ heads may be preceded by at most one tail (more would end the game early), and the final two tails contribute the factor $\frac{1}{2^2}$:

$E(X)=\left(1\left(\frac{1}{2}+\frac{1}{2^2}\right)+2\left(\frac{1}{2^2}+\binom{2}{1}\frac{1}{2^3}+\frac{1}{2^4}\right)+3\left(\frac{1}{2^3}+\binom{3}{1}\frac{1}{2^4}+\binom{3}{2}\frac{1}{2^5}+\binom{3}{3}\frac{1}{2^6}\right)+\cdots\right)\frac{1}{2^2}$

$= \left(\sum_{n=1}^\infty \frac{n}{2^n}\left(\binom{n}{0}\frac{1}{2^0}+\binom{n}{1}\frac{1}{2^1}+\binom{n}{2}\frac{1}{2^2}+\cdots+\binom{n}{n}\frac{1}{2^n}\right)\right)\frac{1}{2^2}$

$=\left(\sum_{n=1}^\infty \frac{n}{2^n}\left(\frac{3}{2}\right)^n\right)\frac{1}{2^2}$

$=\left(\sum_{n=1}^\infty n\left(\frac{3}{4}\right)^n\right)\frac{1}{2^2}$

$=3$
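As a quick numerical cross-check (not part of the original answer), the partial sums of $\frac{1}{4}\sum_{n\geq 1} n\left(\frac{3}{4}\right)^n$ do converge to $3$:

```python
# Partial sum of (1/4) * sum_{n>=1} n * (3/4)**n; the tail beyond n = 500
# is negligible, so this should be essentially exact.
partial = sum(n * (3 / 4) ** n for n in range(1, 500)) / 4
print(partial)   # approaches 3
```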