
I keep tossing a fair coin until the pattern HTHT appears. What is the probability that I have already observed the sequence TTH before I stop?

Edit:

Okay, so this problem is the same as the one before. Now I am thinking about the following variant. Could anyone please tell me whether it is also the same as before, and if not, how to solve it?

Two players A and B toss two different fair coins. A tosses his coin until he sees HTHT appear for the first time, and records the number of tosses, $X$. B tosses his coin until he sees TTH appear for the first time, and records the number of tosses made, $Y$. What is the probability that $Y<X$?

  • @Qiang Li: Better. This new question is different from the one I answered before. (2011-02-05)

2 Answers


By the standard techniques explained here in a similar context, one can show that the first time $T_A$ at which the word A=HTHT is produced has generating function
$$ E(s^{T_A})=\frac{s^4}{a(s)},\qquad a(s)=(2-s)(8-4s-s^3)-2s^3, $$
and that the first time $T_B$ at which the word B=TTH is produced has generating function
$$ E(s^{T_B})=\frac{s^3}{b(s)},\qquad b(s)=(2-s)(4-2s-s^2). $$
The next step would be to develop these as power series in $s$, getting
$$ E(s^{T_A})=\sum_{n\ge4}p_A(n)s^n,\qquad E(s^{T_B})=\sum_{k\ge3}p_B(k)s^k, $$
and finally, to compute the sum
$$ P(T_B<T_A)=\sum_{n\ge4}p_A(n)\sum_{k=3}^{n-1}p_B(k). $$

An alternative solution is to consider the Markov chain on the space of the couples of prefixes of the words A and B and to solve directly the associated first hitting time problem for this Markov chain, as explained, once again, here.
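For concreteness, the series step can be carried out numerically. Expanding the denominators gives $a(s)=16-16s+4s^2-4s^3+s^4$ and $b(s)=8-8s+s^3$, so the coefficients $p_A(n)$ and $p_B(k)$ satisfy linear recurrences. The following is a sketch of mine (not part of the original answer) that iterates those recurrences and evaluates the truncated double sum; the tails decay geometrically, so the truncation error is negligible.

```python
N = 2000  # truncation order; both coefficient sequences decay geometrically

# p_A(n) from a(s) * sum_n p_A(n) s^n = s^4, with a(s) = 16 - 16s + 4s^2 - 4s^3 + s^4
pA = [0.0] * (N + 1)
pA[4] = 1 / 16
for n in range(5, N + 1):
    pA[n] = (16 * pA[n - 1] - 4 * pA[n - 2] + 4 * pA[n - 3] - pA[n - 4]) / 16

# p_B(k) from b(s) * sum_k p_B(k) s^k = s^3, with b(s) = 8 - 8s + s^3
pB = [0.0] * (N + 1)
pB[3] = 1 / 8
for k in range(4, N + 1):
    pB[k] = (8 * pB[k - 1] - pB[k - 3]) / 8

# cumulative distribution of T_B, then P(T_B < T_A) = sum_n p_A(n) * P(T_B <= n-1)
cums = [0.0] * (N + 1)
running = 0.0
for m in range(N + 1):
    running += pB[m]
    cums[m] = running

prob = sum(pA[n] * cums[n - 1] for n in range(4, N + 1))
print(prob)  # ≈ 0.763
```

One can sanity-check the recurrences by noting that $p_A(4)=1/16$ and $p_B(3)=1/8$ are exactly the probabilities of seeing each word in its first possible position, and that both sequences sum to $1$.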

(+1 to Arturo's and Mike's comments.)

Added later: Introduce the decomposition into simple elements
$$ \frac1{(1-s)b(s)}=\sum_{i=1}^4\frac{c_i}{1-s\gamma_i}. $$
Then, one can show that
$$ P(T_B<T_A)=\sum_{i=1}^4\frac{c_i}{a(\gamma_i)}, $$
by decomposing the rational fractions $1/b(s)$ and $1/a(s)$ into simple elements, by relying on the fact that all their poles are simple, and by using some elementary algebraic manipulations.
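As a numerical check (my own sketch, not part of the answer), one can evaluate this closed form directly: writing $q(s)=(1-s)b(s)$, the poles $\gamma_i=1/s_i$ come from the roots $s_i$ of $q$, namely $s=1$, $s=2$, and the two roots of $4-2s-s^2$, and the residue computation gives $c_i=-1/\bigl(s_i\,q'(s_i)\bigr)$.

```python
import numpy as np

def a(s):
    # denominator for the word HTHT, as in the answer
    return (2 - s) * (8 - 4 * s - s**3) - 2 * s**3

def q_prime(s):
    # q(s) = (1 - s) * b(s) with b(s) = (2-s)(4-2s-s^2) = 8 - 8s + s^3
    b = 8 - 8 * s + s**3
    b_prime = -8 + 3 * s**2
    return -b + (1 - s) * b_prime

# roots s_i of q(s): s = 1, s = 2, and the roots of s^2 + 2s - 4 = 0
roots = np.concatenate(([1.0, 2.0], np.roots([1, 2, -4])))

# c_i = -1 / (s_i * q'(s_i)), and P(T_B < T_A) = sum_i c_i / a(gamma_i), gamma_i = 1/s_i
p = sum(-1 / (s * q_prime(s)) / a(1 / s) for s in roots)
print(p)  # ≈ 0.763
```

Note that the root $s=1$ contributes $c=1$ and $a(1)=1$, i.e. a term equal to $1$; the other three terms are the corrections that bring the probability below $1$.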

At this point, I feel it is my duty to warn the OP that to ask question after question on math.SE or elsewhere along the lines of this post is somewhat pointless. Much more useful would be to learn once and for all the basics of the theory of finite Markov chains, for instance by delving into the quite congenial Markov chains by James Norris, and, to learn to manipulate power series, into the magnificent generatingfunctionology by Herbert Wilf.

  • @Didier, finally I got it. I was being dumb about the summation over $k$. I was wondering, since there seems to be quite some algebraic manipulation here: is there any way to foresee the reason for dividing by $(1-s)$ in $1/b(s)$, and for the final $\frac{c_i}{a(\gamma_i)}$? I am just thinking about more general patterns here. Thanks again! (2011-02-08)

This answer addresses the question as originally posted; I made it a community wiki because it is really an extended comment that would not have fit as a comment.

The previous question asked, inter alia, the following: given a two-sided coin with probability $p$ of showing heads that is tossed until either HTHTH or HTHH appears, how does one compute the probability that the pattern obtained was HTHTH?

Mike Spivey's answer cast the problem as a two-player game, in which one player wins if the first pattern to show up is HTHTH and the second player wins if the first pattern to show up is HTHH, and showed how to obtain the probability that each player wins.

The current problem has a fair coin, and asks that the coin be tossed until a particular pattern, HTHT, shows up, and then asks what is the probability that another pattern, TTH, actually occurs before HTHT shows up.

Think of this problem as a two-player game again: the coin is tossed until either HTHT or TTH shows up. The first player wins if HTHT shows up first, the second if TTH shows up first. The exact same approach as used in the previous question works here to obtain the probability that the second player wins, i.e., the probability that TTH will show up first.

Now, you state in comments that you don't see this problem as the same as that game, because in this problem you keep tossing until HTHT appears, instead of stopping when TTH shows up. Well, take the game described above, and imagine that the two players are utterly bored, so that if player two wins they keep tossing the coin until HTHT shows up. Will that change the probability that the second player wins? No. The second player still wins if TTH shows up first, regardless of whether you keep tossing the coin after he wins. So the probability that TTH showed up at least once before you hit HTHT is the same whether or not you keep tossing after TTH shows up: the only discrepancy would come from sequences in which TTH shows up but HTHT never does, and these have probability $0$, so they do not affect the answer. (The probability that a particular finite pattern never shows up is $0$.)

So in the end, the answer to your question is still "the probability that TTH showed up before HTHT shows up is equal to the probability that Player Two wins the game".
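To illustrate the point numerically, here is a short simulation of mine (not part of the answer) of the "keep tossing until HTHT" version, recording whether TTH appeared along the way. For reference, Conway's leading-number algorithm for pattern races gives $3/4$ for TTH versus HTHT, and by the argument above the simulation should reproduce that value.

```python
import random

def tth_seen_before_htht(rng):
    """Toss a fair coin until HTHT appears; report whether TTH occurred earlier."""
    w = ""                              # sliding window of the last four tosses
    seen_tth = False
    while True:
        w = (w + rng.choice("HT"))[-4:]
        if w.endswith("TTH"):
            seen_tth = True
        if w.endswith("HTHT"):
            return seen_tth

rng = random.Random(0)
trials = 100_000
est = sum(tth_seen_before_htht(rng) for _ in range(trials)) / trials
print(est)  # ≈ 0.75
```

Note that TTH ends in H while HTHT ends in T, so the two patterns can never be completed by the same toss, and checking the window at every toss catches every occurrence of either word.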

  • I have changed my original post a bit. Please let me know your thoughts. Many thanks. (2011-02-04)