Consider the following procedure (whose input is $N$) that picks a nondecreasing sequence of numbers from the set $\{ 1, \dots, N \}$ until it picks $N$:
    i := 0
    K_0 := 1
    while K_i < N:
        pick a number K_{i+1} from the set { K_i, ..., N } uniformly at random
        i := i + 1
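For concreteness, here is the same procedure as a small Python sketch (the function names and trial counts are just my own choices); it simply runs the loop and counts iterations:

    import random

    def simulate_rounds(N, rng=random):
        """Run the procedure once; return how many times the loop body executes."""
        k = 1
        rounds = 0
        while k < N:
            # pick K_{i+1} uniformly from { K_i, ..., N }; it may equal K_i
            k = rng.randint(k, N)
            rounds += 1
        return rounds

    def average_rounds(N, trials=10_000):
        """Monte Carlo estimate of the expected number of loop iterations."""
        return sum(simulate_rounds(N) for _ in range(trials)) / trials

    if __name__ == "__main__":
        for N in (10, 100, 1_000, 10_000):
            print(N, average_rounds(N))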
How many times do we expect the loop to be executed?
I expect $O(\log N)$ iterations myself: in every round, about half of the remaining numbers are removed in expectation, which should give $O(\log N)$ rounds. Unfortunately, I have no idea how to prove this. Markov's inequality trivially gives that a single round throws away at least a constant fraction of the remaining numbers with constant probability, but if I require this to happen $O(\log N)$ times in a row, the resulting probability bound goes to $0$ as $N$ goes to infinity.
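The closest I get to making this precise is a first-step conditioning recurrence (a sketch, assuming the pick may repeat the current value, exactly as in the pseudocode): write $E_k$ for the expected number of remaining loop iterations when the current value is $k$, so that $E_N = 0$ and the quantity I want is $E_1$. Conditioning on the next pick, which is uniform over the $N - k + 1$ values $k, \dots, N$, gives

$$E_k = 1 + \frac{1}{N - k + 1} \sum_{j = k}^{N} E_j, \qquad 1 \le k < N.$$

I can evaluate this from $k = N - 1$ downward for any concrete $N$, but I do not see how to extract the asymptotic growth in $N$ from it.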
Apparently (judging from the books I looked through for help) the problem above is well suited to a Markov chain formulation, except that I am not familiar with Markov chains and have no idea how to use one to get the expected value I want. When the book told me to evaluate determinants via Cramer's rule and gave an example that was not at all clear to me, I gave up.
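In case it matters, the expected value I want is just the expected number of steps for the obvious chain on $\{1, \dots, N\}$ (move from $k$ to a uniform element of $\{k, \dots, N\}$, with $N$ absorbing) to hit $N$. Since the chain never moves downward, I can at least evaluate it exactly with a backward recursion over the recurrence above, without any determinants (a Python sketch; `expected_rounds` is my own name for it):

    from fractions import Fraction

    def expected_rounds(N):
        """Exact expected number of loop iterations, from the recurrence
        E_k = 1 + (E_k + E_{k+1} + ... + E_N) / (N - k + 1), with E_N = 0."""
        suffix = Fraction(0)          # running sum E_{k+1} + ... + E_N
        E1 = Fraction(0)              # will hold E_1 (stays 0 if N == 1)
        for k in range(N - 1, 0, -1):
            m = N - k + 1             # size of { k, ..., N }
            # solve E_k = 1 + (E_k + suffix) / m for E_k
            Ek = (1 + suffix / m) * Fraction(m, m - 1)
            suffix += Ek
            E1 = Ek                   # the last value computed is E_1
        return E1

    if __name__ == "__main__":
        for N in (2, 5, 10, 100, 1000):
            e = expected_rounds(N)
            print(N, e, float(e))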
The problem above came up while analyzing a probable counterexample to an algorithm I have been working on. The actual probabilities involved are not uniform, but I think the problem above captures the essence of my difficulty, so an answer would probably help me solve my own problem. If the number of rounds really does grow with $N$, then my example would indeed be a counterexample.