
I would like to learn (more) rigorous approaches to problems I always solve intuitively (as intuition only takes you there some of the time).

Let $P=c_1,c_2,\dots,c_N$ be a permutation of $N$ distinct elements (let's say it is a permutation of the integers from $1$ to $N$), and let $E_n$ be the event that all elements preceding $c_n$ are less than it: $$E_n = \{\forall i,\ 1\le i<n:\ c_i<c_n\}.$$

Intuitively, $P(E_n)=\frac{1}{n}$. But how could I derive this through conditional and joint probabilities? Because $E_n$ is really a joint event over the pairwise comparisons: $$E_n=\bigcap_{1\le i<n}\{c_i<c_n\}.$$

Would someone be so kind as to provide a probabilistic "decomposition", or analysis of this event? How can I derive this probability the long way? And what approach would be best suited here? Perhaps my definition of $E_n$ is not appropriate.

I think there is a way to derive this through conditional probabilities, because, given that we do not care about the order of the last $N-n$ elements, the question becomes: what is the probability that $c_n$ is the largest of the first $n$ elements of the permutation?

How could I express this mathematically, with events, and compute the needed probabilities? Again, I know how to solve this by thinking about the specifics of this problem. I would like a mathematical derivation, so I can learn more, "the long way".
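As a sanity check on the intuitive answer (not a derivation), here is a minimal Monte Carlo sketch in Python; the function name `empirical_prob` and the parameter choices are mine, not from the question:

```python
import random

def empirical_prob(N, n, trials=200_000):
    """Estimate Pr[E_n]: the event that c_n is the largest of the
    first n elements of a uniform random permutation of 1..N."""
    hits = 0
    for _ in range(trials):
        perm = random.sample(range(1, N + 1), N)  # uniform random permutation
        if max(perm[:n]) == perm[n - 1]:          # all of c_1, ..., c_{n-1} < c_n
            hits += 1
    return hits / trials

# Should be close to 1/n = 0.25 regardless of N:
print(empirical_prob(10, 4))
```

The estimate stays near $1/n$ as $N$ varies, which matches the intuition that only the relative order of the first $n$ elements matters.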

  • To check: the permutation is on $N\geq n$ elements, and should be viewed as $(c_1,\dots,c_n,\dots,c_N)$? (2017-02-01)
  • Yes, it's a sequence of $N$ distinct elements. Let's say they are the numbers from $1$ to $N$. I will edit the question; the original formulation was missing the ordering you mention. (2017-02-01)

1 Answer


I am going to change your notation a little, writing the (uniformly randomly chosen) permutation as $\sigma\colon [N]\to [N]$.

We have $$ \Pr[ E_n ] = \sum_{m=1}^N \Pr[ E_n \mid \sigma(n)=m ]\Pr[\sigma(n)=m ] = \frac{1}{N}\sum_{m=1}^N \Pr[ E_n \mid \sigma(n)=m ] $$ by the law of total probability, and the fact that $\Pr[\sigma(n)=m ] = \frac{1}{N}$ for every $m\in[N]$ (as the permutation is chosen u.a.r.).

Now, what is $\Pr[ E_n \mid \sigma(n)=m ]$? Since we condition on the $n$-th "slot" being equal to $m$, the $n-1$ previous slots must all contain numbers less than $m$. A counting argument immediately shows that there are $\binom{m-1}{n-1}$ such choices, out of $\binom{N-1}{n-1}$ ways of assigning the $n-1$ previous slots from the $N-1$ unassigned numbers.
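To see that this counting argument checks out, here is a small brute-force verification by exhaustive enumeration (my own sketch, with a made-up function name `conditional_prob`; it is only feasible for small $N$):

```python
from itertools import permutations
from math import comb

def conditional_prob(N, n, m):
    """Exactly compute Pr[E_n | sigma(n) = m] by enumerating all
    permutations of 1..N whose n-th slot equals m."""
    total = good = 0
    for p in permutations(range(1, N + 1)):
        if p[n - 1] != m:
            continue
        total += 1
        good += all(x < m for x in p[:n - 1])  # previous slots all below m
    return good / total

# Should match binom(m-1, n-1) / binom(N-1, n-1), e.g. for N=6, n=3, m=4:
N, n, m = 6, 3, 4
print(conditional_prob(N, n, m), comb(m - 1, n - 1) / comb(N - 1, n - 1))
```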

From this counting, we get that the fraction of permutations mapping $n$ to $m$ such that the first $n-1$ numbers are all mapped into $\{1,\dots,m-1\}$ is exactly $$ \frac{\binom{m-1}{n-1}}{\binom{N-1}{n-1}} $$ (note that by definition of the binomial coefficients, this is indeed $0$ if $m<n$), and therefore \begin{align*} \Pr[ E_n ] &= \sum_{m=1}^N \Pr[ E_n \mid \sigma(n)=m ]\Pr[\sigma(n)=m ] = \frac{1}{N}\sum_{m=1}^N \frac{\binom{m-1}{n-1}}{\binom{N-1}{n-1}} = \frac{1}{N}\sum_{m=n}^N \frac{\binom{m-1}{n-1}}{\binom{N-1}{n-1}} \overset{(\dagger)}{=} \frac{1}{n}. \end{align*}

To conclude, it thus only remains to show $(\dagger)$, or equivalently $ \sum_{m=n}^N \frac{\binom{m-1}{n-1}}{\binom{N-1}{n-1}} = \frac{N}{n} $. But this follows from the Hockey-stick identity, which states that $$ \sum_{m=n}^N \binom{m-1}{n-1} = \sum_{m=n-1}^{N-1} \binom{m}{n-1} = \binom{N}{n}. $$
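The hockey-stick identity, and hence the final step, can also be checked numerically for a range of parameters (a quick sketch of my own, using Python's `math.comb`):

```python
from math import comb

def hockey_stick(N, n):
    """Check the identity sum_{m=n}^{N} C(m-1, n-1) == C(N, n)."""
    return sum(comb(m - 1, n - 1) for m in range(n, N + 1)) == comb(N, n)

# Holds for every pair tried; together with C(N, n) = (N/n) * C(N-1, n-1),
# this gives Pr[E_n] = (1/N) * C(N, n) / C(N-1, n-1) = 1/n.
print(all(hockey_stick(N, n) for N in range(1, 12) for n in range(1, N + 1)))
```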