If the random variable $X$ can take only positive integer values and, for $k=0,1,2,\ldots$,
$P(X>k+1|X>k)=\left(\frac{k+1}{k+2}\right)^2$
find $\mathbb{E}(X).$
Let $p_k=P(X=k)$. Then $E(X)=p_1+2p_2+3p_3+4p_4+\cdots.$ Note that we are "adding" together one $p_1$, two $p_2$, three $p_3$, and so on. Let's add these together another way, using informal reasoning. First remove one each of $p_1$, $p_2$, and so on. The sum of the numbers removed is clearly $1$, since $X$ takes on only positive integer values. Note that this sum could also be called $P(X>0)$.
What's left over from our original sum after we removed $p_1+p_2+p_3+\cdots$ is $p_2+2 p_3+3p_4 + 4p_5+\cdots.$ Remove one each of $p_2$, $p_3$, $p_4$, and so on. The sum of the numbers removed is $p_2+p_3+p_4+\cdots$, which is just $P(X>1)$. What's left over is $p_3+2p_4+3p_5+4p_6+\cdots.$ Remove one each of $p_3$, $p_4$, $p_5$, and so on. The sum $p_3+p_4+p_5+\cdots$ is just $P(X>2)$. What's left over is $p_4+2p_5+3p_6+4p_7+\cdots.$ Continue. We conclude that $E(X)=P(X>0)+P(X>1)+P(X>2)+P(X>3)+\cdots.$
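The tail-sum identity derived above can be checked numerically. The sketch below uses a small hypothetical distribution (the values are made up purely for illustration) and compares the two ways of computing the mean:

```python
# Hypothetical distribution on {1, 2, 3, 4}; probabilities made up for illustration.
p = {1: 0.4, 2: 0.3, 3: 0.2, 4: 0.1}

# Direct definition: E(X) = sum_k k * p_k.
mean_direct = sum(k * pk for k, pk in p.items())

# Tail-sum form: E(X) = P(X > 0) + P(X > 1) + P(X > 2) + ...
mean_tail = sum(sum(pk for k, pk in p.items() if k > j) for j in range(max(p)))

print(mean_direct, mean_tail)  # both are 2.0 (up to floating point)
```

The two totals agree because each $p_k$ is counted exactly $k$ times in the tail sums.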
So in our problem, we will be finished once we know $P(X>0)$, $P(X>1)$, $P(X>2)$, and so on.
Of course $P(X>0)=1$. By the conditional probability information we were given, $P(X>1\mid X>0)=\dfrac{1^2}{2^2}$. So $P(X>1)=P(X>1\mid X>0)P(X>0)=\dfrac{1^2}{2^2}.$ Similarly, $P(X>2\mid X>1)=\dfrac{2^2}{3^2}$. So $P(X>2)=P(X>2\mid X>1)P(X>1)=\dfrac{2^2}{3^2}\cdot\dfrac{1^2}{2^2}=\dfrac{1^2}{3^2}.$ Similarly, $P(X>3\mid X>2)=\dfrac{3^2}{4^2}$. So $P(X>3)=P(X>3\mid X>2)P(X>2)=\dfrac{3^2}{4^2}\cdot\dfrac{1^2}{3^2}=\dfrac{1^2}{4^2}.$ Note the very nice cancellations. The pattern is clear: we have $P(X>k)=\dfrac{1^2}{(k+1)^2}.$ It follows that $E(X)=1+\frac{1}{2^2}+\frac{1}{3^2}+\frac{1}{4^2}+\cdots.$
The infinite series on the right is a famous one that you may have seen before. By a result of Euler, the sum is equal to $\dfrac{\pi^2}{6}$. The result is of only marginal significance in probability theory, but I strongly urge you to look up at least the Wikipedia article on this series (the Basel problem).
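As a numeric sanity check (a sketch, not part of the argument above), one can iterate the given recursion for the tail probabilities and accumulate their sum:

```python
import math

# Sketch: start from P(X > 0) = 1, apply the recursion
# P(X > k+1) = ((k+1)/(k+2))^2 * P(X > k), and sum the tails,
# which by the identity above gives E(X).
tail = 1.0           # P(X > 0)
expectation = 0.0
for k in range(100_000):
    expectation += tail                # add P(X > k)
    tail *= ((k + 1) / (k + 2)) ** 2   # advance to P(X > k+1)

print(expectation, math.pi ** 2 / 6)  # the two values agree closely
```

The truncation error is on the order of $1/N$ for $N$ terms, so the agreement with $\pi^2/6$ improves slowly but steadily.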
I assume you mean that $X$ can only have positive integer values.
Hints: One deduces from the given conditional probability that $\tag{1}P[X>k+1]=\bigl(\textstyle{k+1\over k+2}\bigr)^2P[X>k].$
Now use the fact that for a nonnegative integer-valued random variable $X$, $\Bbb E(X)=\sum\limits_{i=1}^\infty P[X\ge i].$
Alternatively, you can use the recursion formula $(1)$ and the formula $P[X=k+1]=P[X>k]-P[X>k+1]$ to explicitly evaluate the probability mass function of $X$ (note $P[X>0]=1$). Then find $\Bbb E(X)$ using the standard definition.
Here is a detailed solution for the alternative approach:
First, note that $P[X>0]=1$. Using the recursion formula $(1)$ repeatedly: $ \eqalign{ \textstyle P[X>1]&=\textstyle({1\over2})^2 P[X>0]=({1\over2})^2 \cr P[X>2]&=\textstyle({2\over3})^2 P[X>1]=({2\over3})^2\cdot ({1\over2})^2 = ({1\over3})^2 \cr P[X>3]&=\textstyle({3\over4})^2 P[X>2]= ({3\over4})^2\cdot({1\over3})^2 = ({1\over4})^2 \cr &\ \vdots }$
In general, for $k>0$: $ P[X>k ]=(\textstyle{1\over k+1 })^2 $ We can now calculate values of the mass function: $\eqalign{ P[X=1]&=P[X>0]-P[X>1]=\textstyle {1\over1^2}-{1\over 2^2}\cr P[X=2]&=P[X>1]-P[X>2]=\textstyle {1\over2^2}-{1\over 3^2}\cr P[X=3]&=P[X>2]-P[X>3]=\textstyle {1\over3^2}-{1\over 4^2}\cr &\ \vdots } $
In general, $ P[X=k]=\textstyle {1\over k^2}-{1\over (k+1)^2 } $
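As a quick consistency check (a sketch, with $N$ chosen arbitrarily), this mass function telescopes, so its partial sums approach $1$ as they must for a probability distribution:

```python
# Sanity check (sketch): the mass function P[X=k] = 1/k^2 - 1/(k+1)^2 telescopes,
# so the first N probabilities sum to exactly 1 - 1/(N+1)^2, which tends to 1.
N = 10_000
total = sum(1 / k**2 - 1 / (k + 1) ** 2 for k in range(1, N + 1))
print(total)  # close to 1
```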
So, $ \eqalign{ \Bbb E(X)&=\sum_{k=1}^\infty k\Bigl(\, {\textstyle {1\over k^2}-{1\over (k+1)^2 }}\,\Bigr)\cr} $
Now $\eqalign{ &\sum_{k=1}^M k\Bigl(\, {\textstyle {1\over k^2}-{1\over (k+1)^2 }}\,\Bigr)\cr &= \textstyle 1(1-{1\over 4})+ 2({1\over4}-{1\over 9})+3({1\over9}-{1\over 16})+\cdots + M ({1\over M^2}-{1\over (M+1)^2})\cr &= \textstyle 1+(-{1\over 4} + 2\cdot{1\over4} ) +(-2\cdot {1\over 9}+3\cdot {1\over9})+(-3\cdot{1\over 16} +4\cdot{1\over16})+\cdots -{M\over (M+1)^2} \cr &=\textstyle1+{1\over 4}+{1\over 9}+{1\over 16}+ \cdots +{1\over M^2} -{M\over (M+1)^2}\cr }$
Taking the limit as $M\rightarrow\infty$, we obtain $ \Bbb E(X)=\sum_{k=1}^\infty {1\over k^2} ={\pi^2\over6}. $
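The telescoping rearrangement of the partial sums can be verified numerically; in this sketch $M$ is an arbitrary cutoff:

```python
import math

# Verify: sum_{k=1}^{M} k*(1/k^2 - 1/(k+1)^2) = sum_{k=1}^{M} 1/k^2 - M/(M+1)^2,
# and both approach pi^2/6 as M grows.
M = 2000
lhs = sum(k * (1 / k**2 - 1 / (k + 1) ** 2) for k in range(1, M + 1))
rhs = sum(1 / k**2 for k in range(1, M + 1)) - M / (M + 1) ** 2
print(lhs, rhs, math.pi ** 2 / 6)
```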
Although I prefer André Nicolas' nice proof, here is another (nonrigorous) way to see that $\tag{2}\Bbb E(X)=\sum_{k=1}^\infty \,k\, p_k = \sum\limits_{i=1}^\infty P[X\ge i]$ holds if $X$ takes the values $1$, $2$, $\ldots\,$, with respective probabilities $p_1$, $p_2$, $\ldots$.
Consider the array of numbers $\tag{3} \matrix{ p_1&p_2&p_3&p_4&p_5&\cdots\cr \phantom{p_1}&p_2&p_3&p_4&p_5&\cdots\cr \phantom{p_1}&\phantom{p_2}&p_3&p_4&p_5&\cdots\cr \phantom{p_1}&\phantom{p_2}&\phantom{p_3}&\vdots&\phantom{p_5}& \cr } $
The sum of the row sums in $(3)$ is the right hand side of $(2)$ and the sum of the column sums is the left hand side of $(2)$.
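The double-counting argument can be sketched on a truncated array; the distribution below is hypothetical, chosen only to make the row/column bookkeeping concrete:

```python
# Sketch of the double-counting argument: row i of the array holds
# p_i, p_{i+1}, ..., so the i-th row sum is P(X >= i); column k holds
# k copies of p_k, so the k-th column sum is k * p_k.
p = {1: 0.5, 2: 0.3, 3: 0.2}   # hypothetical pmf, for illustration only

row_total = sum(sum(pk for k, pk in p.items() if k >= i) for i in range(1, max(p) + 1))
col_total = sum(k * pk for k, pk in p.items())
print(row_total, col_total)  # both equal E(X)
```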