
The setup of this problem is the following.

Tom is playing a game online. He keeps playing until he wins one game. Winning the $n$th game gives a payout of $\frac{\$100}{n}$. Each game is won independently with probability $p$.

I'm trying to find the expected winnings.

The number of games played up to and including the first win follows a geometric distribution with parameter $p$, so Tom plays $\frac{1}{p}$ games on average. I intuitively imagine the winnings then to be $\$100 \times p$. Why exactly is this, though?

If we let $X$ be the geometrically distributed random variable representing the number of games played when the first is won, then we can define the variable $Y = f \circ X$ where $f(n) = \frac{\$100}{n}$ to represent the winnings. Directly trying to find the expectation of $Y$ gives $$ \operatorname{E}(Y) = \sum_{n=1}^\infty \frac{100}{n} (1-p)^{n-1} p $$ which I am not sure how to evaluate. How should I proceed to find the expectation, formally?

I noticed that the function $f$ is injective. Does this have a particular significance when trying to find expected values? Is there a more general theorem that relates the expectation of a function $f$ of a random variable $X$ to the expectation of $X$ itself?

  • hint: $\sum_{k=1}^{\infty} \frac{x^k}{k} = \log \frac{1}{1-x}$ (2017-02-21)

2 Answers


Hint: Use the Taylor expansion of the logarithm \begin{align} \log(1-x) = -\sum_{n=1}^{\infty}\frac{x^n}{n} &&|x|<1. \end{align}

Advanced hint: \begin{align} 100p\sum_{n=1}^{\infty}\frac{(1-p)^{n-1}}{n} = \frac{100p}{1-p}\sum_{n=1}^{\infty}\frac{(1-p)^{n}}{n}. \end{align}
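Putting the hints together with $x = 1-p$: the Taylor series gives $\sum_{n=1}^{\infty}\frac{(1-p)^{n}}{n} = -\log p$, so the expectation collapses to $\operatorname{E}(Y) = \frac{100p}{1-p}\log\frac{1}{p}$. A quick sketch comparing this closed form with the raw series (the values of $p$ are arbitrary):

```python
import math

def series(p, terms=100_000):
    # Raw series: sum_{n>=1} (100/n) * (1-p)^(n-1) * p
    return sum(100 / n * (1 - p) ** (n - 1) * p for n in range(1, terms + 1))

def closed_form(p):
    # 100 * p/(1-p) * log(1/p), from the log Taylor series with x = 1 - p
    return 100 * p / (1 - p) * math.log(1 / p)

for p in (0.1, 0.5, 0.9):
    assert abs(series(p) - closed_form(p)) < 1e-6
```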

  • I don't see how I'm supposed to reconcile the fact that the exponent $n-1$ in my situation is different from the denominator $n$. (2017-02-22)

Others have already told you how to calculate the expectation, but regarding your injectivity and expectation of $f$ questions: the only significance of a lack of injectivity is that it's possible that $f(x)=a$ holds for more than one value of $x$, so that $\mathbb{P}[f(X)=a]=\sum_{x:f(x)=a}\mathbb{P}[X=x]$ rather than just $\mathbb{P}[X=f^{-1}(a)]$. This never complicates the maths, though, as you will always write $\mathbb{E}[f(X)]=\sum_{x}f(x)\mathbb{P}[X=x]$, and injectivity (or lack thereof) of $f$ rarely factors into evaluating this sum.
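That identity, $\mathbb{E}[f(X)]=\sum_{x}f(x)\mathbb{P}[X=x]$, is often called the law of the unconscious statistician, and it holds whether or not $f$ is injective. A minimal illustration with the geometric $X$ and $f(n)=100/n$ from the question, comparing the sum against a Monte Carlo estimate (the seed, sample size, and $p=0.3$ are arbitrary choices):

```python
import random

random.seed(0)
p = 0.3  # arbitrary success probability

def f(n):
    return 100 / n  # payout when the first win is on game n

# E[f(X)] = sum_n f(n) * P[X = n], with P[X = n] = (1-p)^(n-1) * p
lotus = sum(f(n) * (1 - p) ** (n - 1) * p for n in range(1, 10_000))

def first_win(p):
    # Simulate games until the first success; return the game number
    n = 1
    while random.random() >= p:
        n += 1
    return n

samples = 100_000
monte_carlo = sum(f(first_win(p)) for _ in range(samples)) / samples
assert abs(lotus - monte_carlo) < 1.0  # agreement within sampling noise
```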

  • Don't you mean to say that the only significance of injectivity is that, given $a$, $f(x) = a$ holds for exactly one $x$? (2017-02-22)
  • Ah, what I meant to say was "lack of injectivity". Thank you, will edit now. (2017-02-22)