
I have a general question about expected values:

For a discrete random variable, $E[X] = \sum_{i=1}^{\infty} x_{i}p_{i}$ and $E[X] = \int_{-\infty}^{\infty} xp(x) \ dx$ for a continuous random variable $X$.

But what is the motivation for these definitions? Is it essentially defined because of the following: Suppose I perform an experiment a large number of times and record the results. I then take the average of the results. I want to find what this average approaches as the number of trials increases. Thus by trial and error I find that the definition of $E(X)$ works. This is assuming I am taking a frequentist view of probability. What does this all have to do with computing the area under some function?
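That frequentist picture can be sketched in a few lines, assuming a fair six-sided die as the repeated experiment (this example is mine, not part of the question): the average of many simulated trials settles near $\sum_i x_i p_i = 3.5$.

```python
import random

random.seed(0)

# Roll a fair six-sided die many times; the sample mean of the
# results should approach E[X] = sum over x of x * (1/6) = 3.5.
n_trials = 100_000
rolls = [random.randint(1, 6) for _ in range(n_trials)]
sample_mean = sum(rolls) / n_trials

# The definitional expected value, as a weighted sum.
expected = sum(x * (1 / 6) for x in range(1, 7))

print(sample_mean, expected)
```

Increasing `n_trials` shrinks the typical gap between `sample_mean` and `expected`, which is exactly the long-run average the question describes.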

  • @HorstGrünbusch The phrase "e$x$pected value" suggests to me the value you should expect to get in a single game, because that's all you're guaranteed to be playing. Many games in life we do not get to play more than once. Nothing about the phrase indicates long-term averaging, which is what the concept is actually about. 2014-08-28

3 Answers


Let's go way back to the time of Fermat. You're a gambler. You play a game in which there are a bunch of events $i$ that randomly occur and give you $x_i$ units of money, and the gamemaster charges you $C$ units of money to play. You'd like to know whether he's ripping you off. So you play (or simulate) the game many, many times, and find that if you play the game $N$ times for large $N$, it turns out that approximately $p_i N$ of the time you get event $i$, where $p_i$ is some constant. So your total profit after playing the game $N$ times is approximately

$\sum_i x_i p_i N - NC = N \left( \sum_i x_i p_i - C \right).$

Now, if this number were positive, the gamemaster would be losing money in the long run and would quickly go out of business. If this number were negative, you'd know you were getting ripped off, so you'd probably stop playing. The only way the game is fair in the long run is if $C$ is exactly equal to the expected payout $\sum_i x_i p_i$ of the game.
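A minimal simulation of this fairness argument, with a made-up game (the payouts and probabilities below are assumptions chosen only for illustration): when the fee $C$ equals $\sum_i x_i p_i$, the average profit per play hovers near zero.

```python
import random

random.seed(1)

# Hypothetical game: payouts x_i and their probabilities p_i.
payouts = [0, 1, 5]
probs = [0.5, 0.4, 0.1]

# A "fair" entry fee equals the expected payout, sum of x_i * p_i.
C = sum(x * p for x, p in zip(payouts, probs))  # 0.9 here

# Play N times; profit per play is (payout received) - C.
N = 200_000
total_profit = sum(
    random.choices(payouts, weights=probs)[0] - C for _ in range(N)
)
avg_profit = total_profit / N
print(avg_profit)  # near 0 when the fee is fair
```

Setting `C` above or below the expected payout tips `avg_profit` negative or positive, matching the "ripped off" and "out of business" cases in the argument.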

If the space of possible events is continuous rather than discrete, then the sum needs to be replaced by an integral. This just expresses the fact that Monte Carlo integration works.
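As a sketch of that continuous case, assuming an Exponential(1) distribution (chosen only because its mean, $\int_0^\infty x e^{-x}\,dx = 1$, is easy to check): Monte Carlo integration estimates $E[X] = \int x\,p(x)\,dx$ by averaging samples drawn from $p$.

```python
import random

random.seed(2)

# Monte Carlo estimate of E[X] = integral of x * p(x) dx for an
# Exponential(1) distribution, whose true mean is 1: draw samples
# from the density p and take their average.
n = 500_000
samples = [random.expovariate(1.0) for _ in range(n)]
estimate = sum(samples) / n
print(estimate)  # approaches 1.0 as n grows
```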


Professor Peter Whittle, Cambridge University, argued that average (expected value) was a more intuitive notion than probability, frequentist or not. For this reason he axiomatized expectation rather than probability. Kolmogorov axiomatized probability in 1933, and this is regarded as a milestone (millstone?) in the history of Probability Theory.

Whittle wrote a book in 1970 called Probability via Expectation. This book has been translated into Russian and other languages, and is now in its 4th Edition, 2000. Read the introductory chapter here: Probability via Expectation

Now, if you think that Peter Whittle's view of randomness is eccentric, then read what the great Rudolf Kalman, of Kalman Filter fame, has to say about Kolmogorov probability here:

www.scribd.com/document/62685671/Kalman-Probability-in-the-Real-World

where he answers the question: Why is probability not a satisfactory way of looking at randomness?

Axiomatic Probability or Expectation? Take your pick -- randomly, of course.


Integrals can mean many things; the area under a curve is just one. An integral can also be an antiderivative, and if the measure space has total measure $1$, an integral is the weighted mean of a function. In the case of expected value, the measure space is a probability space, which has total measure $1$:
$$\sum_{i=1}^\infty p_i = 1 \quad\text{or}\quad \int_{-\infty}^\infty p(x)\,dx = 1.$$
The "expected value" is really the expected average value of a random variable; the expected value itself may never actually occur. For example, the expected value of a standard six-sided die is $3.5$, but since the possible values are $\{1,2,3,4,5,6\}$, a roll of $3.5$ will never occur, so it is never really "expected".
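The weighted-mean reading can be made concrete with a short sketch (the Uniform$(0, 2)$ density, with $p(x) = 1/2$ on $[0, 2]$, is an assumed example): the discrete case is a weighted sum, and the continuous case is the same idea with the sum replaced by an integral, here approximated by a midpoint Riemann sum.

```python
# Discrete: weighted mean of die values, each with weight p_i = 1/6.
E_die = sum(x * (1 / 6) for x in range(1, 7))  # 3.5, never an actual roll

# Continuous: weighted mean of x under Uniform(0, 2), p(x) = 1/2 on
# [0, 2]. Approximate the integral of x * p(x) dx with a midpoint
# Riemann sum; the true value is 1.0.
n = 100_000
dx = 2 / n
E_unif = sum(((i + 0.5) * dx) * 0.5 * dx for i in range(n))

print(E_die, E_unif)
```

In both computations the weights add up to $1$, which is exactly what makes the result a weighted mean rather than a plain area.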