Let's assume I have the option to bet on the outcomes of 10 soccer matches. Each combination costs $1. I find 20,000 combinations that each have a probability of 1/250,000 of occurring and pay 500,000 (hence EV 1). At most one combination can occur. Given a bankroll of x dollars, is it possible to calculate how many of the combinations I should play for maximum bankroll growth?
Imaginary bet - maximum expected growth
-
Let's assume ties. – 2012-12-11
1 Answer
A better model might be that there is a population of $20,000$ bins, which you can bet on any or all of. A ball is tossed at the bins and enters each one with probability $\frac1{250,000}$ and misses them all with probability $\frac {230,000}{250,000}$. If the ball enters a bin you bet on, the payoff is $500,000$ for every unit bet. Note this is an expected value of $2$, not $1$, for every unit bet.
As the bet is in your favor, from an expected-value perspective you want to get all your money on the table. Two extremes are evident: all your money on one bin, or your money spread evenly across the bins. The first returns $500,000$ per unit bet with probability $\frac 1{250,000}$. The second returns $25$ per unit bet with probability $\frac 2{25}$.
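A quick check of the two extremes (a sketch in Python; the numbers are those in the question, and the bins are treated as mutually exclusive, as stated):

```python
n_bins = 20_000      # combinations offered
p_bin = 1 / 250_000  # probability any given bin wins; at most one can win
payout = 500_000     # payoff per unit staked on the winning bin

# Extreme 1: the whole (unit) bankroll on a single bin.
ev_single = p_bin * payout             # win 500,000 with prob 1/250,000

# Extreme 2: the unit bankroll spread evenly, 1/n_bins on each bin.
p_any = n_bins * p_bin                 # 2/25: probability some covered bin wins
ev_spread = p_any * (payout / n_bins)  # win 25 with prob 2/25

print(ev_single, ev_spread)  # both expected values come to 2 per unit bet
```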
The expected value of the two approaches is the same, but the variance is much higher in the first case. "Maximum bankroll growth" is not well defined. Most likely you will lose everything on the first play. If you bet only a fraction of your bankroll each time, you can avoid losing everything and, with enough games, get a high probability of profit, but that will slow down the expected growth.
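One way to make "maximum bankroll growth" precise is the Kelly approach: choose the number of $1 bins, $k$, that maximizes the expected logarithm of the final bankroll. A sketch under the question's numbers; the bankrolls $100,000$ and $1,000,000$ below are illustrative choices of mine, not from the question:

```python
from math import log

n_bins, p_bin, payout = 20_000, 1 / 250_000, 500_000

def log_growth(k, bankroll):
    """Expected log-growth from staking $1 on each of k bins."""
    p_win = k * p_bin             # at most one bin can win
    win = bankroll - k + payout   # the winning bin pays 500,000
    lose = bankroll - k           # every covered bin misses
    return p_win * log(win / bankroll) + (1 - p_win) * log(lose / bankroll)

def best_k(bankroll):
    # k must leave something behind (lose > 0) and cannot exceed the bins.
    limit = min(bankroll - 1, n_bins)
    return max(range(limit + 1), key=lambda k: log_growth(k, bankroll))

# With a modest bankroll the $1 minimum stake exceeds the Kelly fraction,
# so the growth-optimal play is to bet nothing; with a large bankroll it
# becomes optimal to cover every bin.
print(best_k(100_000), best_k(1_000_000))
```

This illustrates the trade-off in the answer: expected log-growth punishes risking a large fraction of the bankroll on a near-certain loss, so the $1 minimum stake only becomes worth paying once the bankroll is large relative to the stake.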
-
@MartinWiklund: If you can only bet $1$ per bin, from an expected value view you should bet all the bins you can. The Kelly criterion applies if your utility function is non-linear, which is usually not considered in math problems unless specified. Otherwise bet all you can and go for the maximum expected value. An extreme example is [here](http://math.stackexchange.com/questions/240619/coin-tossing-game-optimal-strategy/240630#240630) – 2012-12-11