Say you have an amount of 350 credits and you have 4 options:

1) Bet 50 credits: a win doubles the bet to 100, for a total of 400 credits, while a loss leaves you with 300 credits.

2) Bet 150 credits: a win doubles the bet to 300, for a total of 500 credits, while a loss leaves you with 200 credits.

3) Bet 250 credits: a win doubles the bet to 500, for a total of 600 credits, while a loss leaves you with 100 credits.

4) Bet all 350 credits: a win doubles the bet to 700, for a total of 700 credits, while a loss leaves you with 0 credits.

What I'm wondering is: is there a mathematical "algorithm" that would give me the best acceptable option?
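One way to make the question precise (as the answers below note) is to compare expected outcomes, which requires knowing or estimating the probability of winning. Here is a minimal Python sketch, assuming an even-money bet and a guessed win probability `p`; the probability values are illustrative, since the question does not state them:

```python
# Minimal sketch: expected final credits for each option, assuming an
# even-money bet with a known win probability p (an assumption here;
# the question does not give the odds).

BANK = 350
OPTIONS = [50, 150, 250, 350]  # the four possible bet sizes

def expected_credits(bet, p):
    """Expected credits after one even-money bet of `bet` with win probability p."""
    return p * (BANK + bet) + (1 - p) * (BANK - bet)

for p in (0.4, 0.5, 0.6):
    print(f"p = {p}:")
    for bet in OPTIONS:
        print(f"  bet {bet:3d} -> expected credits {expected_credits(bet, p):.1f}")
```

Under this expected-value criterion the result is exactly what the comments say: if `p < 0.5` the smallest bet (or no bet) is best, if `p > 0.5` the largest bet is best, and at `p = 0.5` every option has the same expectation of 350.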

  • What is the probability of winning? Is winning based on the toss of a fair coin? Or, as is usual in gambling games, are the odds stacked against you? If so, on average the more you bet, the more you lose.
  • @AndréNicolas: it's based on the odds.
  • If the probability of a win is $\lt 1/2$, then on average the best thing is to bet the least amount. It also depends on how rich you are, more precisely on the marginal utility (to you) of money. Maybe the $350$ is rent due tomorrow, and you will be out on the street if you lose it. Maybe you owe the Mob $700$, and will be in serious trouble if you don't pay tomorrow. The rational thing to do depends on such external factors.
  • The 350 is a "bank" that can be spent with no problem; nothing "bad" will happen if it is lost. I'm not trying to figure out a foolproof gambling plan here (there is none, of course); I'm just curious whether this can be calculated. It reminds me of the Jeopardy! show, where you can either take the money or answer another question to double the amount.
  • What does "it's based on the odds" mean? Anyway, as André said, if your probability of winning is under 50%, the best bet is 50; if your probability of winning is over 50%, the best bet is 350.

2 Answers


When gambling problems are posed in a purely mathematical sense, it is usually assumed that utility is linear in money (so you care only about expected value); then you are indifferent between having 350 for certain and a gamble that gives a 50% chance of $350-x$ and a 50% chance of $350+x$, for any $x$. This is the spirit of André Nicolas's comments. If that is true and the bet is fair, math can't help you: all these options (as well as the option of not playing) are equivalent.
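For concreteness, here is the one-line computation behind that claim, assuming an even-money bet of $x$ with win probability $\tfrac12$: the expected final amount is
$$\frac12(350+x)+\frac12(350-x)=350 \quad\text{for every } x,$$
so every bet size, including $x=0$, has the same expectation.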

Many (most?) people would find being broke much more of a downgrade than being twice as rich is an upgrade, and so would not make this bet with their entire net worth.

On the other hand, many people reason that losing \$1 won't make any difference to their lives, but winning \$x,000,000 would be wonderful, so they play the lottery.

There is a whole part of economics/psychology that addresses this, but you are in the wrong place for it.

  • (Tongue-in-cheek comment): Math doesn't help because math assumes linear utilities. Assume concave utilities, and math will explain risk-averse behavior and the buying of lottery tickets.
  • @Aditya: I think it will explain risk-averse behavior, but not the buying of lottery tickets. Yes, I buy insurance (in some cases), but I don't buy lottery tickets.
  • Agreed. To explain the buying of lottery tickets, you need to add extra utility for participating in the game: the _thrill_ of the possibility of winning.
  • @Aditya: I agree. Math doesn't consider that, as part of the linearity assumption, which is why I referred the OP to other domains. But I think concavity proves you shouldn't buy lottery tickets: it says you should always be risk-averse.
  • @Aditya: the *thrill* of winning is one non-mathematical part of the problem, among many.

The "best choice" is the one that maximizes your expected utility. It therefore depends on your utility function, i.e. your preferences. For example, if there's something you really want right now that costs 700 credits, the last is the best option. If it only costs 500 but other than that money is worth more to you when you have less of it (as is the case for most people), the second option is better assuming a 50% win chance. And so on. If there's a different win chance, that has to be taken into account as well.