
The pirate game is a popular problem that is often asked in interviews (which is how I stumbled upon it). The problem is stated as follows:

There are 5 rational pirates, A, B, C, D and E. They find 100 gold coins and must decide how to distribute them. The pirates have a strict order of seniority: A is superior to B, who is superior to C, who is superior to D, who is superior to E. The pirate world's rules of distribution are as follows: the most senior pirate proposes a distribution of the coins. The pirates, including the proposer, then vote on whether to accept this distribution. If the proposed allocation is approved by a majority or a tie vote, it is carried out. If not, the proposer is thrown overboard from the pirate ship and dies, and the next most senior pirate makes a new proposal, starting the process again. Pirates base their decisions on three factors. First, each pirate wants to survive. Second, given survival, each pirate wants to maximize the number of gold coins he receives. Third, each pirate would prefer to throw another overboard, if all other results would otherwise be equal. The pirates do not trust each other, and will neither make nor honor any promises between pirates apart from the main proposal.

Let us consider this problem without the third condition (in my opinion, the pirates wouldn't be very rational if they couldn't make any promises to one another). What would the solution be in this case?

Considering a case with only three pirates, A, B, and C, the original solution has A keep 99 coins, give nothing to B, and give a single coin to C, thereby maximizing his own gains. This works because, if only B and C remained, B could hoard all the gold and still win on a tie vote, so C prefers one coin from A over the nothing he would get otherwise.
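
For reference, here is a minimal backward-induction sketch in Python (my own illustration, not part of the original problem; the function name and the assumption that a voter says "yes" only when offered strictly more than his continuation value are mine). It reproduces the standard allocation for the three- and five-pirate cases:

```python
def pirate_allocation(n_pirates, coins=100):
    """Backward-induction sketch of the standard pirate game.

    Pirates are indexed from most senior (0) to least senior (n-1).
    A proposal passes with a majority or a tie, and the proposer votes
    for his own plan.  Assumption: a voter says "yes" only if he gets
    strictly more than his continuation value (this encodes the
    "prefer to throw others overboard" tie-break).
    """
    outcome = [coins]                      # one pirate left: he keeps everything
    for k in range(2, n_pirates + 1):
        needed = (k - 1) // 2              # extra "yes" votes beyond the proposer's own
        # Continuation values of the k-1 junior pirates are the (k-1)-pirate outcome.
        cheapest = sorted(range(k - 1), key=lambda i: outcome[i])[:needed]
        proposal = [0] * (k - 1)
        for i in cheapest:
            proposal[i] = outcome[i] + 1   # one coin more than the continuation value
        outcome = [coins - sum(proposal)] + proposal
    return outcome

print(pirate_allocation(3))   # [99, 0, 1]  ->  A: 99, B: 0, C: 1
print(pirate_allocation(5))   # [98, 0, 1, 0, 1]
```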

However, C knows that B will always vote "no" (since B gets everything if A is thrown overboard), so can't C tell A that he will vote "yes" with probability $\frac{x}{100}$, where $x$ is the number of coins he receives in the proposed distribution (let's assume that A and C have a fully trustworthy way of committing to this)? If we treat death as equivalent to obtaining no gold, then A maximizes his expected haul $(100-x)\cdot\frac{x}{100}$ by offering $x = 50$, so wouldn't C end up being offered $50$ gold coins instead of just one?
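
As a quick sanity check of that reasoning (my own sketch, not part of the original question; it assumes A is risk-neutral and values death at zero coins):

```python
# C's hypothetical commitment: vote "yes" with probability x/100 when offered x coins.
# If the vote fails, A dies, which we value at zero gold for him.
def a_expected_gold(x, coins=100):
    p_accept = x / coins               # probability C votes "yes"
    return p_accept * (coins - x)      # A keeps coins - x only if the proposal passes

best_offer = max(range(101), key=a_expected_gold)
print(best_offer, a_expected_gold(best_offer))   # 50 25.0
```

Under these assumptions A's best offer is indeed $x = 50$, which leaves him an expected 25 coins.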

This is just one possible strategy, and it does not involve B at all. What is the optimal strategy for each pirate, even in this simple case?

  • Just a quick comment - A and C could arrange that A will place $x$ gold coins and $100-x$ wooden coins in a bag, and that C will determine his vote by picking a coin from the bag - if it's gold he votes 'yes', if it's wooden he votes 'no'. The rules of this game generally state that surviving with any number of coins is infinitely preferable to death, though. If you were to follow that rule, A should always pick a strategy that gives zero probability of death - in other words, he wouldn't agree to the probabilistic strategy. (2011-11-22)
  • Hmm... you're right. But C doesn't really need A's approval to use this strategy. (2011-11-23)
  • They asked the original problem, but failed to mention the third condition (probably by oversight). I managed to figure out the correct solution to the original problem, but it was only after the interview that I thought about this some more and read that Wikipedia article (which included the third condition). (2011-11-23)

2 Answers