Edit: This question is a lot shorter than it looks. Don't get intimidated. If you know backgammon, just skip to question 2.
In Backgammon, each game is played for one point (or one dollar) between two players. There is a die called the doubling cube, marked with the numbers $2, 4, 8, \ldots, 64$, which sits in the middle of the board (it is not 'owned' by anyone). The players take turns rolling two regular dice (not the doubling cube) and moving. But before each of his rolls, a player can 'offer the cube' (or 'offer to double', 'double', etc.), which is basically saying, "Hey, why don't we play this game for 2 points?" The other player can drop, i.e. refuse the cube, and lose one point, or may accept, or take, the cube, in which case the game continues for twice as many points as before.
A player's equity in the game is the probability that he'll win a cubeless game, where 'cubeless' means neither player can offer the cube (in other words, a one-point game). (For backgammon players who know the rules: I'm ignoring gammons and backgammons, so the equity equals the probability of winning. Edit: As @Henning Makholm's first answer indicates, I also do not want to include the equity of owning the doubling cube.)
I have two questions, but I know the answer to the first one and I think I'm calculating it right.
1) What equity does the player receiving the cube require in order to accept it? The answer, I'm told, is $.25$.
The receiving player will accept the double when the expected value of taking it is at least the expected value of dropping (which automatically loses him $1$ point). Here $p$ is the probability (which equals the equity) of the receiving player winning (are you guys following all this?).
$E(take) \ge E(drop) \\ 2p-2(1-p) \ge -1 \\ 4p \ge 1 \\ p \ge .25$
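To convince myself, I also checked the arithmetic numerically with a quick throwaway Python script (just my own sketch; the function names are made up):

```python
# Quick sanity check of the take/drop comparison above.
# p = the taker's cubeless probability of winning; the stake starts at 1 point.

def ev_take(p, stake=1):
    # After taking the cube, the game is played for 2 * stake points.
    return 2 * stake * p - 2 * stake * (1 - p)

def ev_drop(stake=1):
    # Dropping concedes the current stake immediately.
    return -stake

for p in (0.20, 0.25, 0.30):
    print(f"p={p}: take={ev_take(p):+.2f}, drop={ev_drop():+.2f}")
# Taking is worse than dropping below p = .25, equal at p = .25,
# and better above it -- which matches the algebra.
```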
I'm nearly certain that's correct, so
2) What equity is required for a player to offer the cube (offer to double the game's stakes)?
How is that calculated? We don't know whether the cube's recipient will accept or not.
I start the same way as in question 1: the giver will double when his expected value of doubling is greater than his EV of not doubling (duh!). If $EV(rolling)$ is $p-(1-p)$ and $EV(doubling)$ is $2p-2(1-p)$, then
$E(doubling) > E(rolling) \\ 2p-2(1-p) > p-(1-p) \\ p > .5$
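Here's the same naive comparison as a quick Python sketch (again just my own scratch code); note that it silently assumes the opponent always takes the cube, which I suspect is where my setup breaks down:

```python
# Naive comparison of EV(doubling) vs EV(rolling on), exactly as set up above.
# p = the doubler's cubeless probability of winning; the current stake is 1 point.

def ev_roll(p, stake=1):
    # Keep playing for the current stake.
    return stake * p - stake * (1 - p)

def ev_double_if_always_taken(p, stake=1):
    # Assumes the opponent takes unconditionally, so the game is for 2 * stake.
    return 2 * stake * p - 2 * stake * (1 - p)

for p in (0.45, 0.50, 0.55, 0.60):
    print(f"p={p}: roll={ev_roll(p):+.2f}, "
          f"double={ev_double_if_always_taken(p):+.2f}")
# The two are equal at p = .5 and doubling looks better for every p > .5,
# which is exactly the p > .5 answer I don't believe.
```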
That can't possibly be correct. While I'm no BG expert, I used to play for (small amounts of) money in NYC, and there is no way in heck that I would double with a 51% chance of winning.
OK, that's all I got. How do we figure this out? Thanks.