
View Full Version : Paradox of the wallet


Thythe
05-16-2005, 02:31 PM
Probably posted on here before, but after 15 seconds of searching I didn't find it...

Two people take out their wallets and set them on the table. If they each decide to play, the money is counted up, and whichever of the two has the most money gives it to the other. Assume that each player has a random amount of money for this purpose. Player A reasons that if he plays and loses, he will only lose the money in his wallet, call it $X. If he wins, however, he will win more than what is in his wallet, i.e. $(X+Y) where Y is greater than or equal to 1 cent. Player A reasons that he has a 50% chance of winning this game and that it is thus +EV for him to play. Player B reasons the same... how can this be?
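A quick way to see where the reasoning breaks is to just simulate the game. Here's a minimal sketch (the uniform $0-$100 wallet distribution and the trial count are assumptions, not part of the problem): because both wallets come from the same distribution, each player's average result comes out to roughly zero.

```python
import random

def simulate_wallet_game(trials=1_000_000, max_cents=10_000):
    """Sketch of the wallet game with an assumed uniform wallet distribution."""
    net_a = 0  # player A's running net result, in cents
    for _ in range(trials):
        a = random.randint(0, max_cents)  # A's wallet
        b = random.randint(0, max_cents)  # B's wallet
        if a > b:
            net_a -= a        # A has more, so A hands his wallet to B
        elif b > a:
            net_a += b        # B has more, so A receives B's wallet
        # equal wallets: nothing changes hands
    return net_a / trials

print(simulate_wallet_game())  # hovers around 0 cents per game
```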

LetYouDown
05-16-2005, 02:47 PM
Can we assume a finite range of amounts of money for each player? Otherwise I don't think this problem is solvable.

Paul2432
05-16-2005, 03:11 PM
I think part of the problem here is the statement that player A feels he has a 50% chance of winning. Nothing in the problem statement suggests this will be the case. Because play is voluntary, suppose player B adopts the strategy of only playing when his wallet is empty. Then player A has a 0% chance of winning.

Paul

disjunction
05-16-2005, 03:37 PM
[ QUOTE ]

Can we assume a finite range of amounts of money for each player? Otherwise I don't think this problem is solvable.

[/ QUOTE ]

Bingo. If you assume a random finite distribution from $1 to $x, your probability of winning depends on the number you picked, and it doesn't seem that the "paradox" will work.

If you assume a random number from 0 to infinity, your probability of winning is not 50% and the problem becomes weird.
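To put a number on the finite case (a sketch; the $1-$100 range is just an assumed example): if both wallets are uniform on $1 to $X, the chance of winning given what you hold is nowhere near a flat 50%.

```python
# Sketch, assuming wallets uniform on $1..$100: the chance of winning,
# given your own amount, is the chance the opponent holds strictly more,
# and it shrinks as your own amount grows.
X = 100  # assumed upper bound
for mine in (1, 25, 50, 75, 100):
    p_win = (X - mine) / X
    print(f"your amount = ${mine}: P(win) = {p_win:.2f}")
```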

gaming_mouse
05-16-2005, 05:20 PM
[ QUOTE ]


If you assume a random number from 0 to infinity,

[/ QUOTE ]

You cannot have a uniformly distributed random number between 0 and infinity. IMO, this is the "flaw" in the argument.

disjunction
05-16-2005, 06:01 PM
[ QUOTE ]
You cannot have a uniformly distributed random number between 0 and infinity. IMO, this is the "flaw" in the argument.

[/ QUOTE ]

Yeah, I was going to put "(whatever that means)" in parentheses, but I thought it was redundant with me saying the problem becomes weird. It does not present a flaw in my argument, because I gave the finite case, and the problem remains undefined for the infinite case.

EDIT: Just realized that you're probably referring to the original post. BTW, "problem becomes weird" is a technical term :)

gaming_mouse
05-16-2005, 06:07 PM
[ QUOTE ]
Just realized that you're probably referring to the original post.

[/ QUOTE ]

yeah, i was. i was basically just clarifying what you were getting at with "weird."

chaosuk
05-16-2005, 09:44 PM
It can happen if both players play sub-optimally w.r.t. game theory.

Suppose, in a similar game, both players are randomly assigned a number between 1 and 10. They must independently decide whether or not to put up an amount of cash to bet against the other person's number, highest wins. Say both players always play 7, 8, 9 and 10, and both happen to be given a 9. They are both making +EV bets by agreeing to play, since, given that the other player is willing to bet, they expect to win more often than they lose. But of course they are both playing sub-optimal strategies, since each could improve his game by not playing the 7, which never wins against a 7-10 range. Carrying that logic through, we end up with both players only ever playing the 10. Do you think they need a blind structure?

chaos
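To check the 7-through-10 example numerically, here is a rough sketch (the $1 even-money stake and treating ties as a push are assumptions). Conditional on the opponent being willing to bet, a 9 does indeed show a profit, and by symmetry an opponent holding a 9 sees exactly the same thing.

```python
from fractions import Fraction

# Sketch of the side game above, assuming a $1 even-money stake and ties as a push.
opponent_range = [7, 8, 9, 10]   # hands the opponent is willing to play
my_card = 9
stake = 1

ev = Fraction(0)
for opp in opponent_range:
    p = Fraction(1, len(opponent_range))  # each played hand equally likely
    if my_card > opp:
        ev += p * stake
    elif my_card < opp:
        ev -= p * stake
    # my_card == opp: push, no money changes hands

print(ev)  # 1/4 of a stake -- a 9 looks +EV once the opponent agrees to play
```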

Jazza
05-16-2005, 10:20 PM
if player A wins, won't he win less than X not more than X?

Thythe
05-16-2005, 10:29 PM
[ QUOTE ]
if player A wins, won't he win less than X not more than X?

[/ QUOTE ]

No, since you only win if the other player has more than you. Interestingly, though, if you switch the game around and say you win when you have more than the other player, this game would appear to be -EV for both players. Clearly this can not be true either.

chaosuk
05-16-2005, 11:10 PM
'.....this game would appear to be -EV for both players. Clearly this can not be true either.'


There is an important distinction to be made between a game being negative EV for both players and both players making negative-EV decisions. It comes down to what the observer knows.

It's easy to illustrate; it happens in poker all the time. One player decides to make a very rare bluff, but the chances of it winning are remote since his opponent will almost certainly call. His decision to bet is -EV. His opponent, however, also made a -EV call, since this opponent bluffs so rarely. Both players made -EV decisions relative to the hands actually involved. Without perfect information, both players can make +EV decisions or -EV decisions.

signing off.

alThor
05-16-2005, 11:14 PM
[ QUOTE ]
Player A reasons that he has a 50% chance of winning this game and it is thus +EV for him to play. Player B reasons the same...how can this be?

[/ QUOTE ]

This is related to the paradox where there are two envelopes, one containing twice as much as the other. The bottom line is that we cannot assume that the probability of winning is independent of the amounts in our wallets.

alThor

NMcNasty
05-17-2005, 03:53 AM
This is the same as the exchange paradox (the one with the envelopes). The problem arises because you are using X as a variable for two different values. Since in one case X is larger than in the other, you would basically be saying that X > X, which is logically impossible. If you use numbers the paradox is a lot clearer:
Case 1: You have $1, your opponent has $2. The $1 is already yours, so the exchange only gains you his $2.
Case 2: You have $2, your opponent has $1. You lose the $2 you started with.
Don't add your own wallet to what you win, because that money is already yours. You aren't starting from zero, collecting the money in your own wallet, and then adding or subtracting on top of that. You are simply adding to or subtracting from zero, because the exchange nets you either a gain or a loss.

So your EV is Pr(case 1)*($2) + Pr(case 2)*(-$2).
It's a premise of the original problem that it's a coin flip as to which case will happen, although apparently that part wasn't clear. So the equation becomes
0.5*($2) + 0.5*(-$2) = $0.
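The same bookkeeping in a couple of lines (just restating the two cases above, with the assumed 50/50 split):

```python
# The two equally likely cases above, measured from zero:
cases = [
    (0.5, +2),   # Case 1: you hold $1, opponent holds $2 -> you gain his $2
    (0.5, -2),   # Case 2: you hold $2, opponent holds $1 -> you lose your $2
]
ev = sum(p * outcome for p, outcome in cases)
print(ev)  # 0.0
```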

PairTheBoard
05-17-2005, 05:51 AM
Put simply, when you win you win the larger amount, but when you lose you lose the larger amount.

Same thing with the envelope switching problem.

PairTheBoard

Jazza
05-17-2005, 04:27 PM
[ QUOTE ]
[ QUOTE ]
if player A wins, won't he win less than X not more than X?

[/ QUOTE ]

No, since you only win if the other player has more than you. Interestingly, though, if you switch the game around and say you win when you have more than the other player, this game would appear to be -EV for both players. Clearly this can not be true either.

[/ QUOTE ]

oh my bad, i thought the case was the latter

i should learn to read good..