The Two Envelope Paradox: An Experiment in EV


08-27-2005, 11:13 PM
From http://www.anc.ed.ac.uk/~amos/doubleswap.html:

You are taking part in a game show. The host introduces you to two envelopes. He explains carefully that you will get to choose one of the envelopes, and keep the money that it contains. He makes sure you understand that each envelope contains a cheque for a different sum of money, and that in fact, one contains twice as much as the other. The only problem is that you don't know which is which.

The host offers both envelopes to you, and you may choose which one you want. There is no way of knowing which has the larger sum in, and so you pick an envelope at random (equiprobably). The host asks you to open the envelope. Nervously you reveal the contents to contain a cheque for 40,000 pounds.

The host then says you have a chance to change your mind. You may choose the other envelope if you would rather. You are an astute person, and so do a quick sum. There are two envelopes, and either could contain the larger amount. As you chose the envelope entirely at random, there is a probability of 0.5 that the larger check is the one you opened. Hence there is a probability 0.5 that the other is larger. Aha, you say. You need to calculate the expected gain due to swapping. Well the other envelope contains either 20,000 pounds or 80,000 pounds equiprobably. Hence the expected gain is 0.5x20000+0.5x80000-40000, ie the expected amount in the other envelope minus what you already have. The expected gain is therefore 10,000 pounds. So you swap.

Does that seem reasonable? Well maybe it does. If so consider this. It doesn't matter what the money is, the outcome is the same if you follow the same line of reasoning. Suppose you opened the envelope and found N pounds in the envelope, then you would calculate your expected gain from swapping to be 0.5(N/2)+0.5(2N)-N = N/4, and as this is greater than zero, you would swap.

But if it doesn't matter what N actually is, then you don't actually need to open the envelope at all. Whatever is in the envelope you would choose to swap. But if you don't open the envelope then it is no different from choosing the other envelope in the first place. Having swapped envelopes you can do the same calculation again and again, swapping envelopes back and forward ad-infinitum. And that is absurd.

That is the paradox. A simple mathematical puzzle. The question is: What is wrong? Where does the fallacy lie, and what is the problem?
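A quick sanity check on that calculation is to simulate the game with one fixed pair of amounts and compare always switching with never switching. This is only a minimal Python sketch; the 40,000/80,000 pair is an arbitrary choice for the illustration.

import random

def average_payoff(switch, amounts=(40000, 80000), trials=100000):
    # One fixed pair of amounts; you pick one envelope at random and either keep it or switch.
    total = 0
    for _ in range(trials):
        picked, other = random.sample(amounts, 2)
        total += other if switch else picked
    return total / trials

print(average_payoff(switch=False))   # about 60000, i.e. (40000 + 80000) / 2
print(average_payoff(switch=True))    # also about 60000 -- switching gains nothing on average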

BluffTHIS!
08-28-2005, 12:10 AM
Read King Yao's Weighing the Odds in Hold'em Poker for the chapter entitled "The Monty Hall Problem". Same thing.

JoshuaD
08-28-2005, 12:29 AM
[ QUOTE ]
Read King Yao's Weighing the Odds in Hold'em Poker for the chapter entitled "The Monty Hall Problem". Same thing.

[/ QUOTE ]

I understand that problem, but I don't see the connection.

edit: O I C.

08-28-2005, 12:42 AM
[ QUOTE ]
Read King Yao's Weighing the Odds in Hold'em Poker for the chapter entitled "The Monty Hall Problem". Same thing.

[/ QUOTE ]

I don't think it's the same thing. It is always correct to rechoose in the "Monty Hall problem" as you increase your odds of success from 1/3 to 1/2. The OP has somehow applied this reasoning to a dilemma where I don't think you can do that. Your odds stay 1/2 in the OP. I don't have the math knowledge to show why the OP is wrong, but I think it is. Your average return on rechoosing envelopes should be zero. Maybe not opening the first envelope is the problem, I don't know.

BluffTHIS!
08-28-2005, 02:41 AM
You are right. It isn't the same, as I just reread the OP. You have a guaranteed win here which isn't the case in the MH problem. If you are always offered the choice of the second envelope, then it doesn't matter which one you pick or whether you reselect.

PLOlover
08-28-2005, 03:37 AM
http://mathproblems.info/prob6s.htm
[ QUOTE ]
Suppose the envelopes contained $1 and $2. By switching you would either gain $1 or lose $1. Regardless of the actual amounts in the envelopes the point remains the same, that the average gain of the two possibilities is zero.


[/ QUOTE ]

+1 or -1

EV of switching when you see 1: 0.5(0.5) + 0.5(2) - 1 = +0.25 (the other envelope is 1/2 or 2; you give up 1)
EV of switching when you see 2: 0.5(1) + 0.5(4) - 2 = +0.50 (the other envelope is 1 or 4; you give up 2)

So it seems that a gain of +1 or -1 is being represented as double or half.

What we have is two variables turning into three. For example, if we get the envelope with 2 dollars, the possible amounts are 1, 2, 4.

But they should really be represented as -1, 0, 1 or 1, 2, 3. Note how with -1, 0, 1 you can gain or lose one.

Somehow we have to come up with a metric where half-or-double is +1/-1.
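A tiny enumeration of the quoted $1/$2 case (a Python sketch, using only the amounts from the quote) shows the symmetric +1/-1 gain directly:

# The pair is fixed at ($1, $2); you pick one envelope at random and always switch.
cases = [(1, 2), (2, 1)]                        # (amount picked, amount in the other envelope)
gains = [other - picked for picked, other in cases]
print(gains, sum(gains) / len(gains))           # [1, -1] 0.0 -- the average gain from switching is zero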

08-28-2005, 03:59 AM
The answer here is a bit tricky.

If you switch no matter what amount you see in the first envelope, your EV overall will remain unchanged.

The key here is that the game-maker cannot make it so that there will always be a 50/50 chance that the other envelope holds the larger amount, given the amount that you see in the first envelope.

Here's what I mean: Suppose he includes $10 as a possible amount. That means he has to include $5 as a possible amount, or else when you have $10 you will know the other envelope is $20. This also means he has to include $2.50, and $1.25, and $0.625, etc.

Same thing for the high side: he will have to include $20, and therefore $40, and therefore $80, etc.

And they will all have to have the same probability. This is the only way to ensure that, given one amount in one envelope, there is a 50/50 chance that the other envelope is double. (Actually, you could alternate probabilities, so that only the amounts 4x and (1/4)x have to be the same probability, but the same problem remains.)

The problem here is that if someone created a game to ensure this 50/50 rule, there would be no upper bound on the amount of money that could be in the envelope, and in fact the average amount of money in an envelope would be infinite.

So, in real life, you may be able to guess roughly what the game-maker's upper and lower limits are, and therefore make an EV-maximizing decision, which may be either switching (probably because you think the amount is on the low side of his range) or not switching (probably because you think there's a reasonable chance you have hit the upper limit already).

I suck bad at explaining things, so I apologize if I confused anyone
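To make the argument above concrete, here is a small Python sketch with an arbitrary bounded prior over pairs (the specific pairs and probabilities are made up for the illustration): with a bounded set of possible pairs, the chance that the other envelope is double cannot be 50/50 at every observed amount, and always switching gains nothing overall.

from fractions import Fraction

# Five possible pairs, each chosen with probability 1/5: (1, 2), (2, 4), (4, 8), (8, 16), (16, 32).
pairs = [(2**k, 2**(k + 1)) for k in range(5)]
p_pair = Fraction(1, 5)

# joint[(seen, other)] = P(you observe `seen` and the other envelope holds `other`)
joint = {}
for lo, hi in pairs:
    joint[(lo, hi)] = joint.get((lo, hi), 0) + p_pair / 2    # you happened to pick the low envelope
    joint[(hi, lo)] = joint.get((hi, lo), 0) + p_pair / 2    # you happened to pick the high envelope

for seen in sorted({a for a, _ in joint}):
    outcomes = {b: p for (a, b), p in joint.items() if a == seen}
    total = sum(outcomes.values())
    p_double = outcomes.get(2 * seen, 0) / total
    gain = sum(p * (b - seen) for b, p in outcomes.items()) / total
    print(seen, "P(other is double) =", p_double, " E[gain from switching] =", gain)

# The overall expected gain from always switching is exactly zero.
print(sum(p * (b - a) for (a, b), p in joint.items()))

At the middle amounts the 50/50 holds, but at the top amount the "double" probability is 0, and that one case cancels all the apparent gains elsewhere.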

PairTheBoard
08-28-2005, 06:12 AM
The explanation for why the EV calculation is wrong is simple but uncomfortable for a lot of people.

Once the First Envelope is opened there is no longer a 50% probability the Second Envelope contains double and a 50% probability the Second Envelope contains half.

After the First Envelope is opened there is Now Either a 100% probability the Second Envelope contains Double, OR a 100% probability the Second Envelope contains Half, we just don't know which.

Most people refuse to accept this but that's the way it is. If you apply this principle in less clearly necessary ways you are likely to actually get Flamed for it, as I was when I did so in this Probability Forum Thread.

Which Twin Has the Tony? (http://forumserver.twoplustwo.com/showthreaded.php?Cat=&Number=2750683&page=&view=&sb=5&o=)

PairTheBoard

Jman28
08-28-2005, 07:53 AM
Interesting side-problem (http://www.csua.berkeley.edu/~emin/writings/more_envelopes.html)

08-28-2005, 08:07 AM
[ QUOTE ]
Interesting side-problem (http://www.csua.berkeley.edu/~emin/writings/more_envelopes.html)

[/ QUOTE ]

Agreed, that is interesting

PairTheBoard
08-28-2005, 09:02 AM
[ QUOTE ]
Interesting side-problem (http://www.csua.berkeley.edu/~emin/writings/more_envelopes.html)

[/ QUOTE ]

Ok, how's this for an intuitive psychological reconciliation for the side problem?

Before opening the envelopes you have an infinite expected value for both envelopes. It's almost a shame you have to actually take the money in one of the envelopes because as soon as you do, your expected infinite wealth is reduced to a paltry finite amount. This is why after opening the First Envelope you always want to switch. Because with the uncertainty of the Second Envelope you still have a tiny bit of that infinity left in it.

PairTheBoard

Jim T
08-28-2005, 06:23 PM
[ QUOTE ]
The explanation for why the EV calculation is wrong is simple but uncomfortable for a lot of people.

Once the First Envelope is opened there is no longer a 50% probability the Second Envelope contains double and a 50% probability the Second Envelope contains half.

After the First Envelope is opened there is Now Either a 100% probability the Second Envelope contains Double, OR a 100% probability the Second Envelope contains Half, we just don't know which.

Most people refuse to accept this but that's the way it is. If you apply this principle in less clearly necessary ways you are likely to actually get Flamed for it, as I was when I did so in this Probability Forum Thread.

Which Twin Has the Tony? (http://forumserver.twoplustwo.com/showthreaded.php?Cat=&Number=2750683&page=&view=&sb=5&o=)

PairTheBoard

[/ QUOTE ]

Isn't that essentially the same as Schrodinger's cat paradox (http://www.windows.ucar.edu/tour/link=/kids_space/scat.html&edu=high)? The cat is either alive or dead. We just don't know which.

Piz0wn0reD!!!!!!
08-28-2005, 07:47 PM
This has been posted 09238809274 times already. The way I see it is that there is no risk involved, therefore you are freerolling.

PairTheBoard
08-29-2005, 06:44 AM
[ QUOTE ]
[ QUOTE ]
The explanation for why the EV calculation is wrong is simple but uncomfortable for a lot of people.

Once the First Envelope is opened there is no longer a 50% probability the Second Envelope contains double and a 50% probability the Second Envelope contains half.

After the First Envelope is opened there is Now Either a 100% probability the Second Envelope contains Double, OR a 100% probability the Second Envelope contains Half, we just don't know which.

Most people refuse to accept this but that's the way it is. If you apply this principle in less clearly necessary ways you are likely to actually get Flamed for it, as I was when I did so in this Probability Forum Thread.

Which Twin Has the Tony? (http://forumserver.twoplustwo.com/showthreaded.php?Cat=&Number=2750683&page=&view=&sb=5&o=)

PairTheBoard

[/ QUOTE ]

Isn't that essentially the same as Schrodinger's cat paradox (http://www.windows.ucar.edu/tour/link=/kids_space/scat.html&edu=high)? The cat is either alive or dead. We just don't know which.

[/ QUOTE ]

I'm not sure. What do you think?

PairTheBoard

PairTheBoard
08-29-2005, 06:49 AM
[ QUOTE ]
Interesting side-problem (http://www.csua.berkeley.edu/~emin/writings/more_envelopes.html)

[/ QUOTE ]

What do you think about the Side Problem Jman? Two guys hold the two envelopes. Both look inside. They switch and Both Improve their EV in the switch? Kind of mind boggling.

PairTheBoard

08-29-2005, 07:58 AM
Heya PairTheBoard,

What you say seems, to me, to sweep under the carpet the fact that one event (the first) is random according to probabilities, while the other is not, as it is dependent on/conditioned by the first (you must take the envelope that's left; no choice or probabilities involved here!) and therefore has no more or less probability than the original event.

Excuse my butting in, but I am a newbie and willing to learn.

Cheers

MidGe

PairTheBoard
08-29-2005, 08:52 AM
[ QUOTE ]
Heya PairTheBoard,

What you say seems, to me, to sweep under the carpet the fact that one event (the first) is random according to probabilities, while the other is not, as it is dependent on/conditioned by the first (you must take the envelope that's left; no choice or probabilities involved here!) and therefore has no more or less probability than the original event.

Excuse my butting in, but I am a newbie and willing to learn.

Cheers

MidGe

[/ QUOTE ]

Hi MidGe. Are you talking about the Side-problem (http://www.csua.berkeley.edu/~emin/writings/more_envelopes.html)?

If so I'm not exactly sure what "Events" you mean. The first person opens his envelope. On the information he has available he correctly - for the side problem - computes his expectation for the switch to be greater than what he sees in his envelope. The second person looks in his envelope. Based on the information he has available he correctly computes his expectation for the switch to be greater than what he sees in His envelope. So they switch. What happens? I'm really not sure how to make sense of it at this point. Say the First Guy sees the amount Y. I think it's true that if the experiment is repeated then for all those times the First Guy sees Y and switches, he will gain on average 2/9 Y. The same is true for the Second Guy. If he sees Z then on average he will gain 2/9 Z by switching. Clearly, both guys can't gain at the same time. And whoever improves is improving a different Envelope Value than the other Guy is hoping to improve. And certainly it must be true that the Total amounts in the Envelopes don't change by switching.

This really is a more interesting version than the original. It also reveals a flaw in logic that we are making in the Original. In the Original I think we've all been saying that because the Unconditional Switch cannot improve your EV, Therefore the Conditional Switch Logically shouldn't be able to improve your EV. The Side Problem shows this doesn't necessarily follow - Even though we are still correct in our conclusion for the Original that the EV calculation There Is wrong.

For the Side Problem, I'm open to suggestions for how best to explain the Dual Switch thing.
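For reference, the 2/9 Y figure falls out of a concrete prior. Assuming the side problem uses the usual construction consistent with the 11/9 Y figure quoted later in the thread (pair n holds (3^n, 3^(n+1)) and is chosen with probability (1/2)^(n+1), which is an assumption here, not something restated from the linked page), this Python sketch computes the conditional gain from switching when the opened envelope shows Y = 3^k:

from fractions import Fraction

def assumed_prior(n):
    # Assumed construction: pair n holds (3**n, 3**(n+1)) with probability (1/2)**(n+1).
    return Fraction(1, 2**(n + 1))

def gain_from_switching(k):
    # Expected gain from switching when the opened envelope shows Y = 3**k.
    Y = 3**k
    if k == 0:
        return 3 * Y - Y                      # seeing 1, the other envelope must hold 3
    w_low = assumed_prior(k - 1) / 2          # Y is the high envelope of pair k - 1
    w_high = assumed_prior(k) / 2             # Y is the low envelope of pair k
    p_up = w_high / (w_low + w_high)          # works out to 1/3
    return p_up * (3 * Y) + (1 - p_up) * Fraction(Y, 3) - Y

for k in range(1, 6):
    Y = 3**k
    print(Y, gain_from_switching(k), gain_from_switching(k) / Y)   # the last column is always 2/9

Each player runs this same calculation on his own observed amount, which is what makes the mutual "improvement" so jarring.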

PairTheBoard

Jman28
08-29-2005, 09:49 AM
[ QUOTE ]
[ QUOTE ]
Interesting side-problem (http://www.csua.berkeley.edu/~emin/writings/more_envelopes.html)

[/ QUOTE ]

What do you think about the Side Problem Jman? Two guys hold the two envelopes. Both look inside. They switch and Both Improve their EV in the switch? Kind of mind boggling.

PairTheBoard

[/ QUOTE ]

Yeah. I am a bit baffled. I don't think I'm willing to accept that they can both improve EV, but as of now, I have no argument against it other than "it doesn't make sense."

After reading the second resolution, I was almost ready to accept it, as anytime infinity is involved, my logic seems to not account for everything. But then he goes on to show it in an example without infinity.

I am starting to understand the 'looking before switching increases EV' idea. I agree logically that switching before looking and staying before looking have the same EV, while switching after looking and staying after looking do not. This is pretty interesting to me. But it's only due to the fact that the times you get a 1, you clearly shouldn't stay.

I still am not sure though that:
-always switching after looking, and
-always staying after looking unless Y=1 (then switch)
would yield different EVs.

I'm gonna try and email a prof. I had for a Bayesian reasoning class. He's a smart guy. And where are you at, Sklansky?

gumpzilla
08-29-2005, 11:12 AM
[ QUOTE ]
Yeah. I am a bit baffled. I don't think I'm willing to accept that they can both improve EV, but as of now, I have no argument against it other than "it doesn't make sense."

[/ QUOTE ]

Disclaimer: I haven't really read this thread, but I've thought about this a reasonable amount before.

It becomes pretty clear when you say that the envelopes have x and 2x dollars in them. One guy will always lose x dollars when you switch, the other gains x dollars. It's breakeven, at this point. If you prefer, look at it another way. The other guy has either x/2 or 2x. If he has x/2, then he gains x/2 when you switch. If he has 2x, then he loses x. So it is -EV to him when you switch. Where's the asymmetry? If he has 2x, it is not possible for you to have 4x, so he only gets to go down. This is the same kind of reasoning that you employ to show that switching twice is neutral EV. I'll demonstrate real quickly.

Suppose I'm given x dollars, and the other envelope has 2x or x/2 dollars. Then switching has an EV of + x/4. What about switching again? Well, half of the time I'll lose x dollars (when I correctly switched before) and half of the time I'll gain x/2. So switching after I've switched once just brings me exactly back to neutral, as it should since I have the original envelope.
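Taking the post's 50/50 framing at face value, the switch-then-switch-back arithmetic checks out in a few lines (a sketch only; x = 1 is arbitrary):

x = 1.0                                                 # whatever the first envelope holds
first_switch = 0.5 * (2 * x) + 0.5 * (x / 2) - x        # +x/4 under the naive 50/50 assumption
switch_back = 0.5 * (-x) + 0.5 * (x / 2)                # -x/4: give back x half the time, regain x/2 the other half
print(first_switch, switch_back, first_switch + switch_back)   # 0.25 -0.25 0.0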

hmkpoker
08-29-2005, 12:27 PM
I thought of two variants of this game that illustrate my point. Both assume that the player understands that the switch is +EV, and does not adapt to any patterns in the game (because if one did, then there obviously COULD be a +EV strategy)

FIRST: Suppose there are two pairs of envelopes: a $1/2 pair, and a $2/4 pair. Player is always given a $2 envelope, and naturally always elects to switch. He ends up making $.50 a hand. +EV the whole way.

SECOND: The same two-pairs-of-envelopes situation applies, but this time he can be given any one of the four envelopes, and he must switch on each one. In this case, however, his expectation is break-even!

Let's look at this for a minute.

If he gets a $1 envelope, his EV on the switch is $1. He will definitely switch to $2.

If he gets a $2 envelope, his EV on the switch is $.50, for reasons already stated.

BUT, if he gets a $4 envelope, his EV on the switch is -$2.00, making this system completely break-even.

Now, if our hero played long enough to figure out the pattern, he would stand on a $4, making the EV of the non-switch $0, and his overall EV in the game ($1+$.50+$.50+$0)/4 = $.50 a hand, again.

What happened here!

Our player's edge came from understanding the parameters of the game. Always switching in the first scenario is obviously correct. Switching is done strategically in the second situation, though.
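A short enumeration of the two scenarios above (a Python sketch using the $1/$2 and $2/$4 pairs from this post):

pairs = [(1, 2), (2, 4)]
# Every equally likely deal: one pair is chosen, then one of its two envelopes is handed over.
deals = [(mine, other) for lo, hi in pairs for mine, other in ((lo, hi), (hi, lo))]

# Scenario 1: always handed a $2 envelope, always switch.
twos = [(m, o) for m, o in deals if m == 2]
print(sum(o - m for m, o in twos) / len(twos))                         # 0.5 per hand

# Scenario 2: handed any of the four envelopes, always switch.
print(sum(o - m for m, o in deals) / len(deals))                       # 0.0 -- break-even

# Scenario 2 again, standing on $4 and switching otherwise.
print(sum(o - m if m != 4 else 0 for m, o in deals) / len(deals))      # 0.5 per hand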

Now let's examine why this "paradox" is so counter-intuitive. We assume there are no cues as to what the upper-limit of the game might be. Suppose the fellow in our example received a check for over a million pounds, and that was more than thrice any check he'd ever seen in the game. If his strategy is to switch, and if his strategy actually pays off (remember, an error here is a HUGE mistake), then we are left to conclude that a player's edge is, in practical terms, incalculable, because for a player to receive absolutely no cues as to what the upper-limit might be, the player must receive a fair distribution of numbers ranging from zero to infinity. The mere fact that the player would be getting so much money allows us to completely ignore colossal reverse-implied errors (like switching from $4 to $2, an error so egregious that it completely negates the edge), and the player will go on to become an infinity-aire.

Now.

Let's look at one other factor that was ignored.

Relative to the envelope the player received, the player may have already committed a -EV error. By "committed," I mean that he is destined to fail here. If a masterful online heads-up limit player picks up AK, and his opponent picks up AA, and the flop brings A-K-2, even the best strategy will leave our big slick master crippled. In much the same way, picking up the $4 envelope dooms our hero to -EV failure. This negative edge will cripple the supposed +EV of switching.


Remember, we usually just deal with EV in terms of games with more concrete parameters. When the game begins to utilize infinite parameters, we cannot gain a calculable edge.

hmkpoker
08-29-2005, 12:32 PM
Let's suppose our hero is in for a different, and much more deadly, game. The envelope he receives may be a nice fat paycheck, but it could also be an equally large debt!

In terms of EV, and by the logic proposed, he should switch on checks, and stand on debts. However, because we have infinite parameters, we are left to conclude that he will make and lose his entire fortune in one hand an infinite number of times. He will still fall victim to the reverse implied odds death trap on certain debts, and will end up with an edge of +0 and an incalculably high SD.

PairTheBoard
08-29-2005, 01:01 PM
[ QUOTE ]
Jman --
After reading the second resolution, I was almost ready to accept it, as anytime infinity is involved, my logic seems to not account for everything. But then he goes on to show it in an example without infinity.


[/ QUOTE ]

Notice in the Finite example he doesn't get an "Always Switch" strategy. In fact, I don't think it's possible to get an "Always Switch" strategy when the probability distribution for the Envelope amounts has finite expectation.
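One way to see that claim concretely, staying within the same family of priors as the side problem (an illustrative sketch, not the linked page's own argument): give pair n, holding (3^n, 3^(n+1)), prior weight proportional to q^n. The prior mean of the envelope amounts is finite only when q < 1/3, and in exactly those cases switching is -EV for every observed amount above the minimum, so "always switch" can no longer be right.

from fractions import Fraction

def gain_as_fraction_of_Y(q):
    # Conditional gain from switching, as a fraction of the observed amount Y = 3**k (k >= 1),
    # when pair n = (3**n, 3**(n+1)) has prior weight proportional to q**n.
    p_up = q / (1 + q)                        # P(Y is the low envelope of its pair | Y observed)
    return p_up * 3 + (1 - p_up) * Fraction(1, 3) - 1

for q in (Fraction(1, 2), Fraction(1, 3), Fraction(1, 4)):
    finite_mean = q < Fraction(1, 3)          # prior mean behaves like a sum of (3q)**n, finite iff q < 1/3
    print(q, gain_as_fraction_of_Y(q), "finite prior mean:", finite_mean)

# q = 1/2 -> gain +2/9 of Y (always switch, but the prior mean is infinite)
# q = 1/3 -> gain 0 (the borderline case)
# q = 1/4 -> gain -2/15 of Y (finite mean, and switching loses for every Y above the minimum)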

PairTheBoard

PairTheBoard
08-29-2005, 01:10 PM
[ QUOTE ]
[ QUOTE ]
Yeah. I am a bit baffled. I don't think I'm willing to accept that they can both improve EV, but as of now, I have no argument against it other than "it doesn't make sense."

[/ QUOTE ]

Disclaimer: I haven't really read this thread, but I've thought about this a reasonable amount before.

It becomes pretty clear when you say that the envelopes have x and 2x dollars in them. One guy will always lose x dollars when you switch, the other gains x dollars. It's breakeven, at this point. If you prefer, look at it another way. The other guy has either x/2 or 2x. If he has x/2, then he gains x/2 when you switch. If he has 2x, then he loses x. So it is -EV to him when you switch. Where's the asymmetry? If he has 2x, it is not possible for you to have 4x, so he only gets to go down. This is the same kind of reasoning that you employ to show that switching twice is neutral EV. I'll demonstrate real quickly.

Suppose I'm given x dollars, and the other envelope has 2x or x/2 dollars. Then switching has an EV of + x/4. What about switching again? Well, half of the time I'll lose x dollars (when I correctly switched before) and half of the time I'll gain x/2. So switching after I've switched once just brings me exactly back to neutral, as it should since I have the original envelope.

[/ QUOTE ]

We decided to confuse ourselves, gumpzilla, by moving on to a new, improved version of the Two Envelopes problem Here (http://www.csua.berkeley.edu/~emin/writings/more_envelopes.html).

In the new version there is a probability distribution defined for the amounts put in the envelopes, thus making it possible to compute the conditional probabilities based on the observed Envelope amount.

PairTheBoard

hmkpoker
08-29-2005, 01:35 PM
The same could be said of hitting your gutshot...there is either a 100% probability that you'll spike the jack, or a 100% probability that you won't. Obviously this is strategic suicide.

PairTheBoard
08-29-2005, 01:45 PM
[ QUOTE ]
The same could be said of hitting your gutshot...there is either a 100% probability that you'll spike the jack, or a 100% probability that you won't. Obviously this is strategic suicide.

[/ QUOTE ]

You could say that. And if that's ALL you knew about the card to be dealt you would NOT be able to compute the EV of the play.

PairTheBoard

Jman28
08-29-2005, 02:29 PM
[ QUOTE ]
[ QUOTE ]
Jman --
After reading the second resolution, I was almost ready to accept it, as anytime infinity is involved, my logic seems to not account for everything. But then he goes on to show it in an example without infinity.


[/ QUOTE ]

Notice in the Finite example he doesn't get an "Always Switch" strategy. In fact, I don't think it's possible to get an "Always Switch" strategy when the probability distribution for the Envelope amounts has finite expectation.

PairTheBoard

[/ QUOTE ]

Well, obviously you shouldn't switch when you get the maximum possible in the finite example.

Maybe I'm confused. What exactly did he show in the finite example? I thought it showed that the EV of always switching is 11/9 * Y.

He does say at the end of it:
[ QUOTE ]
Thus the expected wealth for switching exceeds the expected wealth for staying by 2/9 Y and so we should always switch.

[/ QUOTE ]

housenuts
08-29-2005, 03:50 PM
would switching not be like making a 50/50 bet yet receiving 2:1 odds?

we'll say you open the first envelope and it's $40,000. so you know the other envelope is either $20,000 or $80,000.

by switching you are essentially wagering $20,000 to win $40,000 on a 50/50 shot. mathematically it's a sound bet, but it depends how much gamble you have in you.

Piers
08-29-2005, 04:38 PM
This appears to get asked about once a month.

The answer is that it is not possible to have a uniform distribution over an infinite interval.

If you try to model the problem by giving each pair of possible amounts ($N,$2N) equal probabilities where N ranges across the interval zero to infinity, you cannot get the model to follow the axioms of a probability space. Hence using the word probability in this context is confusing and not justified.

The resultant paradox of getting infinite expectation by keeping on switching is much the same as showing any number equals any other number by dividing by zero.

gumpzilla
08-29-2005, 04:51 PM
Yeah, I've seen that variation before. But I don't think it actually makes that big a difference, as I recall. The key is that unless the amount in your envelope is precisely 1, you're always in a 3x vs. 1/3 x situation and you just have to figure out the weighting. So I think it's all very similar to the more traditional example but with a slight extra layer of mathematical misdirection. Again, I'm pretty busy today, so I only have a small amount of time to waste on 2+2 and didn't read the whole thing. It looks like it's talking about the infinite value of the game and the concept of utility. When I worked on that problem before, I found that the value of the game converges quite nicely if you pick a utility function that goes either logarithmically or as the square root of the expected value.
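The convergence gumpzilla mentions is easy to check numerically, again assuming the (3^n, 3^(n+1)) pairs with probability (1/2)^(n+1) guessed above for the side problem (an assumption, not something stated on the linked page). With log utility the partial sums settle down even though the expected dollar amount keeps growing:

import math

def partial_sums(terms):
    # Partial sums of E[amount] and E[log(amount)] under the assumed prior:
    # pair n = (3**n, 3**(n+1)) with probability (1/2)**(n+1), envelope picked at random.
    ev = eu = 0.0
    for n in range(terms):
        p = 0.5**(n + 1)
        for amount in (3.0**n, 3.0**(n + 1)):
            ev += p * 0.5 * amount             # grows without bound: terms behave like (3/2)**n
            eu += p * 0.5 * math.log(amount)   # converges: terms shrink like n / 2**n
    return ev, eu

print(partial_sums(30))
print(partial_sums(60))    # the log-utility column barely moves; the EV column keeps exploding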

Jman28
08-29-2005, 05:36 PM
[ QUOTE ]
The key is that unless the amount in your envelope is precisely 1, you're always in a 3x vs. 1/3 x situation and you just have to figure out the weighting.

[/ QUOTE ]

But the weighting is already there for you, because of the prior probabilities of the coin flip. That seems to make this example stronger.

Masquerade
08-29-2005, 06:23 PM
It doesn't make any difference whether you swap or not.

Consider doing the experiment a large number of times with N and 2N in the envelopes with two groups: stickers and flippers.

Half the stickers will originally choose N, the other half 2N so the EV is 3/2 N.

Half the flippers will originally choose N and flip to 2N, the other half will originally choose 2N and flip to N, so the EV is 3/2 N.

All intermediate strategies also have EV of 3/2 N.

gumpzilla
08-29-2005, 07:38 PM
[ QUOTE ]

But the weighting is already there for you, because of the prior probabilities of the coin flip. That seems to make this example stronger.

[/ QUOTE ]

Sure, I realize that. My point was that there aren't really fundamental differences. The old "not enough money in the universe" argument can still theoretically apply, for example.

PairTheBoard
08-30-2005, 01:16 AM
[ QUOTE ]
Jman --
Maybe I'm confused. What exactly did he show in the finite example? I thought it showed that the EV of always switching is 11/9 * Y.


[/ QUOTE ]

That was the original example he gave, where the expected value of the Envelope amounts is infinite.

In the finite EV example he shows how you can still have a switching strategy with Finite Envelope EV, but it won't be an ALWAYS Switch strategy.

PairTheBoard

PairTheBoard
08-30-2005, 01:19 AM
[ QUOTE ]
would switching not be like making a 50/50 bet yet receiving 2:1 odds?

we'll say you open the first envelope and it's $40,000. so you know the other envelope is either $20,000 or $80,000.

by switching you are essentially wagering $20,000 to win $40,000 on a 50/50 shot. mathematically it's a sound bet, but it depends how much gamble you have in you.

[/ QUOTE ]

In the OP Two Envelope Problem, it's Not a 50/50 shot. That's the thing that's hard to accept.

PairTheBoard

PairTheBoard
08-30-2005, 01:27 AM
[ QUOTE ]
This appears to get asked about once a month.

The answer is that it is not possible to have a uniform distribution over an infinite interval.

If you try to model the problem by giving each pair of possible amounts ($N,$2N) equal probabilities where N ranges across the interval zero to infinity, you cannot get the model to follow the axioms of a probability space. Hence using the word probability in this context is confusing and not justified.

The resultant paradox of getting infinite expectation by keeping on switching is much the same as showing any number equals any other number by dividing by zero.

[/ QUOTE ]

That Would be an inconsistency in the problem as it is often stated, with the amounts in the Envelopes chosen "At Random". The OP simplifies this version by omitting that phrase. There is no mention of how the Envelope amounts came to be what they are. Any talk of a prior distribution for the Envelope amounts is going beyond the bounds of the problem as stated by the OP. You're left with two Fixed Envelope amounts. If the game is to be repeated with other audience members, the same two Fixed Envelope amounts will be used. That's the game. And that version of the game forces the Uncomfortable conclusion that once an Envelope amount is revealed, it is no longer 50/50 that the other Envelope is double. It's either 100% or 0%, we just don't know which. That's the Conclusion No One wants to accept, but it's the way it is.

PairTheBoard

PairTheBoard
08-30-2005, 01:40 AM
[ QUOTE ]
Yeah, I've seen that variation before. But I don't think it actually makes that big a difference, as I recall. The key is that unless the amount in your envelope is precisely 1, you're always in a 3x vs. 1/3 x situation and you just have to figure out the weighting. So I think it's all very similar to the more traditional example but with a slight extra layer of mathematical misdirection. Again, I'm pretty busy today, so I only have a small amount of time to waste on 2+2 and didn't read the whole thing. It looks like it's talking about the infinite value of the game and the concept of utility. When I worked on that problem before, I found that the value of the game converges quite nicely if you pick a utility function that goes either logarithmically or as the square root of the expected value.

[/ QUOTE ]

There's no need to bring in "utility" concepts. The key difference between the two Versions is that in the Side Version there is a Prior Distribution for choosing the Envelope amounts. Because of that Prior Probability Distribution you can actually calculate the conditional expected values. In the Original Problem, The Envelope Amounts are Fixed. If the game is repeated it will repeat with the same Fixed Envelope amounts.

Furthermore, in the Side Version, the prior distribution for Envelope amounts is assumed known. So switching strategies can be based on the known distribution for the Envelope amounts.

In the Original Problem, even though the Envelope amounts are Fixed, we could still look at them as coming from a prior distribution, namely the Delta, fixed point distribution. If we assume This Delta distribution is Known in the same way we assume the 3^n distribution is known in the Side Version, then a Switching Strategy could also be calculated for the Original Problem.

If it's known that the Fixed Envelope amounts are N and 2N, then when opening the envelope, if you see N, switch; if you see 2N, don't switch.

PairTheBoard

PairTheBoard
08-30-2005, 01:44 AM
[ QUOTE ]
It doesn't make any difference whether you swap or not.

Consider doing the experiment a large number of times with N and 2N in the envelopes with two groups: stickers and flippers.

Half the stickers will originally choose N, the other half 2N so the EV is 3/2 N.

Half the flippers will originally choose N and flip to 2N, the other half will originally choose 2N and flip to N, so the EV is 3/2 N.

All intermediate strategies also have EV of 3/2 N.

[/ QUOTE ]

This is the correct way to look at it. Define the repeated experiment. If you repeat the experiment in the OP you will always be using the Fixed Envelope amounts, N and 2N.

PairTheBoard

Prevaricator
08-30-2005, 02:15 AM
[ QUOTE ]
[ QUOTE ]
Read King Yao's Weighing the Odds in Hold'em Poker for the chapter entitled "The Monty Hall Problem". Same thing.

[/ QUOTE ]

I don't think it's the same thing. It is always correct to rechoose in the "Monty Hall problem" as you increase your odds of success from 1/3 to 1/2. The OP has somehow applied this reasoning to a dilemma where I don't think you can do that. Your odds stay 1/2 in the OP. I don't have the math knowledge to show why the OP is wrong, but I think it is. Your average return on rechoosing envelopes should be zero. Maybe not opening the first envelope is the problem, I don't know.

[/ QUOTE ]

In the Monty Hall problem you increase your odds to 2/3, not 1/2.

Piers
08-30-2005, 06:44 AM
[ QUOTE ]
OP: You need to calculate the expected gain due to swapping.

[/ QUOTE ]

[ QUOTE ]
OP: That is the paradox. A simple mathematical puzzle. The question is: What is wrong? Where does the fallacy lie,

[/ QUOTE ]

[ QUOTE ]
PtB: Any talk of a prior distribution for the Envelope amounts is going beyond the bounds of the problem as stated by the OP.


[/ QUOTE ]

To talk about probability and expectation there must be a probability space. Which means there must be a probability distribution.

To answer the question in a real world situation you will need to create a probability distribution based on what you know about game show budgets and odd monetary amounts. Compare cheques for $50.25 and $201 etc… A real world distribution is likely very messy.

[ QUOTE ]
It doesn't matter what the money is, the outcome is the same if you follow the same line of reasoning. Suppose you opened the envelope and found N pounds in the envelope, then you would calculate your expected gain from swapping to be 0.5(N/2)+0.5(2N)-N = N/4, and as this is greater than zero, you would swap.


[/ QUOTE ]

This is the place where the OP uses a uniform distribution on the positive numbers. He uses it to create his paradox, and my post says why it does not work.

gumpzilla
08-30-2005, 09:23 AM
[ QUOTE ]

There's no need to bring in "utility" concepts.

[/ QUOTE ]

If you want the game to have a finite value, you do.

PairTheBoard
08-30-2005, 10:24 AM
[ QUOTE ]
Piers --

To talk about probability and expectation there must be a probability space. Which means there must be a probability distribution.

To answer the question in a real world situation you will need to create a probability distribution based on what you know about game show budgets and odd monetary amounts. Compare cheques for $50.25 and $201 etc… A real world distribution is likely very messy.


[/ QUOTE ]

If the Envelope amounts are Fixed you Do have a probability Space. It's the one defined by the repeated experiment of offering the Two envelopes with the same fixed amounts to various members of the audience.

Or, if you insist on a prior distribution for the Fixed Envelope amounts N, 2N, just make it the Delta Distribution, i.e. P(lower amount is N) = 1.



[ QUOTE ]
OP --
It doesn't matter what the money is, the outcome is the same if you follow the same line of reasoning. Suppose you opened the envelope and found N pounds in the envelope, then you would calculate your expected gain from swapping to be 0.5(N/2)+0.5(2N)-N = N/4, and as this is greater than zero, you would swap.
Piers --

This is the place where the OP uses a uniform distribution on the positive numbers. He uses it to create his paradox, and my post says why it does not work.

[/ QUOTE ]

That comment is not in the Setup of the situation. Furthermore, all it's saying is that the same reasoning would hold if a different amount had been chosen for the Envelope. It doesn't imply anything about how the amount that is actually in the envelope was chosen.

PairTheBoard

PairTheBoard
08-30-2005, 10:26 AM
[ QUOTE ]
[ QUOTE ]

There's no need to bring in "utility" concepts.

[/ QUOTE ]

If you want the game to have a finite value, you do.

[/ QUOTE ]

Or you could make it a finite value by changing it some other way. The game as is has infinite expected value.

PairTheBoard

Piers
08-30-2005, 11:00 AM
[ QUOTE ]
That comment is not in the Setup of the situation. Furthermore, all it's saying is that the same reasoning would hold if a different amount had been chosen for the Envelope. It doesn't imply anything about how the amount that is actually in the envelope was chosen.

[/ QUOTE ]

You are correct in that he did not state in the set-up of the situation how the amounts in the envelope were decided. However, the last line says:

[ QUOTE ]
That is the paradox. A simple mathematical puzzle. The question is: What is wrong? Where does the fallacy lie, and what is the problem?

[/ QUOTE ]

He started with a set-up that did not mention anything about uniform distributions, made some arguments that ended up with the conclusion that you could keep swapping and increase your EV indefinitely. He then asked what was wrong with his logic.

I claimed that his error was that he implicitly used a uniform distribution here:

[ QUOTE ]
It doesn't matter what the money is, the outcome is the same if you follow the same line of reasoning. Suppose you opened the envelope and found N pounds in the envelope, then you would calculate your expected gain from swapping to be 0.5(N/2)+0.5(2N)-N = N/4, and as this is greater than zero, you would swap.

[/ QUOTE ]

This is a point where his argument is wrong. The OP did not ask what the game show contestant should do, but what was wrong with his argument.

However, if you want to consider the best way for a contestant to make a decision, you need to somehow model how the amounts to be put in the envelopes were decided. Number theory, game show budgets and the psychology of game show hosts all play a part.

When I say the probability distribution is very messy I am thinking in particular of this.

[ QUOTE ]
You open the envelope and see a cheque for $34,567.89. Do you switch or not?

[/ QUOTE ]

PairTheBoard
08-30-2005, 01:11 PM
[ QUOTE ]
Piers --


I claimed that his error was that he implicitly used a uniform distribution here


Quote:
OP --

It doesn't matter what the money is, the outcome is the same if you follow the same line of reasoning. Suppose you opened the envelope and found N pounds in the envelope, then you would calculate your expected gain from swapping to be 0.5(N/2)+0.5(2N)-N = N/4, and as this is greater than zero, you would swap.


[/ QUOTE ]

Well, we both agree that he's wrong in saying the two probabilities are 50%. It's debatable what he had in mind when he came up with the 50-50. If the problem's setup included the "amounts randomly chosen" phrase then I would agree with you that he came up with it there, assuming the impossible uniform distribution. In fact, when "Ask Marilyn" presented this, that's exactly what I argued, and that Marilyn had missed the mark with her answer.

But when the "randomly chosen" language is removed from the problem and you Assume the Envelope amounts are Fixed or come from a Delta Distribution, you still have to explain why the 50-50 is wrong. You can say it's not 50-50 because it doesn't come from an impossible uniform distribution. But with the Delta Distribution as Given, if it's not 50-50 what are the conditional probabilities? You are forced to say either 100% or 0%, you just don't know which. Those are the probabilities that fit the repeatable experiment of offering the same Envelopes to other fresh contestants, and That is the final conclusion that no one wants to face.

PairTheBoard

08-30-2005, 01:22 PM
[ QUOTE ]
if it's not 50-50 what are the conditional probabilities? You are forced to say either 100% or 0%, you just don't know which.

[/ QUOTE ]

That doesn't make sense to me. Why can't you open a $30k envelope and think "Gee, that's a lot, and I rarely see anyone win more than that, and given that the waiting lounge has furniture from the 1970s, I think the budget is too tight for $60k" and thus make an assessment that the odds are, say, 70/30 that you have the big envelope? We do these types of assessments every day at the poker table.
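And once you allow a read like that, the decision follows from a one-line EV calculation (a sketch with the numbers from this post; the 70/30 is just the hypothetical read):

seen = 30000
p_big = 0.7                                              # the read: 70% that you already hold the larger envelope
ev_of_switching = p_big * (seen / 2) + (1 - p_big) * (2 * seen)
print(ev_of_switching - seen)                            # -1500.0: with this read, switching costs $1,500 on average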

PairTheBoard
08-30-2005, 01:46 PM
[ QUOTE ]
[ QUOTE ]
if it's not 50-50 what are the conditional probabilities? You are forced to say either 100% or 0%, you just don't know which.

[/ QUOTE ]

That doesn't make sense to me. Why can't you open a $30k envelope and think "Gee, that's a lot, and I rarely see anyone win more than that, and given that the waiting lounge has furniture from the 1970s, I think the budget is too tight for $60k" and thus make an assessment that the odds are, say, 70/30 that you have the big envelope? We do these types of assessments every day at the poker table.

[/ QUOTE ]

Like I said before, that kind of speculation is outside the bounds of the problem. You can speculate endlessly on factors you know absolutely nothing about. The amounts are chosen. They are fixed. You have no information on how they were chosen. The problem is not about guessing an a priori distribution for how the envelope amounts were picked. If you must have an a priori distribution, assume a Delta Distribution. Work with that rather than trying to evade it.

You let Sklansky force you to assume all sorts of things. Force yourself to assume a Delta Distribution for this problem. You just don't know what the Delta Distribution is. When you open the envelope you now know something about the Delta Distribution. But you still don't know exactly what the Delta Distribution is.

PairTheBoard

08-30-2005, 01:53 PM
[ QUOTE ]
Like I said before, that kind of speculation is outside the bounds of the problem.

[/ QUOTE ]

And that's why the problem is a paradox, because without those bounds, it actually would be +EV to always switch. But in reality, it's never really 50-50. Once you see the contents of the envelope, you know that an envelope with half the money is certainly plausible, but you don't know if double the contents is plausible.

punter11235
08-30-2005, 05:44 PM
[ QUOTE ]
And that's why the problem is a paradox, because without those bounds, it actually would be +EV to always switch. But in reality, it's never really 50-50. Once you see the contents of the envelope, you know that an envelope with half the money is certainly plausible, but you don't know if double the contents is plausible.

[/ QUOTE ]

This all makes no sense.

[ QUOTE ]
because without those bounds, it actually would be +EV to always switch

[/ QUOTE ]

And this particular statement is pure nonsense. How can you consider any problem without its bounds?