
View Full Version : the envelope paradox


hazeelnut
11-05-2003, 03:09 PM
Well, this isn't exactly poker related, but I thought I might get an answer to this paradox that has been bugging me for the last couple of days. So, here it goes:

Suppose you are on a TV show and get to choose between two envelopes, envelope A and envelope B. These envelopes contain cash, and the only thing you know about the amounts is that one envelope contains twice as much as the other. Say you pick envelope A and find $400 in it. Now the host gives you the option to take envelope B instead. You reason that there is a 50% chance envelope B contains $200 and a 50% chance it contains $800. So, since you have a 50% chance of gaining $400 and a 50% chance of losing $200 by making the switch, you decide to switch envelopes. But by this reasoning, regardless of the amount in envelope A, the correct decision would have been to switch.

How could this be?

hazeel

Fianchetto
11-05-2003, 03:28 PM
Interesting problem, the way I would look at it is this:
let's say you were to do this 200 times
-The first 100 times you don't switch, keep the $400.
-The second 100 times you switch.

Which would be more profitable?

Well, in the first scenario you would make $40,000 ($400*100). So your expectation is $400.

Now consider the second set of 100 trials, i.e. you switch every time.
50 times you win only $200, and 50 times you win $800.
So... 50*200 + 50*800 = $50,000.
You would actually make more money in the long run by switching, and your expectation on any given try is $500 ($50,000/100), which is more.

So switch!
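The arithmetic in this post can be written out as a quick sketch. Note that later replies in the thread dispute the premise assumed here, namely that after seeing $400 the other envelope is equally likely to hold $200 or $800:

```python
# Fianchetto's back-of-envelope comparison over 200 imagined trials,
# under the (disputed) premise of a 50/50 split between $200 and $800.
keep_total = 100 * 400              # never switch: keep the $400 each time
switch_total = 50 * 200 + 50 * 800  # switch: half the time $200, half $800

print(keep_total)          # 40000
print(switch_total)        # 50000
print(switch_total / 100)  # per-trial expectation of 500.0 under that premise
```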

irchans
11-05-2003, 03:35 PM
The envelope question has been discussed a few times at 2+2. For example, Take a look at this. (http://forumserver.twoplustwo.com/showthreaded.php?Cat=&Number=366454&page=&view=&sb =5&o=)

hazeelnut
11-05-2003, 03:43 PM
Well, that answer is what makes it so strange and paradoxical. Since you should always switch regardless of the sum in the envelope you pick first, why would you even open the envelope before switching, and not just pick the other envelope from the beginning?

Wake up CALL
11-05-2003, 04:03 PM
[ QUOTE ]
Well, that answer is what makes it so strange and paradoxical. Since you should always switch regardless of the sum in the envelope you pick first, why would you even open the envelope before switching, and not just pick the other envelope from the beginning?


[/ QUOTE ]

You have restated the conditions of the problem. You should reread the thread to see why (and when to switch).

hazeelnut
11-05-2003, 04:50 PM
"You have restated the conditions of the problem. You should reread the thread to see why (and when to switch)."

I have reread the thread and still don't know when or why to switch; please tell me.

Wake up CALL
11-05-2003, 04:54 PM
If you have three choices, $200, $400, $800, you would not switch when you got the $800, so you had better open your envelope. If you got the $200, you would certainly switch, since both of the other choices are higher. If you got the $400, you would switch every time, due to the reasons outlined in the post above by TiltedKilt.

CrisBrown
11-05-2003, 04:59 PM
hazeelnut,

This one is easy. :)

Having chosen an envelope initially, you have either X or 2X in hand, but you don't know which. Switching gives you a .5 probability of -1X (you have 2X now and will have 1X if you switch), and a .5 probability of +1X (you have the 1X now and will double to 2X).

Your switch has zero expectation (-.5X + .5X), because you don't know the value of X. Your expectation for the game is 1.5X, regardless of which envelope you choose first, and regardless of whether you switch.

Cris

CrisBrown
11-05-2003, 05:05 PM
P.S. The reason you have to take the numbers out and replace them with X is that otherwise you change the value of X *for one half of the equation only*. And that's where the "paradox" comes from.
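CrisBrown's point can be checked empirically. A minimal sketch, assuming for concreteness sealed amounts of $400 and $800: with the amounts fixed before you choose, switching and not switching have the same expectation of 1.5X.

```python
import random

def trial(switch, amounts=(400, 800)):
    """One round: shuffle the two sealed envelopes, pick the first,
    optionally swap to the other."""
    a, b = random.sample(amounts, 2)
    return b if switch else a

n = 200_000
ev_keep = sum(trial(False) for _ in range(n)) / n
ev_switch = sum(trial(True) for _ in range(n)) / n
# both averages hover around 1.5 * 400 = 600
print(round(ev_keep), round(ev_switch))
```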

hazeelnut
11-05-2003, 05:16 PM
"If you have three choices $200, $400, $800, you would not switch when you got the $800 so you had better open your envelope. If you got the $200 you would certainly switch since both of the other choices are higher. If you got the $400 you would switch every time due to the reasons outlined in the post above by TiltedKilt."

I think you need to reread the original question. You don't have three choices.

The problem is:

It looks mathematically correct to switch all the time, but clearly this can't be true, because if you switch every time the switch doesn't mean anything.

DrSavage
11-05-2003, 05:41 PM
Good problem.
Basically, the paradox comes from the nonexistence of a uniform distribution over the infinite positive real numbers. A good article on the subject can be found here:
http://www.u.arizona.edu/~chalmers/papers/envelope.html

hazeelnut
11-05-2003, 06:48 PM
Good link. The solution was a bit more complicated than I had hoped for, but I think I get it now. Thanks.

DougBrennan
11-05-2003, 07:42 PM
How about this for a layman's explanation.

You open the envelope. It has a given amount. You switch envelopes. If you win the switch, you double your money. In the example you gave, you GAIN $400. If you lose the switch, you DROP only $200.

I think the paradox comes from the fact that you gain more when you win than you lose when you drop down.

It also helps if you don't think about it too long.

I already have a headache. :P

CrisBrown
11-05-2003, 11:08 PM
Hi hazeelnut,

<<It looks mathematically correct to switch all the time, but clearly this can't be true, because if you switch every time the switch doesn't mean anything.>>

It only looks mathematically correct because the wording of the problem is misleading. This isn't a mathematical paradox. It's a linguistic paradox, based upon equivocation.

Consider the problem this way:

There are two envelopes: A and B. One holds X. The other holds 2X. You don't know which is which. Even after you select one, you're not sure whether that envelope has X or 2X, so you're allowed one chance to switch for the other. Should you?

The answer, in this format, is clearly that it doesn't matter. If your envelope has X (a 50% probability), then if you switch you'll have 2X, for a net gain of X. If you already have the envelope with 2X (a 50% probability), then if you switch you'll have X, for a net loss of X. Thus, by switching you have a 50% chance to gain X, and a 50% chance to lose X.

The "paradox" only arises when you pick a value for the contents of the first selection, and then try to apply the same mathematical calculation I just gave, using TWO DIFFERENT VALUES FOR X.

Well, gee, if you change the value of X halfway through any mathematical calculation, you'll get funky outcomes. That simply means that if you carefully MISstate a math problem, it can trip up the casual reader.

Cris

George Rice
11-06-2003, 12:33 AM
This problem reminds me of a riddle I heard many years ago.

Three guys go to a hotel looking for a room. The man at the desk tells them the room is $30. Each man chips in $10 to make up the fee. After they go up to their room, the man at the desk realizes that he overcharged them; he should have charged $25. So he gives the bellhop $5 and tells him to bring the money up to the men in the room. On the way up, the bellhop realizes that $5 can't be split evenly three ways, so he decides to give each man $1 and keep $2 for himself.

Each man has spent $9 for a total of $27. The bellhop has $2. So $27 + $2 = $29. But they originally laid out $30. What happened to the other dollar?

The person who told me this couldn't solve it and claimed his calculus teacher told him it was a complex problem (lol). I was just a high school kid and figured it out immediately but was unable to convince anyone of the solution. Over the years only one person I told this to was able to give me the correct answer. But I suspect most of you will see the solution immediately. So who wants to be first?

daryn
11-06-2003, 01:18 AM
hmm..

the hotel manager has $25 in his pocket (or register or whatever)... each man paid $9, so $9 x 3 = $27, and if the hotel manager has $25 of it, the bellhop has the other $2. Tricky wording, I guess, but it seems obvious?
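daryn's accounting can be written out explicitly; the riddle's "$27 + $2" adds the bellhop's $2 to a total that already contains it:

```python
paid = 3 * 9      # the men are out $27 in the end
hotel = 25        # what the desk actually kept
bellhop = 2       # what the bellhop pocketed
refund = 3 * 1    # $1 back to each man

# The $27 splits into $25 (hotel) + $2 (bellhop); adding the bellhop's
# $2 to the $27 double-counts it.  The original $30 is fully accounted for:
assert paid == hotel + bellhop
assert paid + refund == 30
```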

CrisBrown
11-06-2003, 02:15 AM
Hi George,

Nothing. The hotel has $25. The bellhop has $2. Each of the three men has $1. *shrugs*

Cris

DrSavage
11-06-2003, 02:28 AM
Here's a way to understand this problem which i believe to be simple :
Before you choose if you want to switch envelopes or not, the person who owns the money decides how much money to put in each envelope. Let's say he chooses an amount X , after which he puts 1/3 of X in one envelope and 2/3 of X in second one. Let's assume he chooses X randomly. Let's say the probability of choosing number X is P(X). You can easily see that P(X) can't be the same for all X, because then the sum(P(X)) where X = 0 .. infinity would be infinite too, but it must be 1 according to basic law of probability. Therefore P(X) must be different for some X. Because of that , when the player opens the envelope and sees the amount Y , the probability of him having chosen the smallest one is not 50%, but some number which depends on P(3*Y) and P(3*Y/2). You can't really evaluate this probability since you don't know anything about probability distribution which was used to randomly generate X. And since that probability is not 50% the paradox falls apart. As an example , let's say that the total amount money owner would put in the envelope is evenly distributed between 100$ and 1000$. If you open the envelope and see the amount bigger than 333.3$ it means that the probability of you having opened the small one is exactly 0.
And if the number is between 66$ and 333$ it is in fact correct to switch. If you are still confused as why you shouldn't have taken the 2nd envelope in the first place, that is because you gained information having opened the first one. Which is only going to help you in case you know the original distribution. Have you not known that he chose the amount between 100$ and 1000$ it doesn't matter if you switch or not since you can't have any idea of what the real probability of you having chosen the small envelope is.
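DrSavage's $100-$1000 example is easy to simulate. A sketch (the threshold 1000/3 ≈ $333.33 is the largest possible value of the smaller envelope): switching whenever the opened amount could still be the small one beats never switching.

```python
import random

def play(strategy, trials=200_000):
    total = 0.0
    for _ in range(trials):
        t = random.uniform(100, 1000)        # owner's randomly chosen total
        small, large = t / 3, 2 * t / 3      # the two envelope amounts
        seen, other = random.choice([(small, large), (large, small)])
        total += other if strategy(seen) else seen
    return total / trials

never = play(lambda seen: False)
smart = play(lambda seen: seen <= 1000 / 3)  # amount might still be the small one
print(round(never), round(smart))            # roughly 275 vs 344
```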

bigpooch
11-06-2003, 04:27 AM
Here is a famous, similar problem that I'll restate.

Suppose someone has three big envelopes, two of which have fake money inside, while the third has ten thousand bucks (just in case you want to play the WSOP!). He lets you pick one of the three envelopes, and you do so, but he doesn't let you open it just yet! Now he tears open one of the two envelopes you didn't pick and shows you that it has a bunch of play money in it. He then says you can either have the envelope that you picked or the unopened envelope that you haven't picked; in other words, you can either stick with your pick or switch. What do you do?

The correct answer I'll post later on, but the answer is quite surprising!

BruceZ
11-06-2003, 04:35 AM
Please let's not start another Monty Hall thread on this forum. Do a search in the archives, and you can read far more than you ever wanted to know about this.

hazeelnut
11-06-2003, 07:43 AM
"Let's assume he chooses X randomly. Let's say the probability of choosing number X is P(X). You can easily see that P(X) can't be the same for all X, because then the sum(P(X)) where X = 0 .. infinity would be infinite too, but it must be 1 according to basic law of probability. Therefore P(X) must be different for some X. "

I don't mean to be picky about this, but P(X) can in fact be the same for all X (it just happens not to be in this case). Suppose you spin a wheel that can stop on any real number between 0 and 1, so there is an infinite set of numbers the wheel can stop on. The probability that it stops on any particular number is always 0, but the probability that it stops on SOME number is 1. Hence
P(X) = 0 for all X, but
the total probability is still 1.

Good explanation otherwise. I understand now and can think about other things again, which is always nice.

Hazeel

hazeelnut
11-06-2003, 07:55 AM
"There are two envelopes: A and B. One holds X. The other holds 2X. You don't know which is which. Even after you select one, you're not sure whether that envelope has X or 2X, so you're allowed one chance to switch for the other. Should you?

The answer, in this format, is clearly that it doesn't matter. If your envelope has X (a 50% probability), then if you switch you'll have 2X, for a net gain of X. If you already have the envelope with 2X (a 50% probability), then if you switch you'll have X, for a net loss of X. Thus, by switching you have a 50% chance to gain X, and a 50% chance to lose X."

Hi there Cris,

This is the paradox:

1. Common sense says it clearly doesn't matter whether you switch (which you stated in a more mathematical way).

2. The at-first-glance flawless mathematical calculation says you should always switch.

Since no one can argue with the common-sense part (1) of the paradox, I wanted to find out what was wrong with the mathematical part (2) of it, not just see the common-sense part stated more mathematically.

Anyway, I get it now; DrSavage had both a good link and a good explanation that you can read.

Thanks anyway.

Hazeel

irchans
11-06-2003, 09:29 AM
I remember this question from middle school. What a great question for teaching kids to reason!

rivaridge
11-06-2003, 10:13 AM
I don't know much about this infinite probability garbage, but it seems pretty simple to me. By switching I am being offered a 100% return on my money if I pick right and a negative 50% return if I pick wrong. If I add these up (50% times 100% plus 50% times negative 50%), my "EV" is plus 25%, so I switch.
Maybe the confusion lies with the percentages, i.e. when you double your money you gain 100%, but when you lose half your cash you lose only 50%.

irchans
11-06-2003, 10:25 AM
Suppose I switch with probability Exp[-y] where y is the amount in the envelope I pick and y>0. Then what is my expectation?


If the envelopes contain X and 2*X, then there is a 50% chance that I will pick envelope X and my expectation is

expect_pick_X
= Exp[-X]*2*X + (1 - Exp[-X])*X
= (1 + Exp[-X])*X.

If instead, I get the envelope with 2*X in it, my expectation is

expect_pick_2X
= Exp[-2*X]*X + (1 - Exp[-2*X])*2*X
= (2 - Exp[-2*X])*X

The total expectation for the exponential switching scheme is now

exp_expon = .5*expect_pick_X + .5*expect_pick_2X
= .5*((1 + Exp[-X])*X) + .5*(2 - Exp[-2*X])*X
= (X * (3 - exp[-2*X] + exp[-X]) )/2.

The expectation for not switching is

exp_no_switch =3/2*X.

To see which method is better subtract.

exp_expon - exp_no_switch
= (X * (3 - exp[-2*X] + exp[-X]) )/2 - 3/2*X
= (X * ( exp[-X] - exp[-2*X] ) )/2
= ((Exp[X] - 1) * X)/(2 * Exp[2*X]).

That expression is greater than zero when X>0, so the exponential switching method is always better than not switching.
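irchans's algebra can be sanity-checked numerically; a small sketch evaluating the closed-form difference and confirming it with a crude Monte Carlo at X = 1 (envelopes holding 1 and 2):

```python
import math
import random

def advantage(x):
    # exp_expon - exp_no_switch = ((Exp[X] - 1) * X) / (2 * Exp[2*X])
    return ((math.exp(x) - 1) * x) / (2 * math.exp(2 * x))

# positive for every X > 0
assert all(advantage(x) > 0 for x in (0.01, 0.5, 1.0, 5.0, 50.0))

# Monte Carlo check at X = 1: switch with probability e^(-y),
# where y is the amount found in the first envelope.
random.seed(7)
n, total = 400_000, 0.0
for _ in range(n):
    y = random.choice((1.0, 2.0))        # envelope picked at random
    if random.random() < math.exp(-y):   # switch with prob Exp[-y]
        y = 3.0 - y                      # take the other envelope
    total += y
# the average should be about 1.5 + advantage(1) ≈ 1.616
```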

ccwhoelse?
11-06-2003, 12:42 PM
[ QUOTE ]
The "paradox" only arises when you pick a value for the contents of the first selection, and then try to apply the same mathematical calculation I just gave, using TWO DIFFERENT VALUES FOR X.


[/ QUOTE ]

i dunno about that

[ QUOTE ]
There are two envelopes: A and B. One holds X. The other holds 2X. You don't know which is which. Even after you select one, you're not sure whether that envelope has X or 2X, so you're allowed one chance to switch for the other. Should you?

[/ QUOTE ]

Instead of saying A holds X and B holds 2X, you can say: the envelope you open holds X; then the other envelope has either 2X or X/2. Then you have the same problem as before with dollar values.

Or it could be that I know nothing about math, so maybe you should assign each envelope a variable before actually opening one up, in which case you are right.

But then again, it seems you are distinguishing between envelopes when in reality you really can't.

To say one envelope has X and the other 2X is not the same as to say one has X and the other might have 2X or X/2.

Bozeman
11-06-2003, 03:09 PM
Not if X can be boundlessly large.

BillsChips
11-06-2003, 04:25 PM
(My first post)

It seems to me that this is a lot less complicated than most are making it.

Your envelope contains $400. If you switch, you're risking $200 to win an extra $400. So you're getting 2-to-1 odds on an even-money (50/50) proposition. I think you'd have to take that every time.

Moonsugar
11-06-2003, 05:02 PM
What is bigger?

x

or

(.5*2x)+(.5*1/2x)

Since you make 1/4x more if you switch, you should always switch.

And I don't know why you weren't smart enough to pick up the 'right' envelope to begin with ;)

DonCaspero
11-06-2003, 06:05 PM
Knowing the odds for the pick beforehand you should be smart and pick up the "wrong" envelope the first time, no?

johnnyhearts
11-06-2003, 06:30 PM
I did find the work at the previously mentioned web site interesting, but it's unbelievable that those mathematical contortions only come up with the concept that there's some unknown bound on how high the $ amount might go. Therefore, if the amount in the first envelope is high enough (and this is just your judgement call, I suppose), you shouldn't trade, because the amount is so huge it's unlikely there's double that amount in the other envelope!!! The math dude then goes on to admit that if it would, in fact, be feasible for the amount to go to infinity, then the paradox is back!!! Amazing! This doesn't seem like a solution to me, but more like a cop-out! I'm curious what Mr. Malmuth has to say about this problem, since he's a mathematician.

As an aside, this post reminded me of a coffee-house hustle. The hustler chirps to the goof: "I'm going to spread 3 cards face down, one of which is the ace of spades. You get a chance to draw for the ace, and then, without looking at the card you drew, make a decision as to whether you want to redraw after I remove one of the remaining cards, which I promise isn't the ace. Now, if you don't redraw I'll lay 3 to your 2, but if you redraw, I'll take 2 to my 1!"

Happy Hustling Gentlemen

Johnny Hearts

daryn
11-06-2003, 06:58 PM
there is no paradox. you make the same either way... it should be clear, because if you say that whichever one you pick you should switch to the other, then obviously the act of switching doesn't matter.

i am using confusing language here... forget it... just trust that there is no paradox

George Rice
11-06-2003, 07:34 PM
Of course. Easy one for 2+2'ers. Try that one on your average guy and you'd be surprised how many fall for the 27 + 2 = 29 instead of 25 + 2 = 27. Amazing.

budman
11-06-2003, 08:05 PM
Then try to get a tell from the host.

mosta
11-07-2003, 02:38 PM
Wow! All these posts and only one poster (CrisBrown) who has any idea what he's talking about! And this is grade-school level math! What this says about the requisites for poker play...

Reread CrisBrown's posts over and over again, very slowly, until you understand. DrSavage, please... never mind about the math. You go and think about what "inexistance" [!] is supposed to mean.

The "paradox" [actually, "confusion" is better] is very simple. The problem says there are two envelopes. They are sealed. Each one has one fixed amount of money, A and B, such that B = 2 * A. When you say that you open one envelope with X in it and that the other envelope must then have .5X or 2X in it, that is FALSE. There are not THREE possible amounts. You have changed the terms of the problem. There are actually only TWO possible amounts. What you must say is: either I have $A and the other envelope has $B, which is $2A, or I have $B and the other envelope has $A, which is $.5B. And each scenario is equally likely.
NOTE: Again, the amounts are not (.5X, X, and 2X); they are (A and B) = (A and 2A) = (B and .5B). I leave it as an exercise to the reader to do the conditional probability.

This has NOTHING to do with the properties of the continuum. If you still have difficulty comprehending the answer, perform the following exercise: obtain two envelopes, a fifty, and a hundred. SEAL each bill in one envelope. Now NOTE that the amounts 25 and 200 will never come into play. Now play the game with two friends, one who switches and one who doesn't, opening the envelopes and then repeating with new envelopes. Do it a thousand times and record the results. Actually, no, play the game with me. I will play you for any amount of money for as long as I live. Pick the amounts A and 2A, and for the mere price of 1.55 A I will let you switch envelopes with me until you take all my money. I'll even pay for the envelopes!

The other question, where there actually are three envelopes, is more like a paradox, but actually it's only counterintuitive. The math of course is elementary. The intuitive way to understand it is to note that the switching procedure (after one loser has been removed) ALWAYS turns a winner into a loser and a loser into a winner. Switching after a loser is removed is the inverse or complement of the random pick. Therefore the expectation of switching after one loser is removed is the opposite of the odds of the original choice (i.e. 1 in 3 becomes 2 in 3). (Note that the fine point is that removing one loser introduces new information into the situation, because the envelope (or game-show door) that is removed is not picked at random. If an envelope (or door) were to be removed by a meteor, and it happened to be one of the losers, switching would have a zero expectation.)
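The three-envelope conclusion (switching wins 2/3 of the time when the host knowingly removes a loser) is easy to verify with a short simulation; a sketch:

```python
import random

def win_rate(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)   # which envelope has the real money
        pick = random.randrange(3)    # contestant's blind pick
        # host opens an envelope that is neither the pick nor the prize
        shown = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != shown)
        wins += (pick == prize)
    return wins / trials

print(win_rate(False), win_rate(True))   # about 1/3 vs 2/3
```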

DrSavage
11-07-2003, 02:58 PM
You are wrong.

DrSavage
11-07-2003, 03:51 PM
I want to give a bit more detailed answer to this.
[ QUOTE ]
This has NOTHING to do with the properties of the
continuum. If you still have difficulty comprehending
the answer, perform the following exercise: obtain
two envelops, a fifty and a hundred. SEAL each bill
in one envelop. Now NOTE that the amounts 25 and
200 will never come into play. Now play the game
with two friends, one who switches, one who doesn't,
opening the envelops and then repeating with new
envelops. Do it a thousand times and record the
results. Actually, no, play the game with me.
I will play you for any amount of money for as
long as I live. Pick the amounts A and 2A, and
for the mere price of 1.55 A I will let you
switch envelops with me until you take all
my money--I'll even pay for the envelops!


[/ QUOTE ]
I'm not arguing with that. But the problem is that this experiment doesn't really demonstrate what the paradox is all about, in my opinion, since you took out the randomness of the money in the envelopes. If it's always the same amount, it clearly doesn't matter which one you choose or whether you look at the amount. This is because when you look at the envelope with the total amount being constant, the probability of the second envelope being bigger is not 50%; it is either 100% or 0%, you just don't know which.
If you missed the point of my post, I suggest you run the following simulation:
1) Choose a random amount x between 0 and 1.
2) Assign two variables: one being 1/3 of x, and the second 2/3 of x.
3) Randomly choose one of them.
Keep two running totals:
4) Add the chosen amount to the first total.
5) Add to the second total the same amount if it's more than 0.333; otherwise add the amount you didn't choose (as in: switch envelopes whenever you think the amount is too small).
6) Repeat 1,000,000 times.

Look at the results.

I can save you the trouble of running this simulation; here are my numbers:
never switch: 250202
switch if less than 0.333: 312746

Note that the second total is 25% bigger than the first one, which is the same EV gain as in the example given.
Had there existed a uniform distribution over all positive numbers, it would always be correct to switch. The main reason there is no paradox, as I stick to my explanation, is that such a distribution is impossible.
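DrSavage's six numbered steps translate directly into code; a sketch that reproduces his reported totals (roughly 250,000 vs 312,500 over a million trials):

```python
import random

N = 1_000_000
never_total = switch_total = 0.0
for _ in range(N):
    x = random.random()                # 1) random total between 0 and 1
    small, large = x / 3, 2 * x / 3    # 2) the two envelope amounts
    chosen, other = random.choice([(small, large), (large, small)])  # 3)
    never_total += chosen              # 4) first total: never switch
    # 5) second total: keep only if the amount looks big enough
    switch_total += chosen if chosen > 0.333 else other

print(round(never_total), round(switch_total))   # 6) ~250,000 vs ~312,500
```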

BillsChips
11-07-2003, 03:54 PM
One more take on this.

There are only two choices: envelope A or envelope B.
The first choice is random because you have no information about the contents.
Looking at the contents of one envelope doesn't add to what you know about the other envelope.

If you always switch, then the choice is still random because:

Choice = Not(Random(A or B))

If Random(A or B) = A then Choice = B
If Random(A or B) = B then Choice = A

The switch doesn't change the fact that the selection is still random.

So switching shouldn't matter in the long run; the envelope is randomly selected in either case.

Copernicus
11-07-2003, 04:01 PM
He's not wrong; Cris is right. There is no paradox; it is only the wording of the problem that makes it seem like there is.

Forget money. You have two doors, and you will spend two hours alone with whoever is in the room. Behind one is a beautiful woman (or handsome man, ladies) who hasn't had sex for 3 years. Behind the other is Montecore, who hasn't eaten Roy for 3 weeks. You pick one, but just before you open the door you are given a chance to switch. Does it matter what you do? Of course not. The 4 possible scenarios are all equally likely. (Of course, if you have a whip it could come in handy in either situation.)

The confusion in the statement of the problem is, as mosta noted, that there aren't three possibilities of 800, 400 and 200. There is either 400 and 800, or there is 400 and 200. It doesn't magically change after you've made a pick. Once the initial conditions are set, your gain or loss by switching is identical, 400 or 200, depending on the initial conditions.

DrSavage
11-07-2003, 04:07 PM
[ QUOTE ]
He's not wrong, Cris is right, there is no paradox, it is only the wording of the problem that makes it seem like there is.


[/ QUOTE ]
Heh.
Why is it that for some reason everybody seems to believe that I think switching envelopes would increase your EV? I NEVER stated that, and I tried to make it as clear as possible in all my posts that it doesn't matter which one you choose.
All that I've written on the subject is to illustrate WHY there is no paradox. And I additionally illustrated that it becomes profitable to work out the correct strategy for switching if the original amount in the envelopes is random and you know its distribution.
That's all I'm saying.
I'm not ready to quit my job and start switching envelopes for a living :).
Also, some people might prefer Manticore.

Copernicus
11-07-2003, 04:30 PM
I was writing my post without the benefit of your clearer explanation, just based on "You're wrong", doc. Although even in your clearer explanation you introduce elements that aren't in the original problem, such as random distributions of amounts.

Once the envelopes are filled, there is nothing random anymore, and it will never matter whether you switch or not.

Note also that I never made any value judgement re Montecore vs. the human, just the whip. 8-)

DrSavage
11-07-2003, 05:00 PM
[ QUOTE ]
Once the envelopes are filled there is nothing random anymore and it will never matter whether you switch or not.

[/ QUOTE ]
Unless you know the original distribution for the total amount :).
I just like saying that.
I'll make a couple of even clearer examples.
1) If I know that you always put $200 and $400 in the envelopes, my strategy would be to switch when I see $200.
2) If I know that it's 50/50 between ($200 and $400) or ($400 and $800), my strategy would be to always switch if I have $200 or $400. If somebody believes that it's not correct to switch with $400 here, run a simulation.
And for any other distribution I can work out the correct strategy too.
And if I have no clue what those numbers could be, it doesn't matter.
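DrSavage's example 2 is exactly the kind of simulation he suggests running; a sketch (50/50 between the pairs ($200, $400) and ($400, $800), switching whenever $200 or $400 is seen):

```python
import random

def averages(trials=200_000):
    keep = switch = 0.0
    for _ in range(trials):
        pair = random.choice([(200, 400), (400, 800)])   # owner's 50/50 choice
        seen, other = random.choice([pair, (pair[1], pair[0])])
        keep += seen                                     # never switch
        switch += other if seen in (200, 400) else seen  # switch on 200 or 400
    return keep / trials, switch / trials

ev_keep, ev_switch = averages()
print(round(ev_keep), round(ev_switch))   # about 450 vs 550
```

In particular, switching when you see $400 gains on average: half the time you are in the ($200, $400) pair and drop to $200, half the time you are in the ($400, $800) pair and jump to $800, for an expectation of $500.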

Louie Landale
11-07-2003, 05:42 PM
Your coffee-house hustler situation is the classic Monty Hall situation, where Monty will ALWAYS show a goat. Except for the funny payouts. And, of course, except for the fact that he "promises" he won't take away the ace of spades, but does NOT "promise" that it was one of the original 3; and of course he won't show you that the card he removed WASN'T the ace.

- Louie

Louie Landale
11-07-2003, 05:58 PM
Doing EV "calculations", it appears that you should switch. Always switch. But if you gave the two envelopes to two different people, each should then switch for positive EV, yet clearly the total outcome is 0 EV. Since you cannot add up two positives to make zero, the conclusion that each should always switch for positive EV is wrong. Clearly wrong. Continuum be darned.

Now the problem is showing what's wrong with the logic that leads to that wrong conclusion.

"Paradox" means you can "reasonably", or more specifically "apparently reasonably", come to a conclusion that is clearly wrong. In this case, if I give each of two people one of the envelopes, they BOTH cannot have positive EV by switching. And certainly not by switching again. So any "reasonable" argument that says always switch is clearly wrong. Paradox is called "superiority" by those who can tell that the conclusion is wrong; paradox is called "confusion" by those who can shoot down the "reasonable" argument (just like "magic" is called "technology" by those who understand it).

The knowledgeable should NOT say "your conclusion is wrong", which is obvious, but should say "here's what's unreasonable about your argument".

Anyway... here's a similar problem: I put some money in an envelope; you open the envelope and note the amount. Now I say you can keep that amount or you can gamble: half the time I'll double the amount, and half the time I'll halve it. EV says gamble: half the time you gain X, and half the time you lose .5X.

But that's NOT this problem. By the time there are two envelopes, the amounts are fixed, and the amount you are wagering is also fixed. You either have the big envelope or you don't. Now read some of the other responses, which seem to explain the details better than I can. Hopefully someone will say it better than all of us.

- Louie

BillsChips
11-07-2003, 05:59 PM
Your method isn't quite right.

There are only two possible values, not three:
1. The larger amount
2. The smaller amount

It doesn't matter what these values are as long as one value is twice the other, so you can use 400 and 800, or 1 and 2, etc. It's not necessary to vary the amounts, because doing so doesn't add anything to the simulation, but adds unnecessary complexity to the trial. The fact that $400 is mentioned in the problem description is totally irrelevant.

Randomly select 400 or 800 (or whatever values you choose).
Add the selected value to column one.
Add the opposite value to column two.

Now run your simulation and you'll see that it's purely random, approaching an even distribution.

mosta
11-07-2003, 06:00 PM
[ QUOTE ]
I'm not arguing with that. But the problem is this experiment doesn't really demonstrate what the paradox is all about in my opinion, since you took out the randomness of money in the envelope. If it's always the same amount it clearly doesn't matter which one you choose and whether you



[/ QUOTE ]

The random properties of the contents of the envelopes are precisely this: p(A) = .5, p(B) = .5. The fact that you don't know what A and B are is immaterial. They are still CONSTANTS. The envelopes are sealed. Money, to my knowledge, is not subject to quantum fluctuations of denomination. The contents of the second envelope do not collapse from a random state space when you open the first envelope. A and B are two names with a mathematical relation defined between them. A and B are constants. Don't confuse yourself by thinking that when one number is determined, when you see it, the other can then vary. It can't. The money is in the sealed envelope. Do the problem just with names, A and B. Figure out the distribution. THEN use the mathematical relation (B = 2A) to figure the expectation. The relation is not the definition of a random variable.

Compare this problem: I give you an envelope. You open it. I flip a coin. Heads, I put twice that amount in another envelope; tails, I put half, and you don't see what I put in there. Now you should always switch. This scenario is entirely different. The envelope is not sealed until after the first one is opened, and the contents of the second one are randomly determined by the contents of the first.

DrSavage
11-07-2003, 06:30 PM
mosta,
I understand that.
I don't need a simulation to confirm that.
I know that if you are given two envelopes and open one and then you choose the other it will not increase your EV.
That's completely not what I'm arguing for because the answer to that is simple and logical : it doesn't matter if you switch or not.
What I am arguing about are situations similar to the original, where switching may make a difference based on information I get when I open the first envelope; these situations arise when I know something about the total amount of money in those envelopes. I do it to illustrate WHY this problem SEEMS to be a paradox.
Let's say I open the first envelope and see 400$. If that's all i know it doesn't matter if i switch or not.
If I know that you could ONLY select to put 600$ or 1200$ total in these envelopes and it's equally probable that you chose either one of those, then it IS correct to switch.
It is also correct to switch here if i know for a fact that you could put anything in these envelopes, but 600$ is just as likely as 1200$. Because then it's the same problem as the one above, and the same problem as you defined in the second part of your post.
My point, to sum it up, was that if you open an envelope and see 400$ it seems like it's correct to switch. It seems that way because you think that 600$ and 1200$ are equally possible totals. But switching WILL NOT be correct unless you know that for a fact in advance. And all the total amounts cannot be equally likely, because that would violate the law that total probability = 1.
I really can't see what we are arguing about. I don't disagree with you in anything that you say.

M.B.E.
11-07-2003, 07:09 PM
[ QUOTE ]
The envelope question has been discussed a few times at 2+2. For example, Take a look at this. (http://forumserver.twoplustwo.com/showthreaded.php?Cat=&Number=366454&page=&view=&sb =5&o=)

[/ QUOTE ]
Here is a link to the 2002 thread where irchans raised the envelope paradox:

http://forumserver.twoplustwo.com/showflat.php?Number=153855 (http://forumserver.twoplustwo.com/showflat.php?Number=153855#Post153855)

I just skimmed the thread without reading it in detail; most of the discussion was about the second puzzle in the thread (with the thousand one-dollar bills), not the envelope paradox.

M.B.E.
11-07-2003, 07:47 PM
[ QUOTE ]
The "paradox" [actually, "confusion" is better] is
very simple. The problem says there are two envelopes.
They are sealed. Each one has one fixed amount of money,
A and B, such that B = 2 * A. When you say that you
open one envelope with X in it and then the other
envelope must have .5X or 2X in it, that is FALSE.

[/ QUOTE ]
No, Mosta is wrong. This does not resolve the paradox at all. There is no logical problem in saying "let X be the amount of money in the envelope I open". And when you say that, then the amount in the other envelope is either .5X or 2X. There's no getting around that. X is well defined.

DrSavage's resolution of the paradox (http://forumserver.twoplustwo.com/showthreaded.php?Cat=&Number=397383) is correct.

To put it slightly differently, the inconsistency in the wording of the problem (http://forumserver.twoplustwo.com/showthreaded.php?Cat=&Number=396770) is in the words "the only thing you know about the amounts is that one of the envelopes contains twice as much as the other". In fact, you do have other information about the amounts which you can infer from context. The problem states that you are on a TV-game show. Based on that, you can make reasonable inferences about the order of magnitude of the amount in each envelope. You can make more accurate inferences if you take into account whether the game show is national or local, prime-time or daytime, and so forth. If you've watched the game show before you can probably estimate this with even more precision.

So for example suppose you choose an envelope and it contains $1,500,000. Well, the current record for a TV game show winner (http://www.tvgameshows.net/winners.htm) is $2,180,000. It's pretty unlikely that the show you're on right now would be aiming to eclipse that record by giving away $3-million. It's a lot more likely that your envelope with $1.5-million is the fatter one, with the other having $750,000. You could estimate that there's a 90% chance the other envelope has $750,000, and a 10% chance it has $3-million. So the EV of the other envelope is $975,000, and you're much better off, EV-wise, to keep your $1.5-million.

On the other hand, if you're on the same network game show, and your envelope has $150,000, then you might estimate that there's a 50-50 chance as to whether the other envelope has $75,000 or $300,000. Then the EV of the other envelope is $187,500, so you're better off EV-wise to switch.

But if you were on a game show on some little cable channel, and your envelope contains $150,000, then you might think it's very unlikely the other one has twice that much. So you would not switch.

That's a "real-world" resolution of the paradox. DrSavage's more abstract resolution is better. The other attempts at resolving this paradox on this thread are flawed.

mosta
11-07-2003, 08:20 PM
[ QUOTE ]

I really can't see what we are arguing about. I don't disagree with you in anything that you say.



[/ QUOTE ]

we're arguing because I just broke up with my
girlfriend and I'm in a sour mood. does that
make me a bad person?

mosta
11-07-2003, 08:39 PM
[ QUOTE ]

all. There is no logical problem in saying "let X be the amount of money in the envelope I open". And when you say that, then the amount in the other envelope is either .5X or 2X. There's no getting around that. X is well defined.


[/ QUOTE ]

there is "no logical problem" in making a contradictory
or inconsistent statement. but it is still contradictory.

there are two different scenarios you can pose. one
involves two amounts of money, one involves three.
if you start off talking about two amounts of money
(e.g., sealed in envelopes in currency in permanent
ink), and then at some point in your reasoning you
start talking about three amounts of money--if
that happens, you have blended two inconsistent
problems together and everything that follows will
be nonsense.

if you are confident in your reasoning, we should
play. but, to make a concession, if we actually
tried to play, I don't think we'd be able to simulate
the original problem. the original problem is
posed completely abstractly: you run into a guy
on a street, he offers to let you play switch with
envelopes with one being twice the other. one time
deal. your expectation of switching is zero. Now,
by contrast, if I actually tried to play one of
you guys, we would start running into problems
of the finite amount of money that I have. If
the first time I use 5 and 10 dollars, then
you get that information. if we do it a few more
times and it's always 5 or 10, you know what
to do. Okay, so I mix it up. I flip a coin. Heads
I make it 5-10, tails I make it 10-20. well now you
know what to do when you see 5 or 20. Okay so
I add 20-40. same problem. I would have to have
an infinite number of pairs to choose from and
I don't. If nothing else you capture some edge
from the possibility that you see a penny and
know that's the smallest possible denomination,
so you know you can switch. (Though, on the
trading floor we do trade in half cents and
tenths ("mills").)

But the original problem is an abstract
problem in probability. Like with flipping a
coin, you rule out the cases where it lands
cocked or on its edge. and the original
"paradox" is also posed in the abstract,
algebraically. and it is still nonsense.
if you pose the question purely mathematically,
you can forget the part about dollars. instead
say, there are two envelopes each with an integer
in it, A and B, such that B=2A. what is the
expectation of switching? if you still think
it is E<>0, then formal, analytic reasoning is
not your forte, imo (and that's alright!).

...NO WAIT, not just integers, b/c if one
is odd... . Okay: two EVEN integers, A and B.
(is that better?)

jedi
11-07-2003, 08:56 PM
Okay, I'm totally confused now, but here's my take on this:

It IS possible for P(X) to be the same for all X. If the known values are 1-100, P(X) would be 1/100. If the known values are 1-1000, P(X) would be 1/1000. We're solving for P(X) as X -> infinity, so the probability of each X would be very, very close to zero. Oh crud. Did I just get that wrong? I'm just thinking out loud. What was wrong with that?

[ QUOTE ]

Let's assume he chooses X randomly. Let's say the probability of choosing number X is P(X). You can easily see that P(X) can't be the same for all X, because then the sum(P(X)) where X = 0 .. infinity would be infinite too, but it must be 1 according to basic law of probability. Therefore P(X) must be different for some X

[/ QUOTE ]

M.B.E.
11-07-2003, 09:58 PM
I would say that there is in fact "a logical problem" in making contradictory statements. But nothing I said was inconsistent or contradictory. In particular, nothing in my discussion presupposes that there are "three amounts of money".

DrSavage
11-07-2003, 10:17 PM
[ QUOTE ]

It IS possible for P(X) to be the same for all X. If the known values are 1-100, P(X) would be 1/100. If known values are 1-1000, P(X) would be 1/1000. We're solving for P(X) as X -> infinity. The probability of each P(X) would be very, very close to zero. Oh crud. Did I just get that wrong. I'm just thinking out loud. What was wrong with that?

[/ QUOTE ]

Problem is, it would not be close to zero. It would be closer to zero the larger the number you take, but you can only do that with finite ranges.
That's why a probability distribution over infinitely many values can never be uniform, but must instead be something like logarithmic: typically, the larger the number you're trying to generate, the smaller its probability.

johnnyhearts
11-08-2003, 03:20 AM
This hustle isn't a swindle if that's what you're implying. Of course I would show you the A♠ before mixing the 3 cards on the table facedown and, if you insisted, I would show you that whatever card I removed wasn't the A♠. You might consider taking such action?!

Johnny

Daniel Lee
11-11-2003, 02:47 AM
There is no paradox. Neither is there a "correct" answer. Individual preferences determine whether the envelope should be exchanged for the specified gamble. This type of problem has been studied extensively in the economics literature.

Given that you can keep the money in the first envelope having value of 1, you can exchange it for a gamble with equally likely payoffs of 2 and 0.5. That is, a gamble with expected value of 1.25 and standard deviation of 0.75. Whether or not you accept the gamble depends on your utility function, i.e., how much you are willing to pay for gambles. This depends upon "risk preference" and varies across individuals, and apparently even across value levels. For example, you may be willing to pay $10 for a gamble with $5 and $20 payoffs. On the other hand, you may not want to pay $100,000 for a gamble with payoffs of $50,000 and $200,000. By positing numerous such questions it may be possible to discern an individual's utility function. Unfortunately, this really doesn't work very well in practice.

During the 1950's James Tobin and Harry Markowitz looked at (independently) such mean-variance analysis. Markowitz developed the "efficient frontier" model for mean-variance analysis and formulated a normative model for portfolio selection. The investor's solution was a tangency point between the "efficient frontier" and a concave utility function. Almost immediately people began to draw positive implications assuming investors make decisions based on the efficient frontier. This rapidly developed into "Portfolio Theory".

hazeelnut
11-11-2003, 09:25 AM
"Given that you can keep the money in the first envelope having value of 1, you can exchange it for a gamble with equally likely payoffs of 2 and 0.5. That is, a gamble with expected value of 1.25 and standard deviation of 0.75."

This is the paradox. How can a switch always have a positive expected value regardless of the sum in the envelope you open? It cannot. If the sum in the first envelope is x, then the payoffs 0.5x and 2x aren't always equally likely. Further explanation here:

http://www.u.arizona.edu/~chalmers/papers/envelope.html

hazeel

Benman
11-11-2003, 01:01 PM
Hazeel, I believe Daniel is right. You are rightly confused about how both could have a positive expectation, since you would expect that expectations zero out. But they don't. If both switched, the total dollar output stays the same, but nevertheless they individually had positive expectations. The reason is you're 50% likely to double your money, and 50% likely to only halve your money, when you switch. Doubling your money is a bigger dollar gain than halving your money is a dollar loss. Hence the positive expectation on every decision.

As far as the "paradox" of this goes, the interesting question is if switching is always right (and it clearly is), then why can't you just select the "second" envelope with the first pick. The answer is that before selecting any envelope, you don't "have anything." Only after you've selected the first envelope do you have any money "in hand," and hence any chance at an improvement. Put it this way: if you say, OK I plan to choose envelope A, but I'll save the trouble and jump immediately to envelope B, isn't that the same thing as going through the motions of actually having selected envelope A, then switched? It seems so but it isn't!! All you've really done in that scenario is chosen the first envelope you get to look at. Put another way, you've cost yourself the option of cashing in the actual money in the first envelope (which is yours if you've actually opened the first envelope instead of pretending to have done so) on a "good gamble" to get more money.

The fallacy in everybody's analysis is that expectations must "net out." They do not, if the decisions are made independent of knowledge of what others are doing. In the case of two people selecting from two envelopes, it's hardest to grasp this concept. The money won nets out, but the expectation is positive for both to switch. When it's heads up, and there are only two envelopes, someone has to lose. But I don't get to see if my partner wins or loses.
Thus, it's just as if we were both presented with two different sets of envelopes. Always switch!!

bigpooch
11-11-2003, 01:37 PM
Thus, it is clearly better to use exponential switching
than not to switch at all. In fact, it is a simple
exercise to see that one could use any function f(x)
whose only required property is that

1 >= f(x) > f(2x) >= 0 for all x>0.


Then the expected difference between switching by
using this function and not switching is given by
X*(f(X)-f(2X))/2. A more curious question now arises:
suppose that g(x) is the distribution function of the
underlying amount of the smaller envelope so that
P(a<X<=b) = integral from a to b of g(x)dx. Note g
could be quite exotic; what f now maximizes the
expected value of switching using f?

bigpooch
11-11-2003, 02:00 PM
Always switching has the same expectation as never
switching; use f(x)=1 as the switching function under
the post for exponential switching. On the other
hand, switching with a probability of f(x) for a
certain class of functions f will give a better
expectation.

Louie Landale
11-11-2003, 02:19 PM
The real issue in the question is what is your expected payoff. You derived 1.25, which is incorrect for the problem as posted. No, there IS a single correct answer for this part of the problem: it's 1.0.

Your utility function or "tolerance for risk" stuff is interesting and true (and yes that often comes down to individual preference) but not nearly as important as the expected payoff part of the problem.

- Louie

DrSavage
11-11-2003, 02:36 PM
You are incorrect.
It is NOT always profitable to switch.
Consider this: let's say there are 10 possible combinations of values in the envelopes: 1$ and 2$, 2$ and 4$, ..., 512$ and 1024$.
If you follow the "always switch" strategy you will benefit every time except when you open 1024$. When you open 1024$, the always-switch strategy will make you lose 512$, which will wipe out whatever profit you would've made by switching at the other amounts.
The expectations MUST net out; that's the basics of probability theory.
The paradox comes from the fact that when you open 400$ and think that 200$ and 800$ are the possible amounts, you don't realize that they are not equally likely, because the distribution of the total amount cannot be uniform.
Read my other posts for more detailed explanations.
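The ten-pair example above can be checked exactly by enumeration rather than simulation (my own sketch; the pairs $1/$2 through $512/$1024 are taken from the post):

```python
from collections import defaultdict

# The ten equally likely pairs: ($1, $2), ($2, $4), ..., ($512, $1024).
pairs = [(2 ** k, 2 ** (k + 1)) for k in range(10)]
outcomes = 2 * len(pairs)           # pair choice x which envelope you open

never  = sum(a + b for a, b in pairs) / outcomes   # keep what you opened
always = sum(a + b for a, b in pairs) / outcomes   # take the other: same sum!

# Net effect of switching, broken down by the amount you opened:
net = defaultdict(float)
for a, b in pairs:
    net[a] += (b - a) / outcomes    # opened the smaller amount: gain
    net[b] += (a - b) / outcomes    # opened the larger amount: lose

print(never, always)                # identical EVs
print(net[1024])                    # -25.6: the one big loss...
print(sum(net.values()))            # ...cancels all the small gains (≈ 0)
```

Switching shows a small net gain at every opened amount except $1024, where the single large loss balances the books exactly, just as the post says.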

Ace-Korea
11-11-2003, 04:14 PM
The easiest way to look at this "Ask Marilyn" problem is this: you take 1 out of 3 cards and it has 1/3 chance of being the Ace. Which means the other two cards have 2/3 chance of having the Ace. But one of the two cards was removed knowing that it isn't the ace, which means the remaining unseen card has carried over the 2/3 chance of being the Ace. So you should switch.

Copernicus
11-11-2003, 04:26 PM
If it hasn't been said before, this is just a simple application of the "rule of restricted choice" as it is known in Bridge, or Bayes' theorem.

bigpooch
11-11-2003, 06:42 PM
The correct answer to the Monty Hall problem:

Of course, if the host would always show you one of the
envelopes regardless of your choice, that answer is
clear. On the other hand, the host could be one of
these individuals:

Angel of Benevolence: He goes from individual to
individual with the same proposition. If you
should ever pick the right envelope, he just
disappears. On the other hand, if you should
pick one of the two fake money envelopes, he gives
you a second chance by showing you the other
unpicked envelope with fake money! In other
words, if you know AB is the host, always switch!

Demon of Chicanery: He goes from individual to
individual with the same proposition because of a
curse that requires him to give some of his money
away. With him, if you should ever pick the wrong
envelope, he disappears, leaving you with a bunch
of play money. On the other hand, if you happened
to pick the dough, he gives you another chance to
go wrong by generously showing one of the envelopes
you hadn't picked. If you know DC is the host,
hold on to your envelope!

Unfortunately, the above two beings are completely
indistinguishable otherwise.

In fact, if AB were the host 1/3 of the time and
DC were the host 2/3 of the time, it wouldn't even
matter. So quite a lot depends on the host: if
he's your very best friend to carry out this task,
you would tend to switch. If the host is your
worst enemy, you would think about staying with
your pick!

If you knew nothing about the host at all and
whether he was pulling for you or not, you should
simply flip a fair coin to switch! That will
definitely give you exactly a 1/2 chance of
picking the correct envelope!

This leads to a very interesting insight: if you were
on Monty Hall's game show, you should do better if
you were well-liked by the host; in other words, you
would tend to do better if you ever encountered the
three curtains. Don't dress up as Death!

M.B.E.
11-11-2003, 07:04 PM
[ QUOTE ]
Whether or not you accept the gamble depends on your utility function, i.e., how much you are willing to pay for gambles. This depends upon "risk preference" and varies across individuals, and apparently even across value levels.

[/ QUOTE ]
Daniel, you've misunderstood what this problem is about.

Assume a linear utility function, then maybe you'll see the paradox.

Daniel Lee
11-11-2003, 07:19 PM
Two exhaustive, equally likely outcomes each have a 0.5 probability.

E = 0.5 * 2 + 0.5 * .5
= 1.25

Daniel Lee
11-11-2003, 07:33 PM
My understanding of the problem is that someone has given me 1 unit. He then says I can exchange it for a gamble having a payoff of either 2 or 0.5 units, each with probability 0.5 ("equally likely"). Please correct my understanding.

BillsChips
11-11-2003, 07:46 PM
I can't believe this is still going on. FYI. I wrote a program to simulate this problem and the results:

It doesn't matter whether you switch or not. Completely random. Over the long term (20,000+) the numbers are almost equal.

Ulysses
11-11-2003, 08:42 PM
[ QUOTE ]
My understanding of the problem is that someone has given me 1 unit. He then says I can exchange it for a gamble having a payoff of either 2 or 0.5 units, each with probability 0.5 ("equally likely"). Please correct my understanding.

[/ QUOTE ]

There are two envelopes. One has 1 unit, the other has 2 units.

You pick an envelope.

There's a 50% chance you pick the one with 1 unit. If you switch, you will gain 1 unit.

There's a 50% chance you pick the one with 2 units. If you switch, you will lose 1 unit.

.5(1) + .5(-1) = 0. Whether you switch or not, you will make .5(1) + .5(2) = 1.5 units.

That's where all the continuum stuff comes into play. If the envelope you pick has 2/3 of all the money in the world in it (scenario A), switching can't be good. If the envelope has a single unit of the smallest unit of money in the world (scenario B), switching is good. In the most general of terms, the closer the amount in the envelope is to all the money in the world, the worse your chances get of increasing the amount by switching. A function that simply switches in scenario B and doesn't in scenario A is higher EV than either switching or not switching every time. Obviously, it's possible to come up with even better functions than that.

irchans
11-11-2003, 09:00 PM
How about switching when g(y) > g(y/2) ?

M.B.E.
11-12-2003, 04:07 AM
[ QUOTE ]
My understanding of the problem is that someone has given me 1 unit. He then says I can exchange it for a gamble having a payoff of either 2 or 0.5 units, each with probability 0.5 ("equally likely"). Please correct my understanding.

[/ QUOTE ]
You left out the fact that you're on a TV game show. You can infer important information from that. Also the way you've framed the problem it could be a setup. To eliminate that possibility, we should stipulate that (1) you know in advance you will have the option of switching, and (2) you freely make the initial choice between the two envelopes.

Also, to eliminate the factor you mentioned in your first post (although it will hardly ever make a difference in this problem anyway), assume a linear utility function.

bigpooch
11-12-2003, 04:29 AM
Do you mean 2g(y) > g(y/2)? If Y=amount in the envelope
with the least amount is a continuous r.v. with distribution
g then the modified inequality essentially means that
it was "more likely" you picked the envelope with more
money in it for y-epsilon<Y<=y+epsilon than the envelope
with less money in it for y-epsilon/2<Y<=y+epsilon/2. Of
course, that factor of 2 won't exist in the discrete
case.

Louie Landale
11-12-2003, 07:49 PM
Your answer is to this problem: I give you an envelope with money. You count the money. I then say you can keep the money or gamble: half the time you get double and half the time you get half.

Yes, THAT's a good gamble at EV 1.25.

But that's not the same problem as having two envelopes where one has twice as much money; and the recipient, after counting the money, can switch envelopes. THAT situation has EV 1.0 as explained in other posts.

And switching MUST have EV 0, since if I gave you and your brother each one of the envelopes, surely you BOTH cannot have EV +1.25 by switching with each other. No, the one of you with the 2X envelope loses X while the other gains X. That's the (apparent) paradox: it SEEMS as though you should switch.

- Louie

Louie Landale
11-12-2003, 07:56 PM
This scenario presumes there IS such a thing as "all the money in the world" and a "smallest unit of money". And even if that's true, it presumes the contestant knows what those thresholds are. And it also presumes that he can calculate the minuscule difference in EV when, for example, he's 15% away from one of the thresholds.

It's not a useful notion. What do you do if you watch the previous 2 contestants, one gets one quarter of a Yen and the other gets a gazillion-dollar check, and your envelope has $1.25?

If the smallest unit is a penny, I doubt there is much EV difference calculable with your continuum if you have $1.

- Louie

bigpooch
11-12-2003, 08:00 PM
Very well said! A more interesting question is this:
how do you guarantee an EV of $0.25 for a switching
strategy? BTW there is a very nice solution. Here
just assume that the very least an envelope contains
is $1.

bigpooch
11-12-2003, 08:49 PM
The more general question is this:

Suppose there are N envelopes where N>=2 and each
has a distinct amount of money >=$1. The envelopes
are randomly distributed but you are allowed to
open them one at a time. But once you open a new
envelope you cannot choose an envelope previously
opened. What strategy maximizes your chances of
picking the envelope with the most amount of
money and what is that chance? Also, is this
strategy different from the optimal strategy for
maximizing your EV?

Lori
11-12-2003, 10:03 PM
My understanding of the problem is that someone has given me 1 unit.

You don't know how many units you have been given, that is where the paradox begins.
If you are offered two envelopes, one has .6667 units, one has 1.3333 units, when you pick one, you don't know which you have, all you can say is that on average you have picked one unit NOT that yours has one unit.

Switching either gains you .6667 units or loses you .6667 units depending on the pick.

This is one of those horrid problems where it is so simple when you can see the answer that you find it impossible to believe you didn't spot it in the first place.

Lori

bigpooch
11-12-2003, 10:13 PM
This method of switching has positive EV, and you
can measure the amount of money in the envelope
in any currency you want! Let f(x) = 1/(x+1),
so that f is clearly a decreasing function.

Now if you switch with probability f(x), where x
is the amount in the first envelope, it is an
easy exercise to see that this will be better
than always switching (which has, btw, an EV exactly
the same as never switching): see, for example, the
post on exponential switching for an analogous
calculation.
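A sketch of that calculation by exact enumeration, using the post's f(x) = 1/(x+1). The uniform prior on 1..100 for the smaller amount is my own assumption for illustration; the sign of the gain does not depend on it:

```python
# Switching function from the post: decreasing in the amount seen.
def f(x):
    return 1 / (x + 1)

# Assumed (illustration only): the smaller amount X is equally likely
# to be any integer from 1 to 100.
xs = range(1, 101)

# Never switching keeps the opened envelope: EV = E[(x + 2x) / 2].
never = sum(1.5 * x for x in xs) / len(xs)

# EV gain of switching with probability f(amount seen), using the
# formula from the exponential-switching post: x * (f(x) - f(2x)) / 2.
gain = sum(x * (f(x) - f(2 * x)) / 2 for x in xs) / len(xs)

print(never)   # 75.75
print(gain)    # strictly positive, since f(x) > f(2x) for every x
```

The gain is small compared with the baseline EV, but it is positive for any distribution of the amounts, which is exactly the symmetry-breaking the post describes.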

Louie Landale
11-13-2003, 02:19 PM
This is a completely different problem.

Let me guess:

You need to open lots of envelopes, I suppose in the neighborhood of .5n to .75n, in order to get a good idea of the probability distribution of money in the envelopes. Let's say it looks like it's distributed between $L and $H. With your remaining picks, stop picking when the amount is "close" to $H, where "close" is a function of how many picks you have left. If you have 4 picks left and you find an envelope that is 90% of the way from $L to $H, then it's time to stop.

I have no idea how to say that correctly.

- Louie

bigpooch
11-13-2003, 04:14 PM
To maximize your chances for N very large, you hit upon
a correct idea to look at a lot of envelopes. Suppose
you decide to look at a ratio of r envelopes (or
approximately rN envelopes altogether) and then after
that to pick the first envelope with more money in it
than you had seen before (unfortunately, the envelope
that has the most amount of money could have already
been open!). Then you can approximate the odds that
you will pick the right envelope:

P(r) = P(highest envelope not picked among first rN) x
( P(2nd highest picked in first rN) x 1
+P(3rd highest but not 2nd in first rN) x 1/2
+P(4th highest but not 2nd, 3rd in first rN) x 1/3
+....)

= (1-r) (rx1 + (1-r)r/2 + (1-r)(1-r)r/3 +...)

Let s=1-r above and simplifying yields

P(s) = s(1-s)(1+s/2+s**2/3+...)
= (1-s)(s+s**2/2+s**3/3+...)
= (1-s)(-ln(1-s))

or P(r) = -r ln r

If P(r) is to have a local maximum, the first derivative
dP(r)/dr = 0.

-r (1/r) - ln r = 0
-1 - ln r = 0
ln r = -1
and therefore, r = 1/e.

Also, P(1/e) = 1/e.

Surprisingly, using this strategy will give a probability
of 1/e which is about 0.3679 of picking the right
envelope. This is definitely remarkable because 1/e is
the probability that is approached as N tends to infinity.
So even if there are a billion envelopes, the chances of
picking the right envelope is greater than 1/3 which
most people would not be able to see at first glance.
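A quick Monte Carlo check of this 1/e strategy (my own sketch; the dollar amounts are drawn arbitrarily, since only their relative order matters):

```python
import math
import random

def picks_best(n, r):
    """Observe the first r*n envelopes, then take the first later envelope
    that beats everything seen so far. Did we get the overall maximum?"""
    amounts = random.sample(range(1, 10 * n), n)   # n distinct amounts
    cutoff = max(1, int(r * n))
    threshold = max(amounts[:cutoff])
    for a in amounts[cutoff:]:
        if a > threshold:
            return a == max(amounts)
    # Never stopped early: the maximum was among the observed envelopes.
    return False

random.seed(7)
n, trials = 100, 20_000
rate = sum(picks_best(n, 1 / math.e) for _ in range(trials)) / trials
print(rate)     # close to 1/e, roughly 0.37, even with 100 envelopes
```

Even at N = 100 the success rate sits near 1/e ≈ 0.368, consistent with the limit derived above.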

The harder question now is for the 2 envelope problem:
what switching function (if any) maximizes your
chances of picking the envelope with the most money?

Louie Landale
11-14-2003, 02:31 PM
I must say, your math was quite easy to follow. Did you say you should automatically sample one third of the envelopes, and then keep picking the others until you get one that's bigger than any you have seen so far? That seems to work until you start running out of envelopes. Say you've picked the penultimate envelope and it's REAL close to the highest one seen so far. I'd say stop now, even though it's not the highest you've seen.

Anyway, I don't see how this has anything to do with the two-envelope problem, where you know one has twice as much as the other.

- Louie

johnnyhearts
11-15-2003, 07:13 PM
This problem is now reminding me of a paradox created by the Newtonian concept of an infinite, static universe. In such a model, which was crushed by Hubble's findings, any one point can be considered the center of the universe because it would be bounded on all sides by infinity. Of course, the point in the universe 100 yards to the right is also bounded on all sides by....infinity!

Johnny Hearts

Gamblor
11-20-2003, 10:32 AM
Doesn't the $27 include the $2 the bellhop has?

George Rice
11-20-2003, 07:59 PM
Yes. It's a trick question. You should be subtracting the $2 from $27 to get $25, not adding $2 to $27.

Kurn, son of Mogh
11-21-2003, 09:57 AM
Each man has spent $9 for a total of $27.

This is your faulty assumption. Once the price was adjusted to $25, $8.3333 of each man's outlay is accounted for. When the bellhop gives each man back $1, each man now accounts for $9.3333 of his original outlay. $9.3333*3 = $28, and the bellhop's $2 makes up the difference.

bigpooch
11-29-2003, 12:35 AM
Definitely this paradox is unrelated to an infinite range
of possible amounts in the envelope. Suppose the amounts
in the envelope are from a finite range with an unknown
upper bound. The paradox has to do with how some people
perceive the problem: that they think always switching is
better than never switching. But there are switching
strategies (in fact, infinitely many of them!) that are
better than always switching (whose EV gain over never switching is 0).

Infinity has nothing to do with this envelope switching
paradox.

M.B.E.
11-29-2003, 04:31 AM
[ QUOTE ]
Definitely this paradox is unrelated to an infinite range
of possible amounts in the envelope. Suppose the amounts
in the envelope are from a finite range with an unknown
upper bound.

[/ QUOTE ]
I don't follow this. If the upper bound is unknown, then the range is not finite, from the contestant's perspective.

bigpooch
11-29-2003, 10:55 AM
Suppose the range is from the set of integers between 1 and
N inclusive where N is a very large number (but is unknown).
Then the range is still finite.

Also, the envelope paradox as stated does not really require
that one of the envelopes has exactly twice the amount as
the other. All that is required is that one envelope has
strictly more money than the other. Admittedly, the
restriction that one of the envelopes has twice the amount
of the other leads people to the wrong EV calculation. Then
the more interesting question is what switching strategies
have positive expectation as opposed to always switching
(which has the same expectation as never switching). Is it
possible to determine the optimal switching strategy, i.e.,
the strategy that has maximum EV?

PairTheBoard
12-03-2003, 05:19 AM
Yes. Also Lori's explanation was one of the best I've seen. You can get graduate students in mathematics pretty worked up with this problem. Despite their expertise they can easily overlook the mistaken Hidden Assumption addressed in the Dr. Savage link. "Pick a number at random" is easier said than done.

But even after the observation that the dollar amounts require a probability distribution from which to pick them, and even after adopting the proper perspective that with x and 2x envelopes you have a 50% chance of gaining x and a 50% chance of losing x by switching, there remains the nagging psychological problem inherent in the true observation that, sight unseen, I have a 50% chance of doubling the first envelope and a 50% chance of halving it if I switch. Why does that not gain me money on average? Here's the reason: the 50% of the time that you double, you are doubling half the amount that you halve the other 50% of the time.
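That last point can be made concrete with a toy calculation (x = $100 is an arbitrary illustrative value):

```python
# Envelopes hold x and 2x; x = 100 is an arbitrary illustrative value.
x = 100
gain_when_doubling = 2 * x - x       # +100: you held x, switched to 2x
gain_when_halving = x - 2 * x        # -100: you held 2x, switched to x
expected_gain = 0.5 * gain_when_doubling + 0.5 * gain_when_halving
print(expected_gain)                 # 0.0
```

The "doubling" branch doubles the smaller amount and the "halving" branch halves the larger one, so the two branches cancel exactly.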

bigpooch
12-03-2003, 09:15 AM
There is no need for the amounts in the envelope to be
chosen from a nontrivial probability distribution. All that
is required is that the picker has no idea what the amounts
are.

The clearest way to see that always switching is no better
than never switching (and IMO the most elegant) was posted
before: consider two individuals, each getting the envelope
not chosen by the other individual. By symmetry, it is
obvious there is no gain from always switching.

On the other hand, for any switching strategy to have +EV
(and there are infinitely many of them!), this symmetry must
be broken. That requires that the strategy be based on the
amount seen in the first envelope. Thus, consider switching
with probability f(x) where x is the amount and f is a
"switching function": it is a decreasing function and has
range in the interval [0,1].
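A quick simulation of such a switching function shows the broken symmetry does yield a positive edge. The particular choices here, f(x) = e^(-x) and a uniform $1-$10 prior on the smaller amount, are illustrative assumptions, not part of the original problem:

```python
import math
import random

def play(switch_prob, rng):
    # Assumed for illustration: smaller amount uniform on $1-$10,
    # the other envelope holds double; the player never sees the prior.
    small = rng.uniform(1, 10)
    envelopes = [small, 2 * small]
    rng.shuffle(envelopes)
    x = envelopes[0]                       # amount seen in the first envelope
    if rng.random() < switch_prob(x):      # randomized switching decision
        return envelopes[1]
    return x

f = lambda x: math.exp(-x)                 # a decreasing switching function
rng = random.Random(42)
n = 400_000
with_f = sum(play(f, rng) for _ in range(n)) / n
rng = random.Random(42)                    # same seed: paired comparison
never = sum(play(lambda x: 0.0, rng) for _ in range(n)) / n
print(with_f - never)                      # small but strictly positive edge
```

The edge comes from switching more often when the observed amount is small, which is exactly the symmetry-breaking the post describes.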

There does arise an interesting question: for which classes
of distribution functions (for the amount of the smaller
envelope) will there exist optimal switching strategies?

A generalization to more envelopes, known to probabilists
sometimes as the "marriage problem" or "bachelor problem", is
also interesting for a different reason. An interesting
question for multiple containers (say N, where N is large) is
to suppose there are N individuals, where the contents of the
containers differ and the utility functions differ (there
doesn't have to be money in these containers!): how best to
allocate resources in society?
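For the curious, the classic version of the marriage problem (pick the single best of N candidates seen in random order, with no going back) has a well-known threshold strategy: reject the first N/e candidates outright, then take the first one better than everything seen so far. A rough simulation sketch:

```python
import math
import random

def secretary_trial(n, rng):
    # See n candidates in random order; reject the first n/e outright,
    # then accept the first candidate better than everyone seen so far.
    ranks = list(range(n))           # higher rank = better; best is n - 1
    rng.shuffle(ranks)
    cutoff = int(n / math.e)
    best_seen = max(ranks[:cutoff])
    for r in ranks[cutoff:]:
        if r > best_seen:
            return r == n - 1        # did we stop on the overall best?
    return ranks[-1] == n - 1        # otherwise stuck with the last one

rng = random.Random(1)
n_trials = 50_000
wins = sum(secretary_trial(100, rng) for _ in range(n_trials))
print(wins / n_trials)               # close to 1/e ~= 0.37
```

The success rate converges to 1/e, about 37%, which is the best any stopping rule can do in this setting.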

Icarus "the Kid"
12-03-2003, 01:14 PM
Alright, it looks like there's some fabulous math theory behind this and although it's a huge part of what enamours me to the game, I'm a bottom liner, and here it is.

From what we know of the problem, it's a $200 bet (what you stake from the $400) for a 50% chance at a 2 to 1 payoff. To all Blackjack players (unless you're a card counter): switch to opening envelopes, it's a better price. Hey, you're playing with the House's money, 200 bucks is yours to keep.

Here's the real question: do you take the FREE $400 and go play 10-20 Hold 'Em, or hold on to the chance at playing some FREE 20-40? Worst case scenario, you walk away with $200 and get stuck playing 5-10 or 4-8 with a bunch of loose cannons who call you all the way with 87o (including a double-raise, pre-flop) when you've got trip Aces and they catch a gutshot on the river to take down a big pot.

I'll take the second envelope, please....

mosta
12-03-2003, 03:19 PM
are you serious? actually I'm not surprised--for an economist to fail to understand the elementary math they dress up their trivial ideas with--nothing unusual there. but really, to think the switch is +EV is embarrassing even for an economist.

[ QUOTE ]
There is no paradox. Neither is there a "correct" answer. Individual preferences determine whether the envelope should be exchanged for the specified gamble. This type of problem has been studied extensively in the economics literature.

[/ QUOTE ]

PairTheBoard
12-04-2003, 04:28 AM
Yes, your symmetry argument should quickly convince anyone that switching makes no difference. This is easy to see. The real challenge of this problem is to find the fallacy in the Expected Value Calculation argument. Exactly where does the argument break down?

It IS true to say that, sight unseen, switching gives a 50% chance of doubling and a 50% chance of halving. But once the first envelope is opened and an amount, say $A, is observed, it is NOT valid to say that switching produces a 50% chance of doubling $A and a 50% chance of halving $A. This is the point where the Expected Value Calculation argument breaks down. To solve the paradox requires you to explain why. Why isn't it valid to say that switching gives a 50% chance of doubling $A and a 50% chance of halving $A? Why does opening the envelope and observing the amount $A alter the validity of the fact that, sight unseen, switching DOES give a 50% chance of doubling and a 50% chance of halving? It is VERY counterintuitive.

Here is the key imo. The events "Doubling" and "Halving" are not independent of the event "observing $A in the first envelope". In fact they are completely Dependent events in the case where the envelope amounts are fixed. "Observing $A" completely determines the "Doubling" or "Halving" events. The psychological trick comes in realizing that "Doubling" or "Halving" has been Determined by observing $A, even though you don't yet know what that determination is. This is an example of where there are two mutually exclusive possibilities, one of which is true, and where you have no way of knowing which is true or which is more likely, yet you CANNOT say that there is thus a 50% chance of one and a 50% chance of the other. Maybe you can think of more such examples to illustrate the point.
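The dependence described above can be seen concretely once you fix a prior. In this sketch the smaller amount is (by assumption, for illustration only) a whole number of dollars, uniform on $1-$100; the chance that switching doubles then depends heavily on the amount observed:

```python
import random
from collections import Counter

rng = random.Random(7)
# Assumed prior: the smaller amount is a whole number of dollars,
# uniform on {1, ..., 100}; the other envelope holds double.
seen = Counter()
other_larger = Counter()
for _ in range(500_000):
    small = rng.randint(1, 100)
    pair = [small, 2 * small]
    rng.shuffle(pair)
    x = pair[0]                      # the amount you observe
    seen[x] += 1
    if pair[1] > x:                  # switching would double
        other_larger[x] += 1

# P(switching doubles | observed amount), for three sample amounts:
p = {x: other_larger[x] / seen[x] for x in (3, 40, 150)}
print(p)   # roughly {3: 1.0, 40: 0.5, 150: 0.0}
```

Under this prior an odd amount like $3 can only be the smaller envelope (probability of doubling 1), an amount over $100 like $150 can only be the larger (probability 0), and only amounts like $40 that fit both roles give the naive 50/50.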

PairTheBoard
12-04-2003, 09:02 AM
Ok, I asked for other examples of where there are two mutually exclusive possibilities, one of which is true, and where you have no way of knowing which is true or even which is more likely. Yet you CANNOT conclude that there is a 50% chance of its being one and a 50% chance of its being the other.

Assuming the principle of the excluded middle applies to the question of God's existence: either God exists or She doesn't. You have no way of KNOWING which case is true, or even which is more likely, while you are still alive and mortal. Yet I doubt you would want to conclude that there is thus a 50% chance God exists and a 50% chance She doesn't. Assuming you find out when you die, it's not a repeatable experiment. It's already determined once and for all. There is no repeatable experiment for which to assign probabilities to the outcomes.

bigpooch
12-04-2003, 12:58 PM
Quite right! Pascal had some choice words about that
important question.

Going back to the envelope paradox, suppose the ratio of the
amounts were quite extreme: say the ratio of one amount to
the other were 1000 to 1. If someone makes the same
erroneous EV calculation, when would he think that something
were amiss? [Or suppose the ratio were even more extreme!]
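Plugging an extreme ratio into the same fallacious calculation makes the absurdity obvious. A tiny sketch (naive_switch_ev is a made-up name for the erroneous 50/50 formula, generalized from ratio 2 to an arbitrary ratio R):

```python
# naive_switch_ev is a made-up name for the fallacious "50/50" formula,
# generalized from ratio 2 to an arbitrary ratio R.
def naive_switch_ev(A, R):
    return 0.5 * (R * A) + 0.5 * (A / R)

print(naive_switch_ev(400, 2))       # 500.0 -- sounds almost plausible
print(naive_switch_ev(400, 1000))    # ~200000.2 -- plainly absurd
```

At ratio 2 the fallacy promises a modest 25% gain and slips past most people; at ratio 1000 the same reasoning claims switching multiplies your money 500-fold, which nobody would accept.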

Most individuals quickly make assumptions that are not
logically deducible from the bare set of axioms or the
conditions originally set forth. Perhaps that is the way
the human mind works which may be considered a biased
reductionism of the original problem.

Also, consider Gödel's demonstration in logic that
overturned the way many mathematicians had thought:
specifically, those with faith in the axiomatization of
mathematics were shown the error in their belief that all
statements would be decidable if the axiomatization were
rich enough.