Logic Problem for GoT


maurile
08-03-2004, 04:27 PM
Before you there are two boxes, one white and one black. Each box contains money; one has twice as much as the other. You may choose either box and keep whatever money is inside.

You choose the white box and find $100 in it.

You are now given the option of switching. You may either keep the contents of the white box, or you may instead opt for the contents of the black box.

What's the EV of switching?

GuyOnTilt
08-03-2004, 04:30 PM
The black box will have $50 in it 50% of the time, and $200 in it 50% of the time. If you choose to switch, you will have $125 on average. If you stay, you will have $100 on average. So the EV of switching is $25.

GoT

blackaces13
08-03-2004, 04:32 PM
I'm not GoT, but I say the EV of this is +$25.

EDIT: I see the real GoT got me by 2 minutes as I was calculating it. Oh well.

maurile
08-03-2004, 04:38 PM
If you'd originally picked the black box, you'd have come to the exact same conclusion -- that switching is +EV. So no matter which box you choose first, you should switch.

But that can't be right.

maurile
08-03-2004, 04:41 PM
We can prove two contradictory propositions:

Proposition 1. The amount that you will gain by switching, if you do gain, is greater than the amount you will lose, if you do lose.

Proposition 2. The amounts are the same.

The proof of Proposition 1 is essentially the one GoT gave: Let n be the number of dollars in the box you are now holding. Then the other box has either 2n or n/2 dollars. If you gain on the trade, you will gain n dollars, but if you lose on the trade, you will lose n/2 dollars. Since n is greater than n/2, the amount you gain, if you do gain (which is n), is greater than the amount you will lose, if you do lose (which is n/2). This proves Proposition 1.

Now for the proof of Proposition 2. Let d be the difference between the amounts in the two boxes, or what is the same thing, let d be the lesser of the two amounts. If you gain on the trade, you will gain d dollars, and if you lose on the trade, you will lose d dollars. And so the amounts are the same after all. This proves Proposition 2.

So which proposition is really right?
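The two accountings can be laid side by side numerically (a quick Python sketch; the $100 figure comes from the problem, while fixing the pair at $100/$200 for Proposition 2 is an assumption for illustration):

```python
# Proposition 1's accounting: call the amount in your box n.
n = 100
gain_if_gain = 2 * n - n   # other box holds 2n: you gain n = 100
loss_if_loss = n - n / 2   # other box holds n/2: you lose n/2 = 50
assert gain_if_gain > loss_if_loss  # the potential gain looks bigger

# Proposition 2's accounting: fix the pair of boxes at (d, 2d).
d = 100                    # e.g. the pair is ($100, $200)
gain_if_low = 2 * d - d    # you held the low box: switching gains d
loss_if_high = 2 * d - d   # you held the high box: the loss is also d
assert gain_if_low == loss_if_high == d  # the two amounts are equal

print(gain_if_gain, loss_if_loss, gain_if_low, loss_if_high)
```

Both accountings check out arithmetically on their own terms, which is exactly what makes the question below interesting.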

aloiz
08-03-2004, 04:43 PM
This is a classic paradox (not really a paradox, but most people naturally think of it as one), almost as famous as the Monty Hall problem.

aloiz

GuyOnTilt
08-03-2004, 04:44 PM
[ QUOTE ]
But that can't be right.

[/ QUOTE ]

It can, and it is.

Problems like this one are why people who know a decent amount of game theory get frustrated watching game shows.

GoT

maurile
08-03-2004, 04:47 PM
[ QUOTE ]
It can, and it is.

[/ QUOTE ]
Actually, it's not.

Let me know if you'd like the answer.

GuyOnTilt
08-03-2004, 04:59 PM
[ QUOTE ]
Before you there are two boxes, one white and one black. Each box contains money; one has twice as much as the other. You may choose either box and keep whatever money is inside.

You choose the white box and find $100 in it.

[/ QUOTE ]

Okay, the first rule of solving logic puzzles is to read the intro thoroughly, because there are often clues and tricks in it. I've read this through a couple of times now, carefully, and this statement still seems impossible to me:

Proposition 2. The amounts [that you're able to gain or lose by switching] are the same.

I'll keep thinking...

GoT

AndysDaddy
08-03-2004, 05:02 PM
A decent, but somewhat geeky, explanation for this is that the dollar amounts form a geometric progression, with a fast-rising curve ahead of you and a slow-falling one behind. With the big upside and small downside, you always want to take any chance to advance up the steep slope.

jwvdcw
08-03-2004, 05:04 PM
[ QUOTE ]
Before you there are two boxes, one white and one black. Each box contains money; one has twice as much as the other. You may choose either box and keep whatever money is inside.

You choose the white box and find $100 in it.

Okay, the first rule of solving logic puzzles is to read the intro thoroughly, because there are often clues and tricks in it. I've read this through a couple of times now, carefully, and this statement still seems impossible to me:

Proposition 2. The amounts [that you're able to gain or lose by switching] are the same.

I'll keep thinking...

GoT

[/ QUOTE ]

GoT:

A question for you then:

Suppose I'm all set to pick box A. After I pick box A, should I switch?

Well, what if at the last second I change my pick to box B? Should I then switch back to A after I pick?

aloiz
08-03-2004, 05:16 PM
This is faulty logic. You're essentially taking the average of two separate variables, because your starting amounts are different: in the first case -x/2, in the second case +2*x. But x is different in each case, so the two can't just be averaged.

aloiz

blackaces13
08-03-2004, 05:32 PM
If AFTER you picked the first box a guy flipped a coin and put either 2X or 1/2 the amount of the box you picked in another box, then you should ALWAYS switch, for the reasons that made GoT and me think switching was +$25 EV.

If however the boxes are just lying there the whole time, then you're just as likely to pick the box with the higher amount in it, so switching or not is meaningless and the EV of switching is 0.

Cool question.
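The distinction drawn above can be checked numerically (a rough Python sketch, not a proof; the $100 starting amount, trial count, and seed are assumptions for illustration):

```python
import random

random.seed(0)
TRIALS = 200_000

# Scenario A: the coin is flipped AFTER you pick. You hold $100, and
# the other box is then filled with $200 or $50 with equal probability.
gain_a = 0
for _ in range(TRIALS):
    other = random.choice([200, 50])
    gain_a += other - 100
ev_a = gain_a / TRIALS  # ~ +25: here switching really is +EV

# Scenario B: both boxes were fixed in advance at $100 and $200, and
# you were equally likely to have grabbed either one.
gain_b = 0
for _ in range(TRIALS):
    mine, other = random.sample([100, 200], 2)
    gain_b += other - mine
ev_b = gain_b / TRIALS  # ~ 0: switching is meaningless

print(ev_a, ev_b)
```

The two scenarios look identical from inside the game, which is the whole trap.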

maurile
08-03-2004, 05:57 PM
[ QUOTE ]
If AFTER you picked the first box a guy flipped a coin and put either 2X or 1/2 the amount of the box you picked in another box, then you should ALWAYS switch, for the reasons that made GoT and me think switching was +$25 EV.

If however the boxes are just lying there the whole time, then you're just as likely to pick the box with the higher amount in it, so switching or not is meaningless and the EV of switching is 0.

[/ QUOTE ]
This is intuitively correct. Proving it is another matter.

Here's the technical explanation: link (http://www3.oup.co.uk/mind/hdb/Volume_112/Issue_448/pdf/1120685.pdf)

Mike Caro, of all people, has provided the best simple explanation I've seen [I'm editing it a bit]:

The solution. Switching makes no sense. It's a waste of your time and effort. I'm going to give you two explanations. One is slightly sophisticated, the other is simple and has powerful poker applications.

Sophisticated comes first. Nobody's bankroll is unlimited, and mine is no exception. I decide that a range of $50 to $800 is about right for this experiment. No need to give away big money just to prove a point.

So, I prepare the following sets of boxes: $50/$100, $100/$200, $200/$400, and $400/$800. (Obviously, I could use other pairs of amounts, such as $75/$150, but that would just complicate things without changing the explanation.)

When I'm finished, I've created four pairs of boxes. Now I randomly select one pair. I'm not going to make this difficult by breaking it down into what percent of the time you'll get which box and what you gain or lose by switching, but if you're so inclined, you should be able to map this out for yourself in a few minutes.

What I'm going to tell you is this: If you choose a $100, $200, or $400 box, you always gain by switching, because half the time you'll lose half the amount, and half the time you'll double. In the case of a $200 box, this means half the time you'll lose $100 (ending up with $100) and half the time you'll win $200 (ending up with $400). Since each is equally likely, you average a $100 gain for every two times you switch, so trading boxes is worth $50.

This is the crux of the paradox. It previously seemed as if you should always trade. But now that we know the secret limits on the amounts, we can look at it differently. Now we see that there are two exceptions to your lose-half-or-double expectation. If you open a $50 box, only one thing can happen by switching: you gain $50. And if you open an $800 box, only one thing can happen by switching: you lose $400.

So, by switching in both those "extreme" cases, you lose $350. And, wouldn't you know it, that exactly balances out all the gains from all the other choices. So if your strategy is to always switch, you gain nothing. If your strategy is to never switch, you lose nothing. Since you don't know what range the host decided on for the amounts (though obviously some range had to be used), it does you no good (and no harm) to switch. If you had information letting you know when you were at the high or low end, you could beat the system by always switching when low and never switching when high. But you don't have this information.

By the way, this explanation holds true no matter how long the sequence of choices you devise, how small the minimum, or how large the maximum.

A simpler explanation. When you think about switching boxes, don't try to figure out all the mathematical implications. Just ask yourself if your opponent wants you to switch. Imagine you offered the boxes. You shuffled them until you couldn't remember which was which. At that point, would you care which box was opened? Of course not! That's the point; look no further.
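Caro's four-pair setup is small enough to simulate directly (a sketch; the trial count and random seed are arbitrary choices, not part of his argument):

```python
import random

random.seed(1)
PAIRS = [(50, 100), (100, 200), (200, 400), (400, 800)]
TRIALS = 400_000

def play(switch):
    """Pick a random pair, grab one box at random, optionally switch."""
    low, high = random.choice(PAIRS)
    mine, other = random.sample([low, high], 2)
    return other if switch else mine

always = sum(play(True) for _ in range(TRIALS)) / TRIALS
never = sum(play(False) for _ in range(TRIALS)) / TRIALS

# Both strategies average out to the same value, the mean of all
# eight amounts: (50+100+100+200+200+400+400+800)/8 = 281.25.
print(always, never)
```

The gains from switching at the interior amounts are exactly paid for by the $400 loss at the top and forgone $50 at the bottom, just as Caro describes.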

GuyOnTilt
08-03-2004, 06:41 PM
Hey maurile,

Your question is COMPLETELY different from this one that Mike Caro asked. In your question, YOU SHOULD ALWAYS SWITCH. In Mike Caro's, you should only switch if you pick certain amounts. Knowing the value of the 4 pairs of boxes is what makes the difference. In your problem, there was only one pair, and you should always switch.

GoT

blackaces13
08-03-2004, 06:57 PM
[ QUOTE ]
In your question, YOU SHOULD ALWAYS SWITCH. In Mike Caro's, you should only switch if you pick certain amounts. Knowing the value of the 4 pairs of boxes is what makes the difference. In your problem, there was only one pair, and you should always switch.




[/ QUOTE ]


[ QUOTE ]
Imagine you offered the boxes. You shuffled them until you couldn't remember which was which. At that point, would you care which box was opened? Of course not! That's the point; look no further.


[/ QUOTE ]

GuyOnTilt
08-03-2004, 07:01 PM
I'm sorry, I worded my statement poorly. If you have a SERIES of boxes and do not know the range of values being offered, you shouldn't ever switch. If there are only 2 boxes, as in the original problem, YOU SHOULD ALWAYS SWITCH.

Caro's problem and solution aren't wrong, but they require much different logic from the original one posted. I remain very convinced that in the original logic problem, you should always switch.

GoT

blackaces13
08-03-2004, 07:18 PM
GoT,

When ALWAYS switching, you have exactly a 50% chance of ending up with either box. So how can this possibly be better than simply keeping the original box you chose, which also has you holding either of the 2 boxes exactly 50% of the time? All you are doing by switching is adding an extra step to a completely random process.

I'd never win a poker argument with you, but I'm fairly confident about this one. :-)

jwvdcw
08-03-2004, 07:29 PM
Ok, well, let's look at it this way then:

You pick a box, and then you switch.

Then the game show host (or whoever is giving you this chance) allows you the option of staying put or switching yet again! What do you do? You see, it's the same problem. If you really believe that switching is +EV, then you could just go on switching forever and winning more and more money.

maurile
08-03-2004, 07:49 PM
[ QUOTE ]
Your question is COMPLETELY different from this one that Mike Caro asked.

[/ QUOTE ]
No, they're exactly the same. (Edit: Caro made some simplifying assumptions [e.g., only four sets of possible box-combinations] to demonstrate the point more easily. For the full version without those simplifying assumptions see the link to the PDF file earlier in that post.)

Here's a different way of thinking about it. You and another fellow are each given a box, and told that one box has twice as much money in it as the other one.

Should you both want to trade?

Should you both be willing to pay me (say, a penny each) to allow you two to trade? This would obviously be +EV for me (to the tune of two cents) -- which means it has to be -EV for each of you.

Ulysses
08-03-2004, 07:56 PM
Relatively simple explanation as to why GoT's approach is invalid here: Two-envelope paradox (http://www.maa.org/devlin/devlin_0708_04.html).

More detailed explanation (http://jamaica.u.arizona.edu/~chalmers/papers/envelope.html)

Just thinking logically about this problem before jumping straight into EV calculations should make it clear that one can't just apply GoT's basic EV calculation here when dealing with finite amounts. See Brocktoon's posts for more on that logic.

Ulysses
08-03-2004, 07:58 PM
[ QUOTE ]
Problems like this one are why people who know a decent amount of game theory get frustrated watching game shows.


[/ QUOTE ]

Problems like this one are why some people should step back and think about problems in a common-sense way before jumping straight into EV calcs.

maurile
08-03-2004, 07:59 PM
This is known as the "two-envelope paradox." I used boxes instead of envelopes to make it harder for people to Google the answer.

Ulysses
08-03-2004, 08:01 PM
[ QUOTE ]
In your question, YOU SHOULD ALWAYS SWITCH.

[/ QUOTE ]

If that is the case, how does opening the envelope influence your decision? You will switch for any value of x, yes? If so, why do we need to open the envelope? But wait...

Ulysses
08-03-2004, 08:02 PM
[ QUOTE ]
If AFTER you picked the first box a guy flipped a coin and put either 2X or 1/2 the amount of the box you picked in another box, then you should ALWAYS switch because of the reasons that me and GoT thought it was a +$25 EV to switch.

[/ QUOTE ]

Exactly.

GuyOnTilt
08-03-2004, 08:08 PM
[ QUOTE ]
No, they're exactly the same.

[/ QUOTE ]

No, they're not. In the envelope paradox and variations thereof, there are situations and conditions where the EV of switching is zero. The problem you gave is not one of these.

If there is a series of pairs of envelopes (or boxes), as Caro presented, then there are situations where switching is neutral in EV. In his example, there are 4 pairs: x and 2x, 2x and 4x, 4x and 8x, and 8x and 16x. He himself stated that if we chose an envelope, were allowed to look at the value inside, and saw that it was 2x, 4x, or 8x, we should always choose to switch with its paired envelope. If it were x, we should trade as well, and if it were 16x we should obviously not trade. Then he went on to say that if we didn't know the range of the values in the envelopes, switching has 0 EV. All of the above is true.

Let's say that out of the 4 unmarked pairs of boxes/envelopes, we choose pair number 1, and of the two boxes in pair 1, we choose Box A. We open Box A and see that it contains $200. Should we switch? The answer is that it doesn't matter. We have no clue as to the range. We weren't given the information of where $200 ranked among the different values of the boxes. It could just as easily be the lowest amount as a middle amount as the highest amount. Therefore, the EVs of switching in each of those cases cancel out to equal 0. If it were the lowest of the lowest pair, we'd gain $200. If it were the highest of the lowest pair, we'd lose $100. If it were the lowest of the 2nd pair, we'd gain $200. If it were the highest of the 2nd pair, we'd lose $100. And the same with pairs 3 and 4. Therefore, it doesn't matter whether we choose to switch or not. That is the problem that Caro presented.

In the problem you presented, WE DO KNOW THE RANGE OF THE VALUES IN THE BOXES. We pick one of the two boxes and look inside, and there's $100. Now we know that there are 2 possibilities (just as there were 8 possibilities in Caro's problem): it's either the highest of the pair or the lowest of the pair, again perfectly similar to the process used to calculate the EV in Caro's problem. In the first possibility, we chose the lower of the two boxes, and we'd gain $100 by switching. In the second possibility, we chose the higher of the two and would lose $50 by switching.

I still maintain that in the original post, switching is +EV.

GoT

RocketManJames
08-03-2004, 08:14 PM
[ QUOTE ]
In the problem you presented, WE DO KNOW THE RANGE OF THE VALUES IN THE BOXES. We pick one of the two boxes and look inside and there's $100. Now, we know that there are 2 possibilities

[/ QUOTE ]

GoT, you're wrong here... the range is known AFTER THE FACT. If you knew the range before you opened the box, you'd be in a bounded scenario and some EV calculations could be made. BUT, the range was unknown to you BEFORE you opened that first box. I believe this makes all the difference.

-RMJ

maurile
08-03-2004, 08:15 PM
Maybe you should ignore Caro's explanation. It obviously didn't make the solution clear to you.

Focus instead on my question of whether two people who get different boxes should both be willing to pay me a penny to switch.

GuyOnTilt
08-03-2004, 08:24 PM
[ QUOTE ]
If that is the case, how does opening the envelope influence your decision? You will switch for any value of x, yes? If so, why do we need to open the envelope? But wait...

[/ QUOTE ]

I understand how choosing to switch makes no difference from a common sense and Bayesian perspective, but I don't understand how that overrules EV calcs. One must be flawed somewhere since they're in direct contradiction with each other. I'm at the point where I'm starting to think that it makes no difference, but can't find a flaw anywhere in the math to point to and say, "That's why."

GoT

maurile
08-03-2004, 08:42 PM
[ QUOTE ]
I understand how choosing to switch makes no difference from a common sense and Bayesian perspective, but I don't understand how that overrules EV calcs. One must be flawed somewhere since they're in direct contradiction with each other. I'm at the point where I'm starting to think that it makes no difference, but can't find a flaw anywhere in the math to point to and say, "That's why."

[/ QUOTE ]
Yes, that's exactly why this is such a great problem. You can get the correct answer by using some common sense, but to actually prove it mathematically is extremely difficult (some people think it's impossible). The link I provided before isn't the one I meant to give. The paper I had in mind is no longer available on the Net as far as I can tell. Try this one (http://www.math.rug.nl/~casper/publications/twoenvelopes.pdf), though, for an overview of the Bayesian analysis.

Ulysses
08-03-2004, 09:04 PM
[ QUOTE ]
I understand how choosing to switch makes no difference from a common sense and Bayesian perspective, but I don't understand how that overrules EV calcs.

[/ QUOTE ]

A few people have posted good questions that perhaps will help you think about this problem.

The links I posted provide mathematical/probability analysis of this problem that is relatively easy to grasp even if you don't understand all of the math behind some of their statements.

As for your EV issue, perhaps thinking about it like this will help you see where you're going wrong.

You said that the envelope you pick has x. Thus the other one has either 2x or .5x. So, switching gives you 2.5x/2 = 1.25x.

OK. But what if we define the amount in the other envelope as y? Now, what does x equal? Well, 50% of the time x=2y and 50% of the time x=.5y, right? So, the first envelope gives us 2.5y/2 = 1.25y and switching gets us y.
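The same flaw can be made concrete with fixed amounts (a sketch; the $100/$200 pair is an assumption for illustration):

```python
# Fix the pair at ($100, $200). The "x" in the 1.25x argument is not
# one number: conditioned on holding the LOW box, x = 100 and the
# other box holds 2x; conditioned on holding the HIGH box, x = 200
# and the other holds x/2. Averaging 2x and x/2 with x held fixed
# mixes the two cases.
pair = (100, 200)

# Correct accounting: average over which box you actually hold.
ev_mine = sum(pair) / 2    # 150
ev_other = sum(pair) / 2   # 150 as well, by symmetry
assert ev_mine == ev_other

# Faulty accounting: pretend x is the same number in both branches.
x = 100
faulty_ev_other = 0.5 * (2 * x) + 0.5 * (x / 2)  # 125, the "paradox"

print(ev_mine, ev_other, faulty_ev_other)
```

Run the same faulty line from the other envelope's point of view and you get the contradiction described above: each envelope "expects" to be worth 1.25 times the other.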

Anyway, the main reason I entered this thread is that I think if you're going to make smug comments like "It can, and it is" and follow up with patronizing comments like the one regarding game shows, you had better get your answers right, or else you come off looking very foolish.

Zeno
08-03-2004, 10:15 PM
From the website posted by El:

"To summarize: the paradox arises because you use the prior probabilities to calculate the expected gain rather than the posterior probabilities. As we have seen, it is not possible to choose a prior distribution which results in a posterior distribution for which the original argument holds; there simply are no circumstances in which it would be valid to always use probabilities of 0.5."


-Zeno
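The quoted point about posterior probabilities can be worked out exactly for Caro's four pairs (a sketch; the uniform prior over pairs and over which box was grabbed is an assumption):

```python
from fractions import Fraction

PAIRS = [(50, 100), (100, 200), (200, 400), (400, 800)]

def ev_switch(seen):
    """Posterior EV of switching, given the amount observed, under a
    uniform prior over the four pairs and over which box was grabbed."""
    # Each way the amount can occur (as the low or the high box of
    # some pair) has the same prior probability, 1/8, so a plain
    # average over the possible ways is the posterior expectation.
    ways = [(high - low) if seen == low else (low - high)
            for (low, high) in PAIRS if seen in (low, high)]
    return Fraction(sum(ways), len(ways))

# Interior amounts favor switching; the two extremes cancel it all.
for amount in (50, 100, 200, 400, 800):
    print(amount, ev_switch(amount))

# Overall, "always switch" nets zero: each (pair, box) outcome has
# probability 1/8, and the gain and loss within each pair cancel.
total = sum(Fraction(high - low, 8) + Fraction(low - high, 8)
            for (low, high) in PAIRS)
assert total == 0
```

Nowhere in this table is the naive 50/50 "double or halve" calculation valid on its own: it is only correct after conditioning on a prior, and every proper prior makes the gains and losses balance overall.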

Zeno
08-03-2004, 10:34 PM
The common-sense answer to this 'problem' is one of simplicity, a Gordian Knot. Alexander the Great used it long ago. Interesting sidebar: Choose your knots wisely (http://www.maa.org/devlin/devlin_9_01.html)


The REAL SOLUTION:

There are two boxes with money. Obviously, having the money in both boxes is more advantageous than having the money in only one box. So when asked if you want to switch, you say, "No, I'll just take both boxes," as you aim a 12-gauge pump-action shotgun at the host's head.

I certainly hope this ends all the silly math logic stuff.

-Zeno, Cutting to the chase once again.