
View Full Version : Math problem from "A Mathematician Plays the Stock Market"


plj8624
04-16-2004, 02:00 AM
A seemingly paradoxical problem, quoted from page 52:

Imagine you are standing on stair 0, in the middle of a very long staircase with 1,001 stairs numbered from -500 to 500 (-500, -499, -498... -4, -3, -2, -1, 0, 1, 2, 3, 4 ... 498, 499, 500). You want to go up rather than down the staircase, and which direction you move depends on the outcome of coin flips. The first game - let's call it game S - is very simple. You flip a coin and move up a stair whenever it comes up heads and down a stair whenever it comes up tails. The coin is slightly biased and comes up heads 49.5% of the time and tails 50.5% of the time. If you play this game, you will almost certainly end up at the bottom of the staircase.

The second game- game C - is more complicated. It involves two coins, one of which, the bad one, comes up heads only 9.5% of the time, tails 90.5%. The other coin, the good one, comes up heads 74.5% of the time and tails the other 25.5%. As in game S, you move up a stair if the coin you flip comes up heads and you move down one if it comes up tails. But which coin do you flip? If the number of the stair you're on is a multiple of 3 (that is, ..., -9, -6, -3, 0, 3, 6, 9, 12, ...), you flip the bad coin. If the number of the stair you're on is not a multiple of 3, you flip the good coin. If you play this game over the long haul, chances are you will end up at the bottom of the staircase.

Parrondo's fascinating discovery is that if you play these two games in succession in random order (keeping your place when you switch between games), you will steadily ascend to the top of the staircase. Alternatively, if you play two games of S followed by two games of C followed by two games of S and so on, all the while keeping your place on the staircase as you switch between games, you will also steadily rise to the top of the staircase.
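
For anyone who wants to watch it happen, here is a quick Monte Carlo sketch in Python (the function and variable names are mine, and the staircase is treated as unbounded so the walk isn't cut off at ±500):

```python
import random

def play(steps, rng, choose_game):
    """Flip coins on an unbounded staircase and return the final position.

    choose_game(flip_index) returns 'S' or 'C' for each flip.
    """
    pos = 0
    for i in range(steps):
        if choose_game(i) == 'S':
            p_heads = 0.495                            # slightly bad coin
        else:                                          # game C: coin depends on stair number
            p_heads = 0.095 if pos % 3 == 0 else 0.745
        pos += 1 if rng.random() < p_heads else -1
    return pos

rng = random.Random(1)
n = 200_000
s_only     = play(n, rng, lambda i: 'S')
c_only     = play(n, rng, lambda i: 'C')
random_mix = play(n, rng, lambda i: rng.choice('SC'))
periodic   = play(n, rng, lambda i: 'S' if i % 4 < 2 else 'C')  # SSCC SSCC ...

print(s_only, c_only, random_mix, periodic)
```

Over 200,000 flips the small per-flip drifts dominate the noise, so S alone and C alone finish well below zero while the mixed schedules climb, just as the book claims; the exact counts depend on the seed, but the signs don't.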

Getting the Best of It is my favorite 2+2 book, but these findings seem to fly in the face of that book. Can anyone explain to my intuitive mathematical sense how randomly switching between two losing games can yield winning results?

Nottom
04-16-2004, 03:21 AM
[ QUOTE ]
Getting the Best of It is my favorite 2+2 book, but these findings seem to fly in the face of that book. Can anyone explain to my intuitive mathematical sense how randomly switching between two losing games can yield winning results?

[/ QUOTE ]

It seems pretty intuitive to me. In the first game the "middle" coin is only slightly biased, so it's not going to have a huge effect on your results on any one flip. In the second game the good coin is heavily in your favor, but the fact that you have to flip the bad coin on every third stair more than overcomes your advantage, since you arrive at a multiple of 3 with the good coin more often than you get past it with the bad coin. However, when you mix the two, the advantage from the good coin is amplified because it is used more often than the bad coin. Every time you should be flipping the "bad" coin but instead get to flip the "middle" coin, you pick up a huge advantage over the original game; but when you have to flip the "middle" coin instead of the "good" coin, you are only at a slight disadvantage.

daryn
04-23-2004, 11:47 PM
good analysis, very interesting.

MicroBob
04-25-2004, 07:21 AM
I had read about this game before (I think I was browsing in a bookstore) and found it very interesting as well.
It took me a little thought to figure out how it could be possible. I had a general idea in my head of how it might work, but Nottom articulated it much better than I ever would have been able to.


Somewhere on these fine forums is a David Sklansky essay on various hands ('the problem with simple rankings')....

22 is a slight favorite over AKo
AKo is a favorite over JTs
JTs is a slight favorite over 22

I know it's not the same, but for some reason this little staircase game reminded me of that essay.
For many (poker players and non-poker players alike) it seems rather paradoxical that A can be better than B, B better than C, and C better than A (or something like that)....

Similar to this: if you're told A is about 50-50 heads-up vs. B, C, or D, then you would expect A to win 25% of the time in a 4-way battle with B, C, and D, just based on the only information you have.

But when you substitute the hold'em hands A=22, B=AK, C=QJ, and D=T9, you see that A is not going to win 25% of the time here.

I guess the seemingly paradoxical nature of these, just like the idea that two staircase games you will definitely lose can be combined into a winning game, is what I find rather interesting.

Bozeman
04-25-2004, 12:14 PM
"when you substitute the hold-em hands A=22, B=AK, C=QJ and D=T9, you see that A is not going to win 25% of the time here."

Actually, it's not too far off; it wins ~21%.

MicroBob
04-25-2004, 04:07 PM
Really???
Well, hooray for 22. I would have guessed lower than 21%.

Oh well... guess I didn't make much of a point after all.

No biggie.

pzhon
04-25-2004, 09:41 PM
Coin A: Heads 9.5%
Coin B: Heads 74.5%
Coin C: Heads 49.5%

I have to say that these were bad choices of values. Why not use whole percentages? The example is robust. I'll use the given figures, though.

Every time you flip coin A, you expect to lose .81 steps.
Every time you flip coin B, you expect to gain .49 steps.
Every time you flip coin C, you expect to lose .01 steps.

Though the initial game has you flip coin A on every third step, you flip coin A more frequently than 1/3 of the time. If you were to flip coin A less than .49/(.49+.81) = 37.69% of the time, then you would win on average, and in the long run you would ascend with probability 1. However, the stable distribution is (.3836, .1543, .4621), i.e., your stair's number is a multiple of 3 38.36% of the time, 3n+1 15.43% of the time, and 3n+2 46.21% of the time. Because you flip coin A 38.36% of the time, you lose .0087 steps on average.

When you mix in coin C half of the time, you still have a losing coin (A-or-C) and a winning coin (B-or-C), and their per-flip expectations, -.41 and +.24, are in about the same proportion as before. You now need to flip (A-or-C) at most .24/(.24+.41) = 36.92% of the time to win, and the chance that you toss the losing coin decreases to 34.51%.
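
Those stationary distributions are easy to check numerically. A sketch in Python (power iteration on the position-mod-3 Markov chain; the function name is mine):

```python
def stationary_drift(p_mult3, p_other, iters=10_000):
    """Return (stationary distribution over position mod 3, expected steps per flip)."""
    up = [p_mult3, p_other, p_other]   # up-probability by residue class 0, 1, 2
    pi = [1/3, 1/3, 1/3]
    for _ in range(iters):
        # You reach residue s by stepping up from s-1 or down from s+1.
        pi = [pi[(s - 1) % 3] * up[(s - 1) % 3] +
              pi[(s + 1) % 3] * (1 - up[(s + 1) % 3])
              for s in range(3)]
    drift = sum(pi[s] * (2 * up[s] - 1) for s in range(3))
    return pi, drift

pi_c, drift_c = stationary_drift(0.095, 0.745)   # game C alone
pi_m, drift_m = stationary_drift(0.295, 0.620)   # each coin averaged with coin C
print(pi_c, drift_c)   # ≈ [0.3836, 0.1543, 0.4621], ≈ -0.0087
print(pi_m, drift_m)   # pi_m[0] ≈ 0.3451, drift ≈ +0.0157
```

The mixed-game up-probabilities are just the averages (.095+.495)/2 = .295 and (.745+.495)/2 = .62, and the computation confirms both the 38.36%/34.51% figures and the sign flip of the drift.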

To make this more dramatic, you can replace coin A with one that always loses, and coin B with one that wins 90% of the time. It should be obvious that you can never get past a multiple of 3 in this version of game C alone, but in the mixture you get past multiples of 3 often enough on the flips where you use coin C instead.

These games are similar to the idea of a ratchet.

Here is a poker example: Let's suppose that you are a losing poker player because when you suffer two bad beats in a row, you tilt for the next hour. Let's suppose you have a hobby of playing craps, another losing proposition. If you force yourself to alternate hours playing poker and craps, you may greatly decrease the proportion of the time at the poker table that you are tilting, and it is conceivable that you would become a winning gambler overall. The stairs labelled with a multiple of 3 are analogous to the times you are on tilt for an entire hour.

C M Burns
04-26-2004, 01:28 AM
On that note: I was playing in a game in Vegas, and there was a guy who got a little steamed, took most of his chips, and went to play one hand of blackjack (he won). Then he came back and said he just wanted to blow off some steam. I thought he was a bit crazy, but perhaps he knew exactly what he was doing.

I played with a far less sophisticated calculation on this problem. The way I set it up: if you are on a good step you have a .62 chance to move up, and on a bad step a .29 chance, and since you are on a good step about twice as often overall, you are about .51 to move up. I don't know if that is the correct analysis or not.

And so, is this a good book? Is it mainly interesting theoretical problems like this, or does it talk about direct practical applications?