
View Full Version : Compelling question about a coinflip bet


MatrixMunki
09-04-2004, 02:31 PM
Let's say you pay your friend $1 every time the coin lands heads, and he pays you $1 every time it lands tails. Let's say he will play as long as you want him to, out to infinity. Now your EV is obviously 0, but this seems like an amazing deal to me. Take this deal an infinite number of times and there has to be a streak in which you are up on him for a nice amount! If you have an infinite bankroll, it doesn't matter how much he beats you at first; just wait until your good run comes and you have had an unfair share of luck, then quit. What do you think?

pudley4
09-04-2004, 03:17 PM
It's called "Gambler's Ruin"

Even if you take a 0 EV bet, if you have a finite bankroll eventually you'll lose all your money.

Look here (http://mathworld.wolfram.com/GamblersRuin.html)

sfa420
09-04-2004, 11:41 PM
this is why many people lose money at roulette tables... that sign that shows what color hit last has to be the best invention ever... I mean, other than 0 and 00 you have a 50/50 chance of red or black and it pays 2:1... it sounds good but it really isn't
-sfa420

Dov
09-05-2004, 03:49 PM
It doesn't pay 2:1. It pays 2 for 1, which is the same as 1:1 (even money).

PublickStews
09-07-2004, 04:54 AM
[ QUOTE ]
It's called "Gambler's Ruin"

Even if you take a 0 EV bet, if you have a finite bankroll eventually you'll lose all your money.

Look here (http://mathworld.wolfram.com/GamblersRuin.html)

[/ QUOTE ]

From that site:

[ QUOTE ]
Since casinos have more pennies than their individual patrons, this principle allows casinos to always come out ahead in the long run. And the common practice of playing games with odds skewed in favor of the house makes this outcome just that much quicker.

[/ QUOTE ]

That site implies that it would be profitable provided you have a larger bankroll than your opponent. It says that ONE person will lose all his money, not all people, and that the ratio of the smaller stack of pennies divided by the total pennies in play is the chance the bigger stack has of going broke. That site explicitly says that casinos would make money even with 50/50 odds, and that tilting the odds in their favor only quickens the process.

That doesn't seem to make sense, though. If you have a stack of 198 pennies and your opponent has 2, your chance of going broke in the session is 1%. 99% of the time you will win 2 pennies, and 1% of the time you will lose 198. That's 0 EV, and if you kept doing this (a la a casino offering 50/50 odds), you too would eventually go broke.
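A few lines of arithmetic confirm the 0-EV claim above. This is just a sketch of the fair-game gambler's-ruin formula, plugged in with the penny counts from the post:

```python
# Fair-game gambler's ruin: with stacks of 198 and 2 pennies,
# the big stack goes broke with probability (small stack) / (total stack).
big, small = 198, 2
p_big_broke = small / (big + small)              # 2/200 = 1%
ev_big = (1 - p_big_broke) * small - p_big_broke * big
print(p_big_broke)   # 0.01
print(ev_big)        # 0, up to float rounding: 0.99*2 exactly offsets 0.01*198
```

The rare disaster cancels the frequent small wins exactly, which is the point of the post.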

Louie Landale
09-07-2004, 01:04 PM
If you have an infinite bankroll, then adding to it results in the same infinite bankroll, which isn't any better than your original infinite bankroll. If that's true, talking about EV with such a bankroll is a waste of time.

Infinity + 1 = Infinity. Algebra doesn't work on infinity.

So let's just talk about finite bankrolls. Your strategy will yield a win against your friend in direct proportion to your bankroll. But if you tried this in a zillion different universes all at once, eventually you would lose your entire bankroll and have to quit. It turns out that disaster will net out the same as all the paltry wins you locked up in the other zillion universes.

EV on a fair coin flip for even bets = 0. You cannot add up a bunch of zeros to get a positive number. It's a waste of time no matter how you slice and dice it.

- Louie

TripleH68
09-10-2004, 03:19 AM
[ QUOTE ]
casinos would make money even with 50/50 odds, and that tilting the odds in their favor only quickens the process.

That doesn't seem to make sense though.

[/ QUOTE ]

Bobby Jones once pointed out that most amateur golfers have their career rounds when they start off poorly, say bogey-bogey. Why is this? Because when an amateur starts off well, he often changes his goal. If you go out to shoot a 79 and start off birdie-birdie-par-par... you will likely lower your expectation to a 75. This will put extra pressure on your game. If you start bogey-bogey, you may relax and come into your game mid-round. I know; I once lowered my best round ever by shooting 43-34-77.

So what the hell does that have to do with the casino? Ever seen a guy get hot at the craps table? He brings $500 and turns it into $2,500. A small hit for the giant casino if the guy has any discipline. The problem is that most amateur gamblers will now change their expectations. Suddenly a $2,000 gain is not good enough, and the casino is open 24 hours a day. The end of the story is the poor sap has to borrow cab fare to get home...

Have a nice day.

Disclosure: I own stock in casinos.

Dov
09-10-2004, 10:06 AM
[ QUOTE ]
So what the hell does that have to do with the casino? Ever seen a guy get hot at the craps table? He brings $500 and turns it into $2,500. A small hit for the giant casino if the guy has any discipline. Problem is most amateur gamblers will now change his/her expectations. Suddenly a $2,000 gain is not good enough and the casino is open 24-hours-a-day. The end of the story is the poor sap has to borrow cab fare to get home...

[/ QUOTE ]

This is really not too different from someone who is very short-stacked in a tourney and sitting at a table where people have him comfortably covered. (NL)

He will have to get very lucky to win the tourney, because even if he doubles or triples up, he can still lose his entire stack in one hand.

Paluka
09-10-2004, 11:13 AM
[ QUOTE ]
Let's say you pay your friend $1 every time the coin lands heads, and he pays you $1 every time it lands tails. Let's say he will play as long as you want him to, out to infinity. Now your EV is obviously 0, but this seems like an amazing deal to me. Take this deal an infinite number of times and there has to be a streak in which you are up on him for a nice amount! If you have an infinite bankroll, it doesn't matter how much he beats you at first; just wait until your good run comes and you have had an unfair share of luck, then quit. What do you think?

[/ QUOTE ]

Explain to me how this game is better for you than it is for him.

Nottom
09-10-2004, 07:20 PM
Let's say for the sake of argument that instead of an infinite bankroll, you have $100 and your opponent has $25. You agree to play until one of you goes broke.

You should have a tremendous advantage in this game.
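The "advantage" here is in how often you win, not in EV. A sketch using the standard fair-game gambler's-ruin formula (your chance of felting him is your stake divided by the total in play), with the $100-vs-$25 numbers above:

```python
# $100 vs $25, fair coin, even-money bets, play until someone is broke.
you, opp = 100, 25
p_you_win_all = you / (you + opp)    # 0.8: you bust him 80% of the time
ev = p_you_win_all * opp - (1 - p_you_win_all) * you
print(p_you_win_all)   # 0.8
print(ev)              # 0, up to float rounding: the 20% disasters cost
                       # exactly what the 80% small wins earn
```

So the big stack wins four sessions out of five, yet the game is still dead even.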

Cerril
09-10-2004, 08:56 PM
There is definitely a good conclusion you can come up with here. If you're the one who gets to choose when to stop and HE has an infinite bankroll, then you can limit the risk to the point where you have a very good shot at making money but the money you make will be nearly nothing.

What you would need to do is figure out the deviations and whatnot and basically come up with a % profit that you can reasonably hope to attain before going broke - it may be a very small number.

In fact, what you're doing is taking a 0 EV gamble no matter how you get there (someone else can do the math for other numbers, I'm sure, but I'll give a couple small examples).

+1 before -1 : 50% win $1, 50% lose $1 (duh)
+1 before -2 : 50% you'll get there immediately, 25% you'll fail immediately, and the remaining 25% of the time you're back where you started. From there, 50% of that (12.5%) you'll get there, 25% of that (6.25%) you'll fail, and the rest you're back where you started again... sum it all up and the limit is, unsurprisingly, 2/3 win $1, 1/3 lose $2.

So no matter how you slice it you're risking as much as he is, you can just pick the amount of risk you prefer. If you have $100, then whatever dollar value you decide to win (make your stop-gain), that determines your % chance of going broke. Want $50? Then 33% of the time you'll lose it all. Double up? Half the time he will first. But if all you want is lunch ($10), then 90% of the time he'll be paying. Of course the remaining 10% you'll be buying him ten lunches!
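Those stop-gain numbers all come from the same fair-game ruin formula: starting with bankroll B and quitting at +T, you go broke with probability T/(B+T). A quick sketch with the $100 bankroll from the post:

```python
# Chance of busting a $100 bankroll before hitting each stop-gain target,
# fair even-money flips. p_broke = target / (bankroll + target).
bankroll = 100
for target in (10, 50, 100):
    p_broke = target / (bankroll + target)
    print(target, round(p_broke, 3))
# prints: 10 0.091  (the "10%" lunch case, more precisely 1/11)
#         50 0.333
#         100 0.5
```

Each case is still exactly 0 EV; the stop-gain only trades win frequency against disaster size.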

bigpooch
09-11-2004, 01:04 AM
Well, anyone who knows about this symmetric random walk knows that if you play forever, you will reach the origin (net gain of zero) infinitely often (similarly, for any finite point!). Suppose you pick some big $ amount you are willing to get to (it could be $1 billion!) and assume that you will definitely get paid (somewhat unrealistic!) if you decide to quit when you reach that specified amount. Well, on the one hand, you will reach that point with absolute certainty if you play long enough (likely on the order of quintillions of coin flips), but it will be an extremely long time.

In fact, you could pick any finite amount no matter how large, and you will eventually get there with certainty. It is also certain that if you play long enough, you will also get to the negative of that same amount! Clearly, on each flip you have zero EV, but you do have the option of quitting after "running well"! :)

Of course, if you must settle up after each coin flip, that's a different kettle of fish! :)


MORE ADVANCED QUESTION (answer known)

Suppose you have not just you and your friend playing this coinflipping game, but another pair doing the same (with another coin, of course!), with the proviso that the coin flips for each pair occur at the same time. Also, the four of you decide to play forever. Is it true that the four of you will break exactly even simultaneously infinitely often?

Suppose instead there are now three pairs of players. Will the six players break even simultaneously infinitely often?

I suggested this question to a friend (who is not familiar with the theorem that exactly answers the question), but I know that it doesn't take much more than calculus to find the answer (Polya's name comes to mind!).

(Answer to be posted later!)

pzhon
09-11-2004, 11:38 AM
[ QUOTE ]

MORE ADVANCED QUESTION (answer known)

Suppose you have not just you and your friend playing this coinflipping game, but another pair doing the same ... Is it true that the four of you will break exactly even simultaneously infinitely often?

Suppose instead there are now three pairs of players. Will the six players break even simultaneously infinitely often?


[/ QUOTE ]
Random walks in dimensions 1 and 2 return to the origin with probability 1, hence infinitely often with probability 1. Random walks in higher dimensions don't return to the origin with probability 1, so the probability of returning infinitely often is 0. Here is a simple argument:

• The average number of returns to the origin is the sum of the probabilities of returning at step 2, 4, 6, etc.

• The probability that a particular pair will be even after 2n coin tosses is roughly 1/sqrt(n pi). (The exact value is (2n choose n)/4^n, which can be estimated accurately by Stirling's formula, n! ~ sqrt(2 pi n) (n/e)^n.)

• If there are k pairs, the probability they are all even after 2n tosses is roughly (n pi)^(-k/2).

• The sum of 1/n^(k/2) is finite when k/2 is strictly greater than 1, and infinite when k/2 is less than or equal to 1.

The average number of returns to the origin can be used to determine the probability of returning. If the probability of returning is p, then the average number of returns is p + p^2 + p^3 + ... = p/(1-p). If the average number of returns is r, then the probability of returning at least once is r/(1+r). In dimension 3, the average number of returns for this random walk is -1 + pi/gamma(3/4)^4 ~ 0.3932, so the probability of returning is 1 - gamma(3/4)^4/pi ~ 0.2822. I don't think it is easy to get this directly.

Another fun fact: Although the probability of returning to the origin is 1 in dimensions 1 and 2, the expected number of steps before you return is infinite. However, under the condition that you do return to the origin in dimension 5 or above, the expected number of steps before you return is finite. I'm not sure what happens in dimensions 3 and 4.
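The key estimate in the argument above, that a single pair is exactly even after 2n flips with probability (2n choose n)/4^n ~ 1/sqrt(n pi), is easy to check numerically. A minimal sketch (the n values are arbitrary):

```python
from math import comb, pi, sqrt

# P(one pair is exactly even after 2n flips) = C(2n, n) / 4^n,
# which Stirling's formula approximates as 1/sqrt(n*pi).
for n in (10, 100, 1000):
    exact = comb(2 * n, n) / 4 ** n
    approx = 1 / sqrt(n * pi)
    print(n, exact, approx)   # the two columns agree ever more closely as n grows
```

For k pairs the joint probability is this raised to the k-th power, roughly (n pi)^(-k/2), and summing over n diverges for k <= 2 (infinitely many simultaneous returns) but converges for k >= 3, which is exactly the split between the two- and three-pair questions.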

Bez
09-11-2004, 08:18 PM
I think yes. I didn't understand what bigpooch was on about. If the answer is no, could someone explain it in words?