Two Plus Two Older Archives

Two Plus Two Older Archives (http://archives2.twoplustwo.com/index.php)
-   Science, Math, and Philosophy (http://archives2.twoplustwo.com/forumdisplay.php?f=45)
-   -   Am I stupid? I can't fit these two concepts into any type of harmony. (http://archives2.twoplustwo.com/showthread.php?t=320870)

Alex/Mugaaz 08-23-2005 01:54 AM

Am I stupid? I can't fit these two concepts into any type of harmony.
 
A: Infinity
B: Chance

Basic, maybe flawed question: Let's say you have a bankroll of 1 trillion BB's, and your winrate is 3bb/100. If you played an infinite amount of time, would you eventually bust out?

Since it is always possible that you could lose X hands, will this eventually happen since the time range is infinity? How does this concept mesh with the fact that you should always be up in the long run since you have an edge?

PairTheBoard 08-23-2005 02:12 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
I think this is tricky but I don't remember exactly how to handle it. It should be easy for the practicing mathematicians around here.

It's true that an event with low probability will occur infinitely many times given an infinite sequence of trials. However, a sequence of events whose non-zero probabilities become smaller and smaller might not occur infinitely many times given an infinite sequence of trials. Your problem has that flavor because as time goes on and you build up more winnings, your element of ruin gets smaller.

PairTheBoard

daryn 08-23-2005 02:27 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
this one seems easy. your bankroll is finite, $1 trillion you said. so of course probability dictates that you will eventually go bust.

pokerplayer28 08-23-2005 02:31 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
I am stupid, but here are my thoughts.

Since the risk of ruin is greater than 0 and will always be greater than 0, if you play for infinity it is almost certain you will go broke, but even if you play for infinity you will never be 100% certain you will go bust. So it is likely you'll go bust, but not certain.

This is just what popped into my head; I would really like to hear from someone qualified.

Darryl_P 08-23-2005 04:16 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
The best way to model this is with a random walk. You start at 0 and then you take a step up to +1 with probability p>.5 and a step down to -1 with a probability q=1-p. You repeatedly take steps either up or down in this manner until you either reach + 1 trillion or - 1 trillion.

There is a formula which says you will reach +1 trillion first with a probability of [(q/p)^(1 trillion) - 1] / [(q/p)^(2 trillion) - 1], which is very, very close to 1 in the example described.

The formula and its development can be found here

To answer the question of EVER reaching -1 trillion (i.e. going broke), simply consider a repeated application of doubling your money before losing it, first with 1 trillion, then with 2 trillion, then 4 trillion, etc.

Your probability of winning this infinite series of trials is approximated by N * N^(1/2) * N^(1/4) * N^(1/8) * ..., where N is your probability of winning your first trial, i.e. a number just very slightly less than 1.

This infinite product of probabilities (all less than 1) is equal to

N^(1 + 1/2 + 1/4 + 1/8 + ...) = N^2

which is still very, very close to 1.

So while you are not guaranteed to win after an infinite amount of time, your probability of doing so is very, very high.

The key to understanding why the infinite product of numbers less than 1 can still remain very close to 1 in this case is to understand that the infinite series of fractions 1 + 1/2 + 1/4 + 1/8 + ... equals 2.
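A minimal Python sketch of the two calculations above (the barrier formula and the doubling product); the up-probability p = 0.51 and the 200-step target are illustrative stand-ins for the trillion-step version, chosen only so the numbers stay visible.

[ CODE ]
# Gambler's ruin sketch for the +/-1-step random walk described above.
# p = 0.51 and a = 200 are illustrative values, not numbers from the thread.

def reach_plus_a_first(p: float, a: int) -> float:
    """P(hit +a before -a) for a walk starting at 0 with up-probability p."""
    q = 1.0 - p
    r = q / p                       # < 1 whenever we have an edge (p > 0.5)
    return (r**a - 1.0) / (r**(2 * a) - 1.0)

p, a = 0.51, 200
n1 = reach_plus_a_first(p, a)       # probability of the first "double up"
print(f"P(double before bust)      = {n1:.12f}")

# Doubling argument: win the first trial with prob ~N, the next with ~N^(1/2),
# then ~N^(1/4), ...  The infinite product is N^(1 + 1/2 + 1/4 + ...) = N^2.
print(f"Approx P(never going bust) = {n1**2:.12f}")
[/ CODE ]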

Alex/Mugaaz 08-23-2005 04:29 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
I will admit that I am not able to fully understand what you are saying; however, in trying to decipher it, it looks like it doesn't apply since this is not a random walk? Am I incorrect?

Darryl_P 08-23-2005 04:56 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
Actually it is, because you can define each step up as reaching a certain goal, say +1000 BB, and each step down as reaching -1000 BB. Since you have an edge, your probability of reaching +1000 BB before reaching -1000 BB is greater than 50%.

Of course, if you do it this way then your ultimate goal of 1 trillion BB becomes 1 billion "steps", which is not exactly the same number, but the principle remains the same and the result will still be that you have a greater than 99.999999999% (tack on a few more 9s if you like) chance of winning.

Alex/Mugaaz 08-23-2005 05:30 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
If what you're saying is that whether you bust out or not is not certain either way, even over infinity, then I agree. I think there is a chance of busting out, but that it is less than 100% and greater than 0%. Also, it seems to follow that the longer this game goes on, the less chance you have of busting out.

fnord_too 08-23-2005 09:37 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
[ QUOTE ]
this one seems easy. your bankroll is finite, $1 trillion you said. so of course probability dictates that you will eventually go bust.

[/ QUOTE ]

I disagree. This is a random walk about a positively sloped line. If he were even money, then I would say he will go bust, since eventually he will hit a bad spot long enough to deplete any arbitrarily large bankroll, but here the length of the bad streak needed is increasing, since he has positive expectation.

I have a picture in my head that I am not expressing well. Let me try it this way: Say this were flipping a coin where hero is getting 1.1:1 on his money. As time wears on, he will need a bad streak considerably longer than his longest good streak to go bust. To frame it slightly differently, and possibly sophistically, over any set of trials, cancel the wins and losses. In this case, you cancel 11 losses with 10 wins. Then you are left with pure loss or gain. For him to go bust in this coin example, his longest losing streak needs to be very much longer than his longest winning streak to date. (I know he can have multiple losing streaks, but you can just extend the period over which you cancel wins and losses there.)

Is this making any sense? Basically, if you take a random walk whose expectation has a nonzero slope, you do not have to eventually get to zero if you start sufficiently far from zero on the side that your slope is pointing.
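A small Monte Carlo sketch of that 1.1:1 coin game; the 50-unit starting bankroll, the 10,000-flip horizon, and the trial count are illustrative choices, not numbers from the thread.

[ CODE ]
import random

# Fair coin, bet 1 unit per flip, win 1.1 units on heads (EV = +0.05/flip).
# Count how often the bankroll is ever depleted within the horizon.

def busts(bankroll: float, flips: int) -> bool:
    for _ in range(flips):
        bankroll += 1.1 if random.random() < 0.5 else -1.0
        if bankroll < 1.0:          # can no longer cover the next 1-unit bet
            return True
    return False

trials = 500
busted = sum(busts(bankroll=50.0, flips=10_000) for _ in range(trials))
print(f"busted in {busted}/{trials} trials ({busted / trials:.1%}) "
      f"despite a +0.05/flip edge")
[/ CODE ]

With these numbers the walk busts only on the order of 1% of the time, even though a long enough losing streak is always possible.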

fnord_too 08-23-2005 09:41 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
[ QUOTE ]
If what you're saying is that whether you bust out or not is not certain either way, even over infinity, then I agree. I think there is a chance of busting out, but that it is less than 100% and greater than 0%. Also, it seems to follow that the longer this game goes on, the less chance you have of busting out.

[/ QUOTE ]

Yeah, that's what I was trying to say. And since you are starting with one trillion BB, I think that your chance of going bust is very close to 0.

08-23-2005 09:52 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
These types of questions may seem like interesting thought exercises, but really are not. For one, the concept of "infinite time" already makes a supposition that is not reality-based (certainly not for organic poker-playing human lifeforms), so projecting any human activity to infinity is already a flawed analysis.

Second, when one says "infinity," what are you really saying? It can't be logically handled because it's so abstract that common concepts fail. If some event is truly infinite, then you would think that every possible outcome will eventually be realized simply because it has non-zero probability over an infinite period. Thus, if your bankroll is finite, there is some non-zero probability that you bust and an infinite amount of time in which to realize that sequence. But as time goes on, that probability gets closer and closer to zero, since your finite starting point is generally increasing. So there's an irreconcilable paradox: every sequence should eventually occur, yet the sequence becomes more and more improbable as you go toward infinity.

It would be an easier question if you had an infinite number of starting bankrolls; then yes, eventually a number of them would bust. The nature of "infinity" discussions will always have these paradoxes when a single trial is considered.

PairTheBoard 08-23-2005 10:25 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
[ QUOTE ]
kidluckee --

when one says "infinity," what are you really saying? It can't be logically handled because it's so abstract that common concepts fail. If some event is truly infinite, then you would think that every possible outcome will eventually be realized simply because it has non-zero probability over an infinite period.

[/ QUOTE ]

It can be handled mathematically. We are not looking at a Fixed event with nonzero probability and with infinite trials. If that were the case, you would be right in saying the event is bound to happen eventually. But in this case we are looking at a sequence of different events, each one with smaller and smaller probability. We Can handle this situation logically. We have developed mathematics to do so, and in some cases of this kind - such as this one - it is Not true that one of the events must eventually occur. In this case the sequence of events is "Going broke before doubling". After each trial where the event does Not happen, the chances of the next event in the sequence are smaller.

It is useful to look at the case of infinity for finding a bound on the probability. If we compute the probability of not going broke over infinite time to be, say, 99.9%, then that's a lower bound for the probabilities we're interested in for finite times. For whatever finite time you're really interested in, the probability of not going broke is therefore greater than 99.9%.
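A rough numeric illustration of that shrinking-probability point: take E_k to be "go broke during the k-th doubling." The single-unit bust chance r = 0.9 and the 100-unit starting bankroll below are assumptions made purely for the sketch, with each successful doubling taken to square the remaining bust chance.

[ CODE ]
# Product of conditional survival probabilities over successive doublings.
# r = 0.9 (bust chance of a 1-unit bankroll) and B = 100 are illustrative.

r, B = 0.9, 100
p_survive = 1.0
p_fail_k = r**B                 # chance of busting before the 1st double-up
for k in range(60):             # 60 terms is plenty; later terms underflow to 0
    p_survive *= (1.0 - p_fail_k)
    p_fail_k = p_fail_k**2      # doubling the bankroll squares the bust chance
print(f"P(some E_k ever occurs) = {1.0 - p_survive:.6e}")
print(f"P(no E_k ever occurs)   = {p_survive:.10f}")
[/ CODE ]

Even with infinitely many chances, the total probability that some E_k occurs stays tiny, because the terms shrink fast enough.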

We are in the realm of mathematics here. Vague notions don't carry much weight.

PairTheBoard

08-23-2005 10:42 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
As long as your winrate is greater than 0 and finite, and your bankroll is greater than 0 and finite, if you played an infinite amount of time your chances of busting will always be greater than 0% and less than 100%.

Even if your bankroll is 1 BB and your winrate is .001bb/100, there is a chance you will never go broke.

And no matter how big your bankroll is, you always have a chance of going broke.

Almost sounds contradictory, but it's not.

08-23-2005 10:44 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
[ QUOTE ]

It is useful to look at the case of infinity for finding a bound on the probability. If we compute the probability of not going broke over infinite time to be, say, 99.9%, then that's a lower bound for the probabilities we're interested in for finite times. For whatever finite time you're really interested in, the probability of not going broke is therefore greater than 99.9%.

[/ QUOTE ]

Good point.

Alex/Mugaaz 08-23-2005 10:54 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
Ok since most seem to agree the answer is greater than 0 and less than 100%, what is the actual answer, and how can it be calculated?

BruceZ 08-23-2005 11:43 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
[ QUOTE ]
A: Infinity
B: Chance

Basic, maybe flawed question: Let's say you have a bankroll of 1 trillion BB's, and your winrate is 3bb/100. If you played an infinite amount of time, would you eventually bust out?

Since it is always possible that you could lose X hands, will this eventually happen since the time range is infinity? How does this concept mesh with the fact that you should always be up in the long run since you have an edge?

[/ QUOTE ]

If you have an edge, then your probability of going bust is always < 1, no matter how low your win rate, or how small your bankroll. This probability, called the risk of ruin, depends on your win rate, your standard deviation, and your bankroll, and it can be computed by the formula in this thread for games like blackjack and poker. The derivation can be found here.

While it is true that you are guaranteed to eventually suffer a downswing of X dollars for X arbitrarily large, these downswings do not cause you to go bust because by the time they occur, your bankroll will have grown large enough to absorb them. Your bankroll grows linearly with the number of hands played, while the likelihood of a downswing of a given size depends on your standard deviation, and this increases as the square root of the number of hands played. Hence the growth of your bankroll outpaces the frequency and size of the negative swings.

When considering the risk of ruin, many people have the misconception that it makes a big difference whether you play forever vs. only a few hundred hours. In fact, for a significant winner, risk of ruin is a short-term phenomenon. If he doesn't go broke in the first few hundred hours, chances are he never will. This is because once he doubles his bankroll, his risk of ruin becomes squared, e.g. 1% becomes 0.01% since going bust now requires him to lose the 1% bankroll twice. So his risk of ruin for playing forever is essentially the same as for playing only a few hundred hours. This is for a player who reinvests all his winnings in his bankroll. A player who spends all his winnings will go broke once he hits a big enough losing streak. How long that takes depends on the size of his initial bankroll, but the probability that he goes broke approaches 1 as he plays to infinity.
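A minimal sketch using one common approximation, ROR ~ exp(-2 * WR * B / sigma^2) (not necessarily the exact formula linked above): the 3 bb/100 win rate comes from the original post, while sigma = 15 bb/100 and the bankroll sizes are assumed for illustration.

[ CODE ]
import math

# Risk-of-ruin sketch.  WR and sigma must be measured over the same unit of
# play (here, per 100 hands, in big bets); the bankroll is also in big bets.

def risk_of_ruin(win_rate: float, sigma: float, bankroll: float) -> float:
    """Approximate probability of ever losing the bankroll, given a positive edge."""
    return math.exp(-2.0 * win_rate * bankroll / sigma**2)

wr, sigma = 3.0, 15.0                       # assumed: 3 bb/100 win rate, sd 15 bb/100
for bankroll in (300.0, 600.0, 1200.0):
    print(f"bankroll {bankroll:6.0f} BB -> ROR = {risk_of_ruin(wr, sigma, bankroll):.6f}")

# Doubling the bankroll squares the ROR, as noted in the post above:
print(risk_of_ruin(wr, sigma, 600.0), risk_of_ruin(wr, sigma, 300.0) ** 2)
[/ CODE ]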

Darryl_P 08-23-2005 11:45 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
Good grief! I already gave you the answer. If you want to hear something more pleasing I recommend visiting a psychic or an astrologist because if you ask any mathematician his answer will be the same as mine.

FishAndChips 08-23-2005 07:37 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
Let's not forget in our analysis that while our bankroll is finite, so too is the total amount of money that can be won playing poker. Whether that's all the money in the world, or just a fraction of it, it definitely changes some conclusions.

If the amount of money that could be won in the game were infinite, you could eventually hit some astronomical bad-luck streak with a trillion-bet (or whatever our current bankroll is) downswing, etc. However, you must compare the risk of ruin to the chance of winning every sweet penny.

Just imagine what would happen if you won "all the money in the world."

PairTheBoard 08-24-2005 01:13 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
[ QUOTE ]
FishAndChips --

Just imagine what would happen if you won "all the money in the world."

[/ QUOTE ]

I think I would keep my 5 year old Mustang because I like it and it only has 40,000 miles on it. But I'd probably get a bigger computer screen.

PairTheBoard

08-24-2005 04:40 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
This is a very good answer for a limit game, but if you are playing in a no-limit game against players who have your $1 trillion covered, the probability of you busting out approaches 1 as the number of trials approaches infinity.

In this situation, it is possible to bust out in a single hand. You're unlikely to go all in, but it will happen eventually. Most of the times when you go all in, you will be a large favorite but not a lock. If you change your strategy such that you will only go all in on the river with the nuts, you will be beaten very easily. That strategy can very easily be defeated by an opponent who pushes preflop on every hand--even if you had AA, you'd have to fold, because you'd only be a 9:1 favorite.

Alex/Mugaaz 08-24-2005 05:12 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
You misunderstood what I was trying to say. I wasn't saying you were wrong, I was asking you questions about the answer.

Darryl_P 08-24-2005 05:36 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
If my opponents played in a way that they went all in any more often than, say, once in a million hands, then the assumption of my having a 3 BB/100 advantage would be false...it would be much greater than that.

For me to have only a 3BB/100 advantage we must rule out extremely high variance plays on the part of my opponent since those would lose too much EV.

And when you bring down the variance to "normal" levels, then the random walk model can be used with high precision even in NLH.

Darryl_P 08-24-2005 05:42 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
OK, no problem, sorry for getting edgy on you there.

MtDon 08-24-2005 05:42 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
You didn't mention who your opponent(s) would be. If you played against a finite number of opponents, who each had a finite number of chips, then it is possible for you to win, because someone would have to win all the chips.

If you play against either a finite number of players who have an infinite number of chips, or a potentially infinite number of players with a finite (or infinite) number of chips each, then you will eventually lose all your chips. This is, of course, unless all the players are so bad that they always throw their hands away, whatever they have and whatever the board cards are - but this doesn't fit the premise that you have a 3bb/100 advantage in the game.

Eventually, you will run into a run of cards where you are blinded off, if you don't lose all your money before that.

If you play for an infinite length of time, all possible finite runs of cards will be dealt an infinite number of times. Since you will always have a finite number of chips, you will have to run into a run of cards which will break you.

For example: You will run into these cards for as long a stretch as needed. You have AsAc, the BB has KsKc. Everyone else has nothing and folds preflop.

Flop: AdKd2h
Turn: 2d
River: Kh

Your full house will lose to his quads every time. These same cards will be repeated, consecutively or in any order needed, as many times as necessary for you to lose all your chips.

Having an edge doesn't protect you from a bad run of cards.

The only type of edge that would protect you from a bad run of cards is one that was due to all your opponents playing so badly that, for example, they would fold every time you bet or raised, so that you could win every hand by betting or raising. That is, they play so badly that there would be a strategy that would allow you to win every hand you played.

David Sklansky 08-24-2005 06:14 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
"If you play for an infinite lenght of time, all possible finite runs of cards will be dealt an infinite number of times. Since you will always have an finite number of chips, you will have to run into a run of cards which will break you."

Wrong

Darryl_P 08-24-2005 06:20 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
[ QUOTE ]
If you play against either a finite number of players who have an infinite number of chips, or a potentially infinite number of players with a finite (or infinite) number of chips each, then you will eventually lose all your chips.

[/ QUOTE ]

This is simply not true. The only assumptions you need are the 3BB/100 advantage you have and the fact that you are at least an average player (something which IMO can be assumed from the way in which the question was posed).

That 3BB/100 advantage automatically limits the variance with which your opponents can play because there is a level above which your EV necessarily becomes higher with nothing more than average play.

This limiting variance will still be very small compared to your bankroll, so you will still be able to define an amount, say a million BB, which you will have a greater than 50% chance of doubling before losing. And here the random walk model applies, so the probability of EVER going broke is very small indeed.

The trouble with your reasoning is that whatever bad run you mention (say 1 trillion hands in a row of quads over boat, which of course must happen eventually), the expected amount of money I will have ground out from my 3BB/100 advantage will cover it trillions of times over, so it is no threat to my bankroll.

Of course such a run CAN happen before I've managed to grind out the cover for it, but that has a probability of less than 0.0000000001% of happening, as shown by the random walk model.

spaminator101 08-24-2005 08:36 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
if you had an infinite # of chips you could not go broke

Lexander 08-24-2005 09:31 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
Question for the probabilist, since my Prob Theory course hasn't covered this yet (we are still at the beginning discussing Borel sets and sigma-fields, so allow some room for utter confusion in your answer), though I have a year of study in statistics.

Is this a matter of Convergence in Probability, or a matter of Almost Sure Convergence, or something entirely different?

Thanks.

Darryl_P 08-25-2005 03:38 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
I'd say those are different because they examine how a sequence of random variables converges to another random variable. Here we are only interested in a single probability.

pif 08-25-2005 05:57 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
if you have an edge then you should use the Kelly criterion and you will never bust out.

the idea is to bet a percentage of your bankroll (and not a fixed amount).
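A minimal sketch of the Kelly idea for an even-money bet; the 52% win probability is an assumed illustrative value, not a figure from the thread. Because the stake is always a fraction of the current bankroll, the bankroll can shrink but never hits exactly zero.

[ CODE ]
import math

# Expected log-growth per even-money bet when staking a fraction f of the
# bankroll; the Kelly fraction for this bet is f* = 2p - 1.

def growth_rate(f: float, p: float) -> float:
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

p = 0.52                         # assumed edge for illustration
f_kelly = 2 * p - 1
for f in (0.5 * f_kelly, f_kelly, 2 * f_kelly, 3 * f_kelly):
    print(f"f = {f:.3f}  ->  growth = {growth_rate(f, p):+.7f} per bet")
[/ CODE ]

The growth rate peaks at the Kelly fraction, falls to about zero at twice Kelly, and is negative beyond that.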

PairTheBoard 08-25-2005 06:00 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
[ QUOTE ]
if you have an edge then you should use the Kelly criterion and you will never bust out.

the idea is to bet a percentage of your bankroll (and not a fixed amount).

[/ QUOTE ]

Good point!

PairTheBoard

MtDon 08-26-2005 01:34 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
[ QUOTE ]
"If you play for an infinite lenght of time, all possible finite runs of cards will be dealt an infinite number of times. Since you will always have an finite number of chips, you will have to run into a run of cards which will break you."

Wrong

[/ QUOTE ]

Why?

BillC 08-26-2005 12:22 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
"If you have an edge, then your probability of going bust is always < 1, no matter how low your win rate, or how small your bankroll. This probability, called the risk of ruin, depends on your win rate, your standard deviation, and your bankroll, and it can be computed by the formula in this thread for games like blackjack and poker. "

This assumes that the bet size is "small" relative to the bankroll. If I bet my entire bankroll on each trial, my ROR is 1. So what happens if you bet, say, 10% of your starting bank?

PairTheBoard 08-26-2005 12:54 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
[ QUOTE ]
[ QUOTE ]
"If you play for an infinite lenght of time, all possible finite runs of cards will be dealt an infinite number of times. Since you will always have an finite number of chips, you will have to run into a run of cards which will break you."

Wrong

[/ QUOTE ]

Why?

[/ QUOTE ]

What will happen with non-zero probability is that for any long bad streak you want to identify, it will always happen when your bankroll has grown large enough to sustain it.

PairTheBoard

BillC 08-26-2005 01:22 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
What I meant was that the usual ROR formula assumes small bet sizes and a relatively unskewed payoff distribution.

For highly skewed games such as video poker (and I think most poker tournaments) the ROR calculation is not so nice;
see this article

pokerplayer28 08-26-2005 02:32 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
[ QUOTE ]
[ QUOTE ]
"If you play for an infinite lenght of time, all possible finite runs of cards will be dealt an infinite number of times. Since you will always have an finite number of chips, you will have to run into a run of cards which will break you."

Wrong

[/ QUOTE ]

Why?

[/ QUOTE ]

when does your finite run of cards end?

08-26-2005 04:42 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
[ QUOTE ]
And when you bring down the variance to "normal" levels, then the random walk model can be used with high precision even in NLH.

[/ QUOTE ]

Well, it depends on our assumptions about how the game is set up. If we assume that you are playing against opponents with infinitely large bankrolls, then the random walk model no longer holds. On any hand, there is a very small chance that you will lose your entire stack. Your winnings at any rate of BB/100 will not eliminate that chance, and therefore you will bust out eventually.

If you play against opponents who add or subtract from their stacks so they're at 1 trillion BB at the beginning of each new hand, then you're correct, assuming both you and your opponents play to maximize EV, rather than to maximize or minimize the chance you will bust.

David Sklansky 08-26-2005 08:04 PM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
"What will happen with non-zero probability is that for any long bad streak you want to identify, it will always happen when your bankroll has grown large enough to sustain it."

PairTheBoard

Who wrote this? I thought we had software preventing more than one person from using the same name.

BruceZ 08-27-2005 02:47 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmony.
 
[ QUOTE ]
What I meant was that the usual ROR formula assumes small bet sizes and a relatively unskewed payoff distribution.

[/ QUOTE ]

The initial bankroll can be a single bet. In my derivation of the ROR formula, which is essentially Sileo's derivation, the ROR for a bankroll of size B is derived as the ROR for a 1-bet bankroll raised to the B power. This assumes that the winnings are reinvested in the bankroll, and that your win rate and standard deviation do not change. My comments above assumed that we are maintaining a constant win rate.

If you increase your betting limits as your bankroll grows, then you will go broke with probability 1 if you continue to bet more than twice the Kelly fraction of your bankroll, where the Kelly fraction is approximately EV/sigma^2.
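A small sketch of the "raised to the B power" relation, using the textbook +/-1 coin-flip walk against an infinitely rich opponent; p = 0.55 is an illustrative value, and the second column just shows how close this sits to the exp(-2*WR*B/sigma^2) approximation sketched earlier.

[ CODE ]
import math

# For a +/-1 walk with win probability p > 0.5, starting with B units against
# an infinitely rich opponent, the ruin probability is (q/p)**B, i.e. the
# 1-unit ruin probability raised to the B power.  p = 0.55 is illustrative.

p = 0.55
q = 1.0 - p
r1 = q / p                                   # ruin probability, 1-unit bankroll
mu, var = p - q, 1.0 - (p - q) ** 2          # per-flip mean and variance

for B in (1, 5, 20, 100):
    exact = r1 ** B
    approx = math.exp(-2.0 * mu * B / var)   # exp(-2*EV*B/sigma^2) form
    print(f"B = {B:3d}:  (q/p)^B = {exact:.3e}   exp(-2*mu*B/var) = {approx:.3e}")
[/ CODE ]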


[ QUOTE ]
For highly skewed games such as video poker (and I think most poker tournaments) the ROR calculation is not so nice;
see this article

[/ QUOTE ]

The ROR formula that I linked to is derived by assuming that the game with skewed payoffs can be modeled as a coin flip game with the same win rate and variance via the central limit theorem. There may be games for which this model breaks down, but it works well for blackjack and poker, and any game for which the risk of ruin depends primarily on the win rate and standard deviation, and very little on the higher moments.

PairTheBoard 08-27-2005 04:00 AM

Re: Am I stupid? I can't fit these two concepts into any type of harmo
 
[ QUOTE ]
"What will happen with non-zero probability is that for any long bad streak you want to identify, it will always happen when your bankroll has grown large enough to sustain it."

PairTheBoard

Who wrote this? I thought we had software preventing more than one person from using the same name.

[/ QUOTE ]

Actually, I let my 6 year old niece make that post. She got it from her 1st grade arithmetic book. She tells me all the kids are into this stuff these days because of all the poker shows on TV.

PairTheBoard

