PDA

View Full Version : Irieguy's "Zero Sum Theory"


John Paul
04-11-2005, 08:44 PM
This post grew out of Irieguy's posts and ideas about the abilities of players at different levels. He posited that at any one time there will be a number of players at a given level who do not play well enough to win at that level: some are folks trying to move up, and others are folks falling down from a higher limit. I was interested in this idea, so I have tried to put it in mathematical terms as follows:

2 Limits
Assume that there is a poker site with one game (NL SnGs or whatever) and two limits. Assume no one ever quits playing. We will call the two limits A (the lower limit) and B (the higher limit). All of the players would like to be winning at limit B, so if they do well at limit A they will move up. However, if they do poorly at B they will drop back down. So let's assume that the top U percent (expressed as a decimal) of A-limit players will move up from A to B, and the bottom D percent (expressed as a decimal) of B-limit players will move down from B to A. So - what proportion of players are staying at A, moving up from A to B, falling from B, or staying at B?

We can figure out what these proportions will be assuming that the system runs long enough to reach an equilibrium.

The change in A will be:
-UA+DB
and the change in B will be
+UA-DB

we also know that A+B=1 (i.e. every player is either playing A or B)

So, at equilibrium, UA = DB; that is, the same number of players move up as move down. A little algebra (using A+B=1) will tell you that the proportion of players in A = D/(U+D) and the proportion in B is 1-[D/(U+D)]. (For brevity, I am only outlining the math here.)

So, let's put in some numbers for U and D. Let's assume the top 25% of the A players at any time will try to move up, and the bottom 50% of the B players will drop back down.

Then the proportion playing at the two limits will be:
A = .5/(.25+.5) = .667
B = 1 - .667 = .333
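The equilibrium above is easy to check numerically (U and D here are the post's illustrative values, not measured rates):

```python
# Two-limit equilibrium: at steady state UA = DB, with A + B = 1.
# U and D are the illustrative rates from the post, not real-world estimates.
U = 0.25  # fraction of limit-A players moving up each period
D = 0.50  # fraction of limit-B players moving down each period

A = D / (U + D)  # proportion of all players at the lower limit
B = 1 - A        # proportion at the higher limit

print(round(A, 3), round(B, 3))    # 0.667 0.333
print(abs(U * A - D * B) < 1e-12)  # flows balance at equilibrium: True
```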

Looking at the players by their percentile (0=worst, 100=best):
0-50th = always playing at A (=0.667*0.75)
50th-83.3rd = bouncing between A and B, beating A but failing at B (=0.667*0.25 + 0.333*0.50)
83.3rd-100th = staying at level B.

So who thinks they are a good (= better than average) player? Let's assume that anyone at level B thinks they are good, as well as half the folks at level A: the folks at B have already proven themselves at A, and half of the folks at A are better than their average opponent. If that is the case, 66.7% of all players will think they are better than average - which would be good for a site that wanted to keep its clients happy.
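The percentile bands and the "thinks they're good" share can be recomputed the same way (same illustrative U and D as above):

```python
# Percentile bands and the "better than average" share, two-limit model.
# U and D are the post's illustrative rates, not measured values.
U, D = 0.25, 0.50
A = D / (U + D)   # 2/3 of players at limit A
B = 1 - A         # 1/3 at limit B

stay_A = A * (1 - U)    # bottom 75% of A never move up       -> 0.500
bounce = A * U + B * D  # beat A but fail at B                -> 0.333
stay_B = B * (1 - D)    # top half of B holds its ground      -> 0.167

think_good = B + A / 2  # everyone at B, plus the better half of A
print(round(think_good, 3))  # 0.667
```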

3 Limits
Now, let's say that the site offers a third limit C, which is higher than B. Winning B players move up to C and losing C players move down to B. For simplicity, we will assume that they do this at the same rates as the moves between A and B. Then the change at each level will be
A: -UA+DB
B: +UA-DB-UB+DC
C: +UB-DC
Again A+B+C=1

Doing some algebra,
A=DB/U
C=UB/D
B=(UD)/(D^2+UD+U^2)

Assuming U=.25 and D=.5 again
A=.572
B=.286
C=.143
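Checking the three-limit numbers with the same approach (same illustrative rates):

```python
# Three-limit equilibrium: UA = DB and UB = DC, with A + B + C = 1.
# U and D are the post's illustrative rates.
U, D = 0.25, 0.50

B = (U * D) / (D**2 + U * D + U**2)  # middle limit
A = (D / U) * B                      # lower limit, from UA = DB
C = (U / D) * B                      # upper limit, from UB = DC

print(round(A, 3), round(B, 3), round(C, 3))  # 0.571 0.286 0.143
```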

What does this mean? Because C has drawn good players from B, B is now easier to beat, so more A players are moving up, making A easier to beat as well.

By percentile:
0-42.9th = staying at A
42.9th-71.4th = bouncing between A and B
71.4th-78.6th = staying at B
78.6th-92.9th = bouncing between B and C
92.9th-100th = staying at C.

So some players who used to be stuck at level A can now bounce back and forth to level B, and some who could not stay at B now can, or are even taking shots at level C.

Now, who thinks they are better than average? Let's assume everyone at B and C does - fewer than half of all players are at those levels. Again, half of the A players think they are good, as they are better than their average level-A opponent. As a result, 71.4% of all players will think they are better than average.
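And the three-limit bands (assuming, as above, that the middle 25% of B players neither move up nor down):

```python
# Percentile bands for the three-limit model, plus the "better than average"
# share. Same illustrative rates; band sizes are fractions of all players.
U, D = 0.25, 0.50
B = (U * D) / (D**2 + U * D + U**2)
A, C = (D / U) * B, (U / D) * B

stay_A    = A * (1 - U)      # -> 0.429
bounce_AB = A * U + B * D    # -> 0.286
stay_B    = B * (1 - U - D)  # middle 25% of B -> 0.071
bounce_BC = B * U + C * D    # -> 0.143
stay_C    = C * (1 - D)      # -> 0.071

think_good = B + C + A / 2   # everyone at B or C, plus the better half of A
print(round(think_good, 3))  # 0.714
```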

So what does this mean?

Obviously I have made a lot of simplifying assumptions, and a much more detailed analysis could be done. However, I think a few things stand out. First, adding a higher-limit game attracts the best players, which in turn makes every game at a lower limit easier - which should make all the other players happy. Secondly, there may be a bucket-brigade of money going on: players win money at one level, lose it to the folks at the next level up, and drop back down. In the model, the amount of this depends on the values you stick in for U and D, which I don't have real-world estimates for. Also, in the real world there are folks with a lot of money who just start at higher levels, but I suspect that a fair amount of money is working its way up from the bottom as well. Finally, I think Irieguy is on the right track about why competition is unusually hard at the 33's. I have no experience with either level myself, but from what folks are saying, it is harder to move up to the 55's due to the higher buy-in and the other factors he mentioned (U is lower compared to other levels), while it is just as easy to lose money and move down (D is the same).

Hope folks found this interesting,
John Paul

skipperbob
04-11-2005, 09:11 PM
Irieguy's "Zero Sum Theory" can be more easily stated as follows: "As long as my Dad earns a Sum, he will soon have Zero"

gumpzilla
04-11-2005, 09:12 PM
You're going to need some kind of justification for your choices of A and B for me to take this particularly seriously. 25% move up, 50% move down? Why should this be? What I will give you is that it seems right that fewer should move up than move down, simply because the existence of the rake suggests that the median earn is probably negative.

I also think that the assumption that there is one way to play poker, that way being NLHE SNGs at a particular site, is pretty poor. At any given time there are going to be people switching over to Omaha, taking up poker as a new hobby and blowing their $200 at the $55s, etc.

One point in Irie's argument that I always found a little bit soft is that it assumes rational players who keep detailed records and have some idea whether they are winning or losing. Does this seem like an accurate depiction of fish? I think a lot of the fish are going to come in, donate their bankroll at some random relatively low buyin and vanish into the night, leaving the system. So the equilibrium needs to include not only people who move between levels, but people who enter and exit the world of SNG poker. Whether this has a huge impact on SNG difficulty is hard to say. One could probably make some reasonable assumptions about the injection of fresh fish at each level.

Slim Pickens
04-11-2005, 09:18 PM
(I'm assuming you meant U and D) So what you're asking for is that U and D are vectors rather than scalars, and turn the whole system into a matrix equation? Whee. This would be a nice expansion, but the simple model is still pretty good.

I agree the rake should be included. I did some backo-da-envelope calcs a while back and decided the rake was fairly important.

Slim

EDIT: Crap, I can't make this right, but the idea is it's just a linear system and adding the additional complexity of making U and D a function of buy-in shouldn't be that hard.
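One way to read Slim's suggestion: make U and D level-dependent vectors. At equilibrium each adjacent pair of levels still balances (U[i]*p[i] = D[i+1]*p[i+1]), so the proportions follow a simple recurrence. A sketch, with made-up rates:

```python
# Sketch of the "vector" version: level-dependent up/down rates.
# At equilibrium, flows balance pairwise: U[i] * p[i] = D[i+1] * p[i+1].
def equilibrium(U, D):
    """U[i]: up-rate out of level i; D[i]: down-rate out of level i."""
    p = [1.0]
    for i in range(len(U) - 1):
        p.append(p[-1] * U[i] / D[i + 1])
    total = sum(p)
    return [x / total for x in p]

# Uniform rates reproduce the three-limit numbers (0.571, 0.286, 0.143):
print(equilibrium([0.25, 0.25, 0.0], [0.0, 0.50, 0.50]))
# A harder jump into the top level (a smaller U there), per Irieguy's 33s/55s idea:
print(equilibrium([0.25, 0.10, 0.0], [0.0, 0.50, 0.50]))
```

With the smaller up-rate into the top level, the top level shrinks and the lower levels swell - matching the intuition that a tough jump keeps players bottled up below it.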

gumpzilla
04-11-2005, 09:31 PM
[ QUOTE ]
(I'm assuming you meant U and D) So what you're asking for is that U and D are vectors rather than scalars, and turn the whole system into a matrix equation? Whee. This would be a nice expansion, but the simple model is still pretty good.

[/ QUOTE ]

That's a fine point, too, but I was just thinking that I have no idea where these specific numbers are coming from. The qualitative result "The games get less populated as you go up in buyins" in a model where we consider a closed system of SNGs can be derived (I'm 99% sure having not done this calculation) with the simple proviso that U<D; that is, that fewer people move up than down. The existence of the rake seems like enough to make this true.

[ QUOTE ]

I agree the rake should be included. I did some backo-da-envelope calcs a while back and decided the rake was fairly important.

[/ QUOTE ]

I think this is right. There are probably more big losers than big winners (at least at the lower levels, and probably all levels), so without the rake I would expect that if you assume that all winning players move up, you'd see pretty drastically different results.

It also occurs to me that there is a relatively long timescale involved before winning players know they are winning, if we assume rational behavior. But there is probably a shorter timescale involved in busting back down. I wonder how this would affect things? Obviously this is a fanciful amount of overcomplication for a dinky little problem.

EDIT: One painfully obvious thing I didn't think of until just now: the rake means that in order for an equilibrium to be reached, there has to be an influx of new money into the system, otherwise the equilibrium that is reached is the trivial "Party Poker = 100%, everybody else = 0".

Slim Pickens
04-11-2005, 09:34 PM
Ok, so I wrote out some garbage and decided I think you're right on, except for three things.

1.[ QUOTE ]
Assume no one ever quits playing.

[/ QUOTE ]
I think you have to include terms for how many players quit for good and start playing for the very first time ever at a particular level. That should be fairly easy.


2.[ QUOTE ]
assuming that the system runs long enough to reach an equilibrium.

[/ QUOTE ] I sure as hell hope it doesn't. I'll have to quit before it gets even close. It {gasp} makes the whole thing a differential equation {buh hoyven maven glaven}, but not a hard one.


3. The rake has to be in there.
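Point 2 is easy to sketch: away from equilibrium the two-limit model is just the pair of differential equations dA/dt = -UA + DB and dB/dt = UA - DB, and even crude forward-Euler integration shows the relaxation (illustrative rates again):

```python
# Forward-Euler integration of dA/dt = -U*A + D*B, dB/dt = U*A - D*B,
# starting with everyone at the lower limit. Rates are illustrative.
U, D = 0.25, 0.50
A, B = 1.0, 0.0
dt = 0.01
for _ in range(5000):            # 50 time units >> relaxation time 1/(U+D)
    dA = (-U * A + D * B) * dt
    A, B = A + dA, B - dA
print(round(A, 3), round(B, 3))  # relaxes to about 0.667 0.333
```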

Apathy
04-11-2005, 10:03 PM
Plus there are very, very few people who play ONLY SnGs, and many of these people win at ring games and lose at SnGs without realising it, since they don't keep accurate records and still break even overall. Those people often stay at the same level. Or they move up after a lucky tourney score.

John Paul
04-11-2005, 10:29 PM
Thanks for the feedback folks, here are some responses.

Using .5 for D and .25 for U was totally arbitrary on my part, and you can plug in whatever values you like, or compare the models algebraically. Either way, I think the main point - that higher-level games will draw the better players up and make the lower-limit games easier - holds regardless of the values you choose.

In my mind, new players joining and old players quitting would be the most interesting thing to add. Do a lot of new players make low-level games easier, harder, or neither? The trick is that you have to make some assumptions about newcomers and quitters, and there are lots of possibilities - do only the worst low-level players quit, or the worst players at all levels, or a random selection of players? Do new players join only the bottom level, all levels equally, or all levels but more at the bottom than the top? And so on... If we had some real numbers from one of the sites, that would make things easier, but I have no idea where to get that.
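As one hypothetical answer: if new players enter only at the bottom level and a uniform fraction Q of players at every level quits each period (both assumptions invented for illustration), the two-limit steady state shifts toward A:

```python
# Two limits plus joins/quits: a uniform quit rate Q at both levels, with
# every quitter replaced by a new player at level A. All rates are invented.
U, D, Q = 0.25, 0.50, 0.05
A, B = 0.5, 0.5                  # arbitrary starting split
for _ in range(1000):
    # quitters Q*(A+B) re-enter at A, so the total pool stays constant:
    # A' = A*(1-U-Q) + B*D + Q*(A+B) = A*(1-U) + B*(D+Q)
    A, B = A * (1 - U) + B * (D + Q), A * U + B * (1 - D - Q)
print(round(A, 4), round(B, 4))  # approaches 0.6875 0.3125
```

With these numbers the lower limit holds about 68.8% of players versus 66.7% without churn - in this toy setup, a steady stream of newcomers at the bottom makes level A more crowded (and presumably softer).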

Also, making U and D a function of buy-in and/or rake would be more realistic and better for looking at jumps in buy-in level, but I haven't dealt with that yet. However, that should be relatively easy to do, at least in a simplified way.

John Paul