A question for smart people


LinusKS
09-29-2004, 02:31 PM
We know from the ICM that the average player's TEV (tournament EV, his expected share of the prize pool) improves from 10% to roughly 18% when he doubles up.

We also know that a player with a 20% ROI sits down at the table with a TEV of 12%, rather than 10% (leaving aside fees, for the moment).

Here's the question: How much does the better-than-average player's TEV improve when he doubles up, compared to an average player's?


I can see two options for calculating this.

The first is to assume that when the better player doubles up, he gets an extra 20% on the improvement in his TEV, in addition to the 20% he started with.

That would mean, for example, taking the average player's 80% improvement, adding another 20% of it (for a 96% improvement), and increasing the better player's original TEV by that amount. The result is that the 20% ROI player has a TEV of nearly 24% after doubling up.

The other method is to count the 20% only once, and apply the same multiplier the average player gets (1.8, from 10% to 18%) to that base figure. In that case, a player with a 12% TEV goes to 21.6% after doubling.
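
Here's a quick sketch of the two calculations in Python, using the rounded figures above (with the exact ICM multiplier of 1.844, the second method comes out nearer 22%):

# Two candidate ways to adjust a 20% ROI player's TEV after a double-up,
# using the rounded figures from this post (average player: 10% -> ~18%).
avg_before, avg_after = 0.10, 0.18
good_before = 0.12                           # 20% ROI player, fees ignored

growth = avg_after / avg_before - 1          # average player's relative gain (0.8)

# Method 1: boost the gain itself by the 20% skill edge, then apply it.
method1 = good_before * (1 + growth * 1.2)   # 0.12 * 1.96 = 0.2352

# Method 2: apply the average player's multiplier unchanged.
method2 = good_before * (1 + growth)         # 0.12 * 1.80 = 0.216

print(method1, method2)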

I'm leaning toward the second method, but I'm curious if anyone has any thoughts about this, or can help me out.

Thanks.

eastbay
09-30-2004, 03:12 AM
I don't know. I don't think it's a simple question.

Think heads-up for a moment. What we want is a function p(f) that gives the probability of a win if you have a fraction f of the chips. The boundary conditions are obvious: p=0 if f=0 and p=1 if f=1.

For equal players, by symmetry p=0.5 if f=0.5. From there you can surmise that it's just a linear function p=f.

For unequal players, the boundary conditions still hold, but by asymmetry p > 0.5 at f = 0.5. So, what does the rest of the function look like?

I can think of a bunch of highfalutin ways to approach it from random walk or diffusion theory, but maybe the following is about the right bang for the buck:

Assume an offset (advantage over 0.5) for equal stacks, call it a. Then to satisfy the boundary conditions, you could add a function that passes through (0,0), (1,1), (0.5,0.5+a) to the linear function, to get something in the right ballpark. For example:

p=f+4*a*f*(1-f)
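
A quick numerical check of this candidate (the 5% edge below is just an illustrative value):

def p_win(f, a):
    # Heads-up win probability for chip fraction f, given an offset a
    # (the edge over 0.5 at equal stacks), per the formula above.
    return f + 4 * a * f * (1 - f)

# Boundary conditions and the equal-stack point, for an illustrative 5% edge:
for f in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f, round(p_win(f, 0.05), 4))
# p values: 0.0, 0.2875, 0.55, 0.7875, 1.0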

You can imagine other choices for the offset function, but you'd be getting into turd polishing territory...

eastbay

dethgrind
09-30-2004, 03:23 AM
[ QUOTE ]
p=f+4*a*x*(1-x)

[/ QUOTE ]

I like it. I'm guessing you mean f here instead of x though, right? Otherwise, what is x?

How do you generalize this to multiple opponents? Let's say you are given an ROI and can safely assume that all your opponents are equally skilled.

eastbay
09-30-2004, 03:24 AM
Reload. x=f.

I think the result is general for N players, if a is a mean advantage over the field. But I worked 18 hours today, so I may be being stupid.

eastbay

dethgrind
09-30-2004, 04:05 AM
[ QUOTE ]
The other method is to count the 20% only once, and apply the same multiplier the average player gets (1.8, from 10% to 18%) to that base figure. In that case, a player with a 12% TEV goes to 21.6% after doubling.


[/ QUOTE ]

This can't be true. Regardless of how good you are, if you have all the chips, your TEV is 50%, not 50% * 1.2.

Using Eastbay's approach will work though. Just figure out how to use ROI (which most people keep track of) instead of 'a' in his formula.

eastbay
09-30-2004, 10:47 AM
[ QUOTE ]
Then to satisfy the boundary conditions, you could add a function that passes through (0,0), (1,1), (0.5,0.5+a) to the linear function, to get something in the right ballpark.


[/ QUOTE ]

Correction: The offset function passes through (0,0), (0.5,a), (1,0).

eastbay

rachelwxm
09-30-2004, 11:12 AM
I think your title scared me off initially. /images/graemlins/laugh.gif I think this is an interesting question, although not as immediately profitable as some of the bubble questions. Nor do I think it's a simple one by any means. But here are my quick thoughts:

The first proposal is definitely an upper bound on the estimate, simply because of the fundamental principle of tournaments: every chip you win is worth less than the ones you already have. Even if you believe your initial investment is worth 120%, you are not going to pay another 120% to get doubled up. Plain and simple.

I would argue that the second one, however close, is an upper bound too. Two reasons:
1. As your chips increase, you should not keep using the same skill factor of 1.2. Suppose you have 7999 chips and your only opponent has 1: clearly your EV is 50, not 50*1.2. The skill factor is a decreasing function of your stack size.

2. For good players, some portion of the ROI comes from survival. If two players go all in on the first hand, the winner has an EV of 18.4 and the rest of the table has 10.2. So if you are the lucky winner, you should at least take that extra .2 off before you multiply your EV by a factor (which, as shown in 1, should be less than 1.2).

I know I did not give any numbers, but I hope this ignites some further discussion.

CrisBrown
09-30-2004, 11:44 AM
Hi Linus,

Your question is impossible to answer, as your additional 20% advantage may have just been fulfilled in the act of doubling up.

Cris

hhboy77
09-30-2004, 02:56 PM
eastbay, what do you do for a living?

i'm a pretty smart guy. i have a four-year degree from a good school in computer science, score well on standardized tests, iq etc... but i have no idea what you are talking about?

perhaps i've been drinking too much since college, and my intelligence is just a vestige of what it once was.

btw, i think you guys (not just eastbay) post a ton of interesting and useful stuff, so please keep on.

cheers.hubert

schwza
09-30-2004, 03:10 PM
[ QUOTE ]
We know from the ICM

[/ QUOTE ]

help a cash game player out... what?

LinusKS
09-30-2004, 03:53 PM
Eastbay, thanks for the response.

I'm not sure I understood what you said, though. Any way to break it down for the math-ignorant among us?

Also, shouldn't the upper boundary for prize equity be 0.5 (in a 20/30/50 payout structure)?

And does the formula you provided reduce the player's increase in equity as it approaches the boundary?

Edit: In other words, you wrote that p = probability of a win, but I'm trying to solve for equity in a tournament prize pool. (Maybe we meant the same thing?)

[ QUOTE ]
I don't know. I don't think it's a simple question.

Think heads-up for a moment. What we want is a function p(f) that gives the probability of a win if you have a fraction f of the chips. The boundary conditions are obvious: p=0 if f=0 and p=1 if f=1.

For equal players, by symmetry p=0.5 if f=0.5. From there you can surmise that it's just a linear function p=f.

For unequal players, the boundary conditions still hold, but by asymmetry p > 0.5 at f = 0.5. So, what does the rest of the function look like?

I can think of a bunch of highfalutin ways to approach it from random walk or diffusion theory, but maybe the following is about the right bang for the buck:

Assume an offset (advantage over 0.5) for equal stacks, call it a. Then to satisfy the boundary conditions, you could add a function that passes through (0,0), (1,1), (0.5,0.5+a) to the linear function, to get something in the right ballpark. For example:

p=f+4*a*f*(1-f)

You can imagine other choices for the offset function, but you'd be getting into turd polishing territory...

eastbay


[/ QUOTE ]

LinusKS
09-30-2004, 07:39 PM
[ QUOTE ]

This can't be true. Regardless of how good you are, if you have all the chips, your TEV is 50%, not 50% * 1.2.

[/ QUOTE ]

Yes. Whatever advantage you get from being a good player - at least when described in terms of prize equity - must diminish as you accumulate more chips.

[ QUOTE ]
Using Eastbay's approach will work though. Just figure out how to use ROI (which most people keep track of) instead of 'a' in his formula.

[/ QUOTE ]

I don't understand what Eastbay's doing, yet. /images/graemlins/frown.gif

LinusKS
09-30-2004, 07:59 PM
[ QUOTE ]
[ QUOTE ]
We know from the ICM

[/ QUOTE ]

help a cash game player out... what?

[/ QUOTE ]

Independent Chip Model. In a game with a structured payout - 50%/30%/20%, for example - there's not a one-to-one relationship between the number of chips you have and how much you're likely to win.

ICM is a way to estimate how much your stack is worth as a percentage of the prize pool at any given time in the game.
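
For anyone who wants to play with it, here is a minimal Malmuth-Harville ICM sketch in Python; it reproduces the 10% and ~18.44% figures used in this thread:

def icm_equity(stacks, payouts):
    # Independent Chip Model: P(player i finishes 1st) is proportional to
    # his stack; then recurse on the remaining players for 2nd, 3rd, ...
    # places, weighting each place by its payout.
    equities = [0.0] * len(stacks)

    def recurse(remaining, prob, place):
        if place >= len(payouts) or not remaining:
            return
        pool = sum(stacks[i] for i in remaining)
        for i in remaining:
            p = prob * stacks[i] / pool
            equities[i] += p * payouts[place]
            recurse([j for j in remaining if j != i], p, place + 1)

    recurse(list(range(len(stacks))), 1.0, 0)
    return equities

# 10 players, 1000 chips each, 50/30/20 payout: everyone's share is 0.10.
print(icm_equity([1000] * 10, [0.5, 0.3, 0.2])[0])
# One player doubles up on the first hand: his share becomes ~0.1844.
print(icm_equity([2000] + [1000] * 8, [0.5, 0.3, 0.2])[0])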


Edit: http://www.bol.ucla.edu/~sharnett/ICM/ICM.html

Michael Davis
09-30-2004, 08:13 PM
Linus,

Please do not use elitist post titles to exclude me from answering.

-Michael

LinusKS
09-30-2004, 08:24 PM
Hey. I'm posting, aren't I? :-)

dethgrind
09-30-2004, 09:17 PM
[ QUOTE ]
Then to satisfy the boundary conditions, you could add a function that passes through (0,0), (1,0), (0.5,a) to the linear function, to get something in the right ballpark.

[/ QUOTE ]

What we're really looking for is a function to add to your TEV as determined by the ICM. This will answer Linus' original question.

If you know your ROI, say 20%, then you know that given an initial stack, your TEV is actually 12% (not exactly, but close enough for now) instead of 10%. So if you're just starting the tournament, with a .10 share of the total chips, your share of the prize pool is .02 more than what the ICM says.

If you have no chips your share is 0 and if you have all of them your share is .5 regardless of how good you are. So the offset function is something that passes through (0,0), (.1, .02), and (1,0).

A parabola that passes through those points won't work since it adds too much as it gets close to (1,0), and you get values for TEV greater than .5. It would work if the offset function achieved a maximum at (.1, .02), but that isn't easy to figure out.

eastbay
09-30-2004, 10:43 PM
[ QUOTE ]
Eastbay, thanks for the response.

I'm not sure I understood what you said, though. Any way to break it down for the math-ignorant among us?

Also, shouldn't the upper boundary for prize equity be 0.5 (in a 20/30/50 payout structure)?

And does the formula you provided reduce the player's increase in equity as it approaches the boundary?

Edit: In other words, you wrote that p = probability of a win, but I'm trying to solve for equity in a tournament prize pool. (Maybe we meant the same thing?)


[/ QUOTE ]

Equity in a prize structure is a _much_ more complex subject than probability of a win.

If you don't like my answer for probability of a win, I can assure you that your eyes will glaze over at any thoughts I could give on the much harder problem of probability of placing nth (which leads to tournament equity, given a payout structure). My failing.

Basically, I think what you want to do is solve a biased random walk problem, and calibrate the bias to give a certain ROI for equal starting stacks. You'd almost certainly have to resort to numerical rather than analytic solutions. Even the unbiased random walk problem (or its continuous analog as a diffusion equation in barycentric coordinates) for n players requires a very sophisticated solution method to arrive at an analytic result.
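
To make the idea concrete, here is a toy Monte Carlo sketch of the heads-up case only -- a biased random walk where one chip changes hands per step. It is purely illustrative and is not the model behind any numbers quoted in this thread:

import random

def p_win_mc(f, bias, total_chips=100, trials=5000):
    # Estimate the probability of winning all the chips when you start with
    # fraction f of them: one chip changes hands per step, and you win each
    # step with probability 0.5 + bias.
    wins = 0
    for _ in range(trials):
        stack = int(round(f * total_chips))
        while 0 < stack < total_chips:
            stack += 1 if random.random() < 0.5 + bias else -1
        wins += (stack == total_chips)
    return wins / trials

# Calibrating the bias: from equal stacks, a tiny per-step edge compounds
# into a noticeably better-than-even chance of winning all the chips.
for b in (0.0, 0.0005, 0.001):
    print(b, p_win_mc(0.5, b))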

eastbay

eastbay
09-30-2004, 10:45 PM
[ QUOTE ]
eastbay, what do you do for a living?

[/ QUOTE ]

Computational physics.

[ QUOTE ]

i'm a pretty smart guy. i have a four-year degree from a good school in computer science, score well on standardized tests, iq etc... but i have no idea what you are talking about?


[/ QUOTE ]

It's pretty simple. Maybe I just use too much jargon. And I doubt you drank more than I did in college. Your liver hopes you didn't.

eastbay

LinusKS
09-30-2004, 11:42 PM
Ok, thanks. Let's put it this way, then.

Is it fair to say

a.) A 20% ROI player starts off with an expectation of .12 at the beginning of a tournament, and

b.) If he was to (for example) double up on the first hand his new expectation would be higher than the average player's .1844, but less than .1844(1.2)?

eastbay
10-01-2004, 12:21 AM
[ QUOTE ]
Ok, thanks. Let's put it this way, then.

Is it fair to say

a.) A 20% ROI player starts off with an expectation of .12 at the beginning of a tournament, and


[/ QUOTE ]

What is an "expectation of .12"?

eastbay

eastbay
10-01-2004, 02:12 AM
[ QUOTE ]
Ok, thanks. Let's put it this way, then.

Is it fair to say

a.) A 20% ROI player starts off with an expectation of .12 at the beginning of a tournament, and

b.) If he was to (for example) double up on the first hand his new expectation would be higher than the average player's .1844, but less than .1844(1.2)?

[/ QUOTE ]

Some rough numbers I arrived at with a model of PP $55+ SnGs I won't bother describing:

Including 10% vig, a ~20% ROI player has $EV of ~0.13.
If he doubles up the first hand every time, his ROI increases to ~1.08 (i.e., ~108%), with $EV of ~23%.

eastbay

Irieguy
10-01-2004, 02:50 AM
Unfortunately, I had to quit my night job as a computational physicist in order to have more time to play poker. So as tempting as it is to join eastbay's approach, I'll have to try an approach more consistent with my training as an intuitive metaphysicist:

1. The real answer is impossible to calculate, for all of the obvious reasons. You may be able to calculate it for a single player against the same opponents... but even that would only work if neither you nor your opponents ever made any adjustments to one another.

2. Rachel is on to something with her comment about your ROI% being somewhat the result of survival capabilities.

3. Cris is on to something with his comment that some of a winning player's ROI is due to his ability to double up at opportune times.

4. The answer would also be quite different depending on where your particular skills lie. It's almost impossible to be an expert short stack player AND an expert big stack player. It's possible to be good at both, but an expert comes to gain those skills by spending a lot of time with one size stack or another. So doubling up will either marginally increase your skill advantage or marginally decrease it (relative to another 20% ROI player).

5. If you want a visual example of how your skill margin changes as your stack increases, it would probably be closest to an asymptotic line approaching 50% as your chips approach 100% of the field.

Linus, you pose great questions that make me think... but I can never figure out what the point is.

Irieguy

eastbay
10-01-2004, 03:11 AM
[ QUOTE ]
Hi Linus,

Your question is impossible to answer, as your additional 20% advantage may have just been fulfilled in the act of doubling up.

Cris

[/ QUOTE ]

That's kind of a silly non-answer.

An equivalent question which makes it clear why:

You're offered a game with 9 players where you start with 2000 chips and everybody else plays with 1000, and you pay the same buy-in and fee as everybody else, and the house chips in the extra buy-in. It's the same player pool that you play with in 10-player games with 1000 chips each and beat for a 20% ROI.

What's your expected ROI in the new game? There's no need to "use anything up" now. Not that there was in the first question.

eastbay

rachelwxm
10-01-2004, 09:19 AM
Irieguy, wow, u used to have three jobs (including poker)? /images/graemlins/cool.gif
Glad to learn that both u and eastbay are trained in the same field as I am although I am doing more financial modelling for my career now and barely remember all these UNIX commands. /images/graemlins/frown.gif
As for my poker night job, it's normal these days for me to stay up till 2am to catch up with you guys. /images/graemlins/smile.gif

LinusKS
10-01-2004, 10:29 AM
[ QUOTE ]
[ QUOTE ]
Ok, thanks. Let's put it this way, then.

Is it fair to say

a.) A 20% ROI player starts off with an expectation of .12 at the beginning of a tournament, and


[/ QUOTE ]

What is an "expectation of .12"?

eastbay

[/ QUOTE ]

Sorry.

What I mean is:

Assuming a 10-player game, and leaving out the 10% fee for the moment, a 20% ROI player has a TEV of .12 of the prize pool.

So for a $10 game, his expectation when he sits down is $12.

LinusKS
10-01-2004, 12:15 PM
[ QUOTE ]
Some rough numbers I arrived at with a model of PP $55+ SnGs I won't bother describing:

Including 10% vig, a ~20% ROI player has $EV of ~0.13.
If he doubles up the first hand every time, his ROI increases to ~1.08 (i.e., ~108%), with $EV of ~23%.

eastbay

[/ QUOTE ]

The reason I excluded the tourney fee is that it seems to make things more complicated. A player with a 20% fee-included ROI is actually more than 20% better than the other players, since the average expectation (for an average player) is a loss.

In a $10+1 game, for example, the average player expects to lose $1, so his expectation is $10. The 20% (fee-included) player makes $13.20, so he's actually making 32% more per game than the average player.

Presumably the 20% player's advantage comes from skill, so you'd say he's actually 32% better than the average player.

It looks like you're saying that if an average player, who has an expectation of 0.1 of the prize pool, doubles up and gets an expectation of .1844, a 32% better player who does the same thing obtains an expectation of .23.

So that looks like this:
(AP=Average Player, GP=Good Player)

      Initial EV    EV after double    Multiplier
AP    0.10          0.1844             1.844
GP    0.132         0.23               1.74

To put it in game terms, that means when an average player gambles all-in he's risking $10 to win $8.44 (using a $100 prize pool for the sake of the example).

A good player (20% ROI) risks $13.20 to win $9.80.

That's 10:8.44 vs. 13.2:9.8, or
54:46 vs 57:43.

What I need is a way of estimating the multiplier for a range of different ROIs, including negative ones.

eastbay
10-01-2004, 09:39 PM
Here's some numbers from the same model:

roi before double-up, $ev before double-up, $ev after double-up:
roi: -0.0958364 $ev: 0.099458 $ev: 0.191138
roi: -0.0594273 $ev: 0.103463 $ev: 0.195062
roi: -0.0359909 $ev: 0.106041 $ev: 0.19917
roi: -0.0245545 $ev: 0.107299 $ev: 0.20194
roi: 0.00583636 $ev: 0.110642 $ev: 0.205073
roi: 0.0301364 $ev: 0.113315 $ev: 0.210395
roi: 0.0621 $ev: 0.116831 $ev: 0.212399
roi: 0.0814182 $ev: 0.118956 $ev: 0.216393
roi: 0.109782 $ev: 0.122076 $ev: 0.219233
roi: 0.132136 $ev: 0.124535 $ev: 0.223296
roi: 0.170891 $ev: 0.128798 $ev: 0.227007
roi: 0.191391 $ev: 0.131053 $ev: 0.230782
roi: 0.224482 $ev: 0.134693 $ev: 0.2345
roi: 0.253418 $ev: 0.137876 $ev: 0.238149
roi: 0.271755 $ev: 0.139893 $ev: 0.241661
roi: 0.307236 $ev: 0.143796 $ev: 0.244971
roi: 0.346764 $ev: 0.148144 $ev: 0.249649
roi: 0.365618 $ev: 0.150218 $ev: 0.253587
roi: 0.398355 $ev: 0.153819 $ev: 0.257703
roi: 0.431573 $ev: 0.157473 $ev: 0.261612
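
(As a reading aid only: one crude way to estimate the multiplier for an in-between ROI is to interpolate linearly between a few of the modeled points above. This is just arithmetic on the posted numbers, not the model itself.)

# A few (roi, $ev before, $ev after) rows copied from the table above.
points = [
    (-0.0958364, 0.099458, 0.191138),
    (0.00583636, 0.110642, 0.205073),
    (0.109782, 0.122076, 0.219233),
    (0.191391, 0.131053, 0.230782),
    (0.307236, 0.143796, 0.244971),
    (0.431573, 0.157473, 0.261612),
]

def interp(roi):
    # Linear interpolation between neighboring modeled points
    # (no extrapolation outside the tabulated ROI range).
    for (r0, b0, a0), (r1, b1, a1) in zip(points, points[1:]):
        if r0 <= roi <= r1:
            t = (roi - r0) / (r1 - r0)
            before = b0 + t * (b1 - b0)
            after = a0 + t * (a1 - a0)
            return before, after, after / before
    raise ValueError("ROI outside tabulated range")

print(interp(0.20))   # roughly (0.132, 0.232): a multiplier of ~1.76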

Not sure where you're going with this, but I wouldn't be trying to put too much of a fine point on these numbers. They are modeled using numerous assumptions that are all approximate.

eastbay

LinusKS
10-01-2004, 10:22 PM
Well, those numbers are interesting, if they're accurate.

They show, for example, that a player with a slightly negative ROI - but still a better than average player - nearly doubles his TEV when he doubles his chips, but a much better than average player only increases his TEV by about two-thirds.

However, I'm still confused about a couple of things.

If I'm reading your chart right, the first example is of a person with about -5% ROI.

That person should still be a better than average player, though, right? Yet you've got .099 as his TEV. Shouldn't it be higher than .1?

Also, for the same player, you're showing an EV after double of .19. But the ICM shows an improvement to just .1844 for an average player.

LinusKS
10-01-2004, 10:28 PM
Note: I realize these are small differences - I'm not quibbling. I just want to make sure we're both on the same page, and I'm also curious how reliable the numbers are. Where did you get them, btw?

eastbay
10-01-2004, 10:56 PM
[ QUOTE ]
Well, those numbers are interesting, if they're accurate.

They show, for example, that a player with a slightly negative ROI - but still a better than average player - nearly doubles his TEV when he doubles his chips, but a much better than average player only increases his TEV by about two-thirds.

However, I'm still confused about a couple of things.

If I'm reading your chart right, the first example is of a person with about -5% ROI.


[/ QUOTE ]

No, -10% (approx.)

[ QUOTE ]

That person should still be a better than average player, though, right? Yet you've got .099 as his TEV. Shouldn't it be higher than .1?


[/ QUOTE ]

No, perfectly average. 10% vig.

eastbay

eastbay
10-01-2004, 11:00 PM
[ QUOTE ]
I just want to make sure we're both on the same page, and I'm also curious how reliable the numbers are. Where did you get them, btw?

[/ QUOTE ]

If you didn't want to figure out my approach to win probability in my initial response, I can assure you that you won't want to try to figure out my approach to getting these numbers.

So I'm not going to waste a lot of time explaining it.

eastbay

PrayingMantis
10-02-2004, 09:05 AM
[ QUOTE ]
Well, those numbers are interesting, if they're accurate.


[/ QUOTE ]

These numbers are not accurate, and they will never be accurate. The whole concept of "accuracy" is not relevant here. eastbay's posts in this thread are very interesting, and your questions are good. But it looks like you (and others) are completely forgetting that this game is about _human beings_ who are playing against each other. It's not a "probabilistic contest" of some sort. It's instructive to get a feel for how much you "gain" when you double up (although there are many more factors to consider here, aside from your ROI, as others have mentioned), but it is no more than a rough number, relevant only in a vague theoretical sense.

LinusKS
10-02-2004, 10:53 AM
Eastbay, yes, I see I made a mistake reading your chart.

I'm just curious if you got the numbers from a simulation, or a formula, or what.

I don't want you to waste your time, of course, but your posts have been helpful to me.

tubbyspencer
10-02-2004, 11:16 AM
[ QUOTE ]
Well, those numbers are interesting, if they're accurate.

They show, for example, that a player with a slightly negative ROI - but still a better than average player - nearly doubles his TEV when he doubles his chips, but a much better than average player only increases his TEV by about two-thirds.



[/ QUOTE ]

And that's why poor players (negative ROI) should take more risks on coinflips early on than good players.

LinusKS
10-02-2004, 12:12 PM
Eastbay's first example shows an approximately average player with a nearly 1:1 relationship between CEV and TEV.

His chart doesn't show this, but if a bad player got better than 1:1, he would be doing himself a big favor by going in a lot.

PrayingMantis
10-02-2004, 12:19 PM
[ QUOTE ]
And that's why poor players (negative ROI) should take more risks on coinflips early on than good players.


[/ QUOTE ]

This is a bit circular, and not necessarily true. A poor player, by definition, cannot tell what "more risks on coinflips" means early on, and therefore cannot simply recognize those spots and choose to take more of them - and if he CAN, well, he's probably not a "poor player". OTOH, nothing in the mentioned numbers tells you that a good, even great, player shouldn't take rather close +CEV risks early on, since they can very well be +$EV for him (in terms of ROI, and certainly in $/H terms), especially if he's taking them against not-the-poorest players in the field.

eastbay
10-02-2004, 12:37 PM
[ QUOTE ]
[ QUOTE ]
Well, those numbers are interesting, if they're accurate.


[/ QUOTE ]

These numbers are not accurate, and they will never be accurate. The whole concept of "accuracy" is not relevant here. eastbay's posts in this thread are very interesting, and your questions are good. But it looks like you (and others) are completely forgetting that this game is about _human beings_ who are playing against each other. It's not a "probabilistic contest" of some sort. It's instructive to get a feel for how much you "gain" when you double up (although there are many more factors to consider here, aside from your ROI, as others have mentioned), but it is no more than a rough number, relevant only in a vague theoretical sense.

[/ QUOTE ]

Well said, and thanks for saying it.

eastbay

tubbyspencer
10-02-2004, 12:51 PM
[ QUOTE ]
[ QUOTE ]
And that's why poor players (negative ROI) should take more risks on coinflips early on than good players.


[/ QUOTE ]

This is a bit circular, and not necessarily true. A poor player, by definition, cannot tell what "more risks on coinflips" means early on, and therefore cannot simply recognize those spots and choose to take more of them - and if he CAN, well, he's probably not a "poor player". OTOH, nothing in the mentioned numbers tells you that a good, even great, player shouldn't take rather close +CEV risks early on, since they can very well be +$EV for him (in terms of ROI, and certainly in $/H terms), especially if he's taking them against not-the-poorest players in the field.

[/ QUOTE ]

Negative-ROI players' best strategy is similar to a non-card-counting blackjack player's strategy.

They are looking to reduce their chances of losing. Since their over-aggressive tendencies approach "correct" play more often with a big stack than with an average-sized stack, a coinflip shot at doubling up often reduces their overall chance of losing (that is, of finishing out of the money).

eastbay
10-02-2004, 12:59 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
And that's why poor players (negative ROI) should take more risks on coinflips early on than good players.


[/ QUOTE ]

This is a bit circular, and not necessarily true. A poor player, by definition, cannot tell what "more risks on coinflips" means early on, and therefore cannot simply recognize those spots and choose to take more of them - and if he CAN, well, he's probably not a "poor player". OTOH, nothing in the mentioned numbers tells you that a good, even great, player shouldn't take rather close +CEV risks early on, since they can very well be +$EV for him (in terms of ROI, and certainly in $/H terms), especially if he's taking them against not-the-poorest players in the field.

[/ QUOTE ]

Negative-ROI players' best strategy is similar to a non-card-counting blackjack player's strategy.

They are looking to reduce their chances of losing. Since their over-aggressive tendencies approach "correct" play more often with a big stack than with an average-sized stack, a coinflip shot at doubling up often reduces their overall chance of losing (that is, of finishing out of the money).

[/ QUOTE ]

Overly aggressive is most certainly not the only way to achieve negative ROI.

eastbay

PrayingMantis
10-02-2004, 01:00 PM
[ QUOTE ]
Negative-ROI players' best strategy is similar to a non-card-counting blackjack player's strategy.

They are looking to reduce their chances of losing. Since their over-aggressive tendencies approach "correct" play more often with a big stack than with an average-sized stack, a coinflip shot at doubling up often reduces their overall chance of losing (that is, of finishing out of the money).


[/ QUOTE ]

That's very true, but it is relevant only in a theoretical world. In the real world, poor players don't have a coherent strategy. If they are able to use the strategy you are talking about, which is a rather sophisticated one (relatively so, of course), they are, by definition, not poor players, and therefore should not use this strategy. This is paradoxical, as you can see.

tubbyspencer
10-02-2004, 01:01 PM
[ QUOTE ]

But it looks like you (and others) are completely forgetting that this game is about _human beings_ who are playing against each other.

[/ QUOTE ]

Mantis:

Jesus Christ, you're pretty sure (read "full") of yourself sometimes, aren't you. It is STUPID of you to assume that, because Linus asks a mathematical question, he has "forgotten" that the game has a psychological element.

Your posts are consistently laced with this type of condescension. There's no need (other than your own pathetic feeling of inadequacy, perhaps) for you to be rude in your posts. YOU ARE NOT A COMPLETELY WORTHLESS HUMAN BEING, MANTIS - therefore, you don't have to make yourself feel better by putting down others. Grow up, get some self-esteem.

Linus asks a good question, and you have to pollute the thread (again, I might add) with your denigrations.

tubbyspencer
10-02-2004, 01:08 PM
[ QUOTE ]
[ QUOTE ]
Negative-ROI players' best strategy is similar to a non-card-counting blackjack player's strategy.

They are looking to reduce their chances of losing. Since their over-aggressive tendencies approach "correct" play more often with a big stack than with an average-sized stack, a coinflip shot at doubling up often reduces their overall chance of losing (that is, of finishing out of the money).


[/ QUOTE ]

That's very true, but it is relevant only in a theoretical world. In the real world, poor players don't have a coherent strategy. If they are able to use the strategy you are talking about, which is a rather sophisticated one (relatively so, of course), they are, by definition, not poor players, and therefore should not use this strategy. This is paradoxical, as you can see.

[/ QUOTE ]

Mantis:

Reread my post. I did not say that poor players utilized this as some sort of a deliberate strategy. But thank you for pointing out that something I didn't say (and couldn't logically be inferred from what I said) wasn't true.

Hmmmmmmm. Mantis showing off his knowledge by showing how what someone didn't say is wrong.

PrayingMantis
10-02-2004, 01:11 PM
Wow, aren't you over-reacting a little? /images/graemlins/confused.gif /images/graemlins/grin.gif
As I said earlier (read more carefully, please), I think linus's questions are good. My points are still very relevant, as eastbay has mentioned in his own reply.

I don't really know what to say to the rest of your "points", only that it's really amazing to me how certain people get SO angry about things I write here, while others (whom I appreciate much more, surely /images/graemlins/grin.gif) find them challenging and interesting, not to say helpful.

To each his own. If my posts make you feel and react in such a way (that's a little scary, actually), I suggest you stop reading them. That's extremely easy.

Peace,

PM
/images/graemlins/laugh.gif

tubbyspencer
10-02-2004, 01:15 PM
Go ahead Mantis:

Tell me how Linus was IGNORING the fact that this is a people game, by asking a mathematical question.

My point is not that some of your points aren't relevant; it's that you come off like an obnoxious know-it-all.

And before you point it out, yes, I'm aware that an obnoxious know-it-all can make good points in his posts. I'm just saying that you could be less condescending (which I note you don't deny).

PrayingMantis
10-02-2004, 01:30 PM
[ QUOTE ]
Tell me how Linus was IGNORING the fact that this is a people game, by asking a mathematical question.


[/ QUOTE ]

Huh? Mathematical questions are very, very relevant to this game, and so are linus's questions. My point in my reply to linus's post wasn't with regard to his questions, but with regard to his remark "if these numbers are accurate" (eastbay's numbers). Please read much more carefully when you are criticizing someone in such an aggressive and disproportionate way. Otherwise, you are only embarrassing yourself.

eastbay
10-02-2004, 01:44 PM
[ QUOTE ]
[ QUOTE ]

But it looks like you (and others) are completely forgetting that this game is about _human beings_ who are playing against each other.

[/ QUOTE ]

Mantis:

Jesus Christ, you're pretty sure (read "full") of yourself sometimes, aren't you. It is STUPID of you to assume that, because Linus asks a mathematical question, he has "forgotten" that the game has a psychological element.


[/ QUOTE ]

Well, Mantis can be blunt, but what is stupid is to characterize the question as mathematical (I guess I can be blunt, too). It indicates you didn't understand the point Mantis was trying to make.

And I thought it was equally clear that the original poster also had the same misconception, so it was important that someone point out that it is not, in the sense that there is no one answer that can be derived straight from mathematical considerations. Many poker calculations can be, but this is one that cannot, and it's important that people understand the difference.

eastbay

tubbyspencer
10-02-2004, 01:48 PM
Mantis:

Care to explain how it is that Linus (and others - because it's not just Linus that you're smarter than) were "COMPLETELY forgetting that this game is about human beings who are playing against each other" (emphasis, mine - condescending attitude, yours)?

That is a direct quote, Mantis. You guys are "COMPLETELY FORGETTING" that it's a game against humans.

Are you really so 100% cock-sure that Linus (and all the other morons) were "COMPLETELY" forgetting that? Maybe they were only 99% forgetting it.

When you tell people they are COMPLETELY FORGETTING something that's totally obvious, you are being a condescending know-it-all. Aren't you? Explain the comment otherwise.

tubbyspencer
10-02-2004, 01:58 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]

But it looks like you (and others) are completely forgetting that this game is about _human beings_ who are playing against each other.

[/ QUOTE ]

Mantis:

Jesus Christ, you're pretty sure (read "full") of yourself sometimes, aren't you. It is STUPID of you to assume that, because Linus asks a mathematical question, he has "forgotten" that the game has a psychological element.


[/ QUOTE ]

Well, Mantis can be blunt, but what is stupid is to characterize the question as mathematical (I guess I can be blunt, too). It indicates you didn't understand the point Mantis was trying to make.

And I thought it was equally clear that the original poster also had the same misconception, so it was important that someone point out that it is not, in the sense that there is no one answer that can be derived straight from mathematical considerations. Many poker calculations can be, but this is one that cannot, and it's important that people understand the difference.

eastbay

[/ QUOTE ]

So eastbay:

Because there is not "one answer that can be derived straight from mathematical considerations", you agree that Linus has COMPLETELY FORGOTTEN that there is a human element to the game?

At the very least you'll admit that you're inferring that, because it's nothing Linus ever said.

PrayingMantis
10-02-2004, 02:05 PM
I think you should take your tranquilizers now. /images/graemlins/crazy.gif /images/graemlins/tongue.gif

tubbyspencer
10-02-2004, 02:10 PM
[ QUOTE ]
I think you should take your tranquilizers now. /images/graemlins/crazy.gif /images/graemlins/tongue.gif

[/ QUOTE ]

You can't back up your original comment that Linus had COMPLETELY FORGOTTEN about the human element of the game, can you?

Nor can you admit that it's a condescending thing to say.

Excellent response Mantis.

PrayingMantis
10-02-2004, 02:13 PM
[ QUOTE ]
Excellent response Mantis.


[/ QUOTE ]

Well, thanks. I knew we would get along in the end! /images/graemlins/laugh.gif

CrisBrown
10-02-2004, 02:46 PM
Hiya eastbay,

Sorry for the delayed reply. I was having fun in another forum and hadn't read closely here for awhile.

[ QUOTE ]
An equivalent question which makes it clear why:

You're offered a game with 9 players where you start with 2000 chips and everybody else plays with 1000, and you pay the same buy-in and fee as everybody else, and the house chips in the extra buy-in. It's the same player pool that you play with in 10-player games with 1000 chips each and beat for a 20% ROI.

What's your expected ROI in the new game? There's no need to "use anything up" now. Not that there was in the first question.

[/ QUOTE ]

It's not quite an equivalent question. That 20% edge vs. other random players -- which Linus specified in his original post -- comes from the ability to extract chips from poorer players when they make mistakes. When you bust a bad player who has made a mistake, on the first hand of the SNG, some of that 20% increased expectation has just been realized ... you now have twice as many chips as the other players at the table, owing to this one bad player's mistake.

Much of your 20% edge vs. the table as a whole -- that is, your edge over the mean skill level of those at the table -- may have come from your edge over that one bad player whom you have already busted. It is axiomatic that the mean skill level of those who remain is likely to be greater, to some very-difficult-to-estimate degree, and thus your edge will now be smaller.

If we somehow had precise rankings for each player at the table, this problem would be soluble, because you could compute the mean skill level of the remaining players and compare that to the mean skill level when you began. But lacking that, it's insoluble.

Cris

eastbay
10-02-2004, 08:21 PM
[ QUOTE ]

Because there is not "one answer that can be derived straight from mathematical considerations", you agree that Linus has COMPLETELY FORGOTTEN that there is a human element to the game?

At the very least you'll admit that you're inferring that, because it's nothing Linus ever said.

[/ QUOTE ]

Linus keeps asking about a "formula" and "how do you calculate" and "are the numbers accurate."

All of those word selections indicate he doesn't understand the nature of the question he's asking. So yes, I'm inferring it, but not without repeated evidence.

I wouldn't say he "completely forgot" it. I would say he didn't really think about it in the first place, or didn't think about it enough to understand it. That's all. I don't think any of this is enough to get bent out of shape over.

eastbay

dethgrind
10-02-2004, 08:26 PM
A better player will have no larger share of the prize pool if he has all the chips or none of the chips. At the start of the tournament (f=.1 fraction of the chips), his advantage a(f) is:
a(.1) = ROI*.11 + .01

For example, a 20% ROI player has an initial advantage of .20*.11 + .01 = .032. So his share of the prize pool is .10 + .032 = .132.

We need to guess advantage values for everything in between 0, .10, and 1 fraction of the chips. Here I'll assume it increases linearly from 0 to .10, then decreases linearly from .10 to 1.

a(f) = 10*f*a(.1) if f < .1

or

a(f) = (a(.1)/.9) * (1-f) if f > .1


So for example, taking ROI 20%, so a(.1)=.032, and you have half the chips, f=.5, your advantage is:
.032/.9 * (1-.5) = .0178

An average player, in the ICM, having half the chips and five opponents with equal stacks, has a TEV of .3611. So the skilled player in the same scenario has TEV .3611 + .0178 = .3789
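
In code, this piecewise-linear guess looks like the following (ROI here is the fee-inclusive figure, per the $10+$1 discussion above):

def advantage(f, roi):
    # Piecewise-linear skill offset: zero at f=0 and f=1, peaking at the
    # starting fraction f=0.1, where it equals a(.1) = ROI*.11 + .01.
    a_start = roi * 0.11 + 0.01
    if f <= 0.1:
        return 10 * f * a_start
    return a_start / 0.9 * (1 - f)

# The worked example above: a 20% ROI player holding half the chips against
# five equal stacks (average-player ICM TEV given as .3611).
a = advantage(0.5, 0.20)
print(round(a, 4), round(0.3611 + a, 4))   # 0.0178 0.3789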

eastbay
10-02-2004, 08:28 PM
[ QUOTE ]
Hiya eastbay,

Sorry for the delayed reply. I was having fun in another forum and hadn't read closely here for awhile.

[ QUOTE ]
An equivalent question which makes it clear why:

You're offered a game with 9 players where you start with 2000 chips and everybody else plays with 1000, and you pay the same buy-in and fee as everybody else, and the house chips in the extra buy-in. It's the same player pool that you play with in 10-player games with 1000 chips each and beat for a 20% ROI.

What's your expected ROI in the new game? There's no need to "use anything up" now. Not that there was in the first question.

[/ QUOTE ]

It's not quite an equivalent question. That 20% edge vs. other random players -- which Linus specified in his original post -- comes from the ability to extract chips from poorer players when they make mistakes. When you bust a bad player who has made a mistake, on the first hand of the SNG, some of that 20% increased expectation has just been realized ...


[/ QUOTE ]

Is this what you really think, or this just starting an argument for fun?

I don't agree. Let's just leave it at that.

eastbay

codewarrior
10-02-2004, 08:30 PM
What?

LinusKS
10-03-2004, 02:11 PM
[ QUOTE ]

Linus keeps asking about a "formula" and "how do you calculate" and "are the numbers accurate."

All of those word selections indicate he doesn't understand the nature of the question he's asking. So yes, I'm inferring it, but not without repeated evidence.

I wouldn't say he "completely forgot" it. I would say he didn't really think about it in the first place, or didn't think about it enough to understand it. That's all. I don't think any of this is enough to get bent out of shape over.

eastbay


[/ QUOTE ]

Eastbay, I'm still curious where you got your numbers from.

If I missed your answer somewhere, I apologize in advance.

As to what I was thinking about originally... well, I already know the answer to that.

LinusKS
10-03-2004, 02:20 PM
Thank you!

I'm going to work on this a bit. (Like I said, I'm pretty slow when it comes to math.)

The formula makes sense to me. It may or may not be exact, but as long as it's theoretically sound, that's good enough.

[ QUOTE ]
A better player will have no larger share of the prize pool if he has all the chips or none of the chips. At the start of the tournament (f=.1 fraction of the chips), his advantage a(f) is:
a(.1) = ROI*.11 + .01

For example, a 20% ROI player has an initial advantage of .20*.11 + .01 = .032. So his share of the prize pool is .10 + .032 = .132.

We need to guess advantage values for everything in between 0, .10, and 1 fraction of the chips. Here I'll assume it increases linearly from 0 to .10, then decreases linearly from .10 to 1.

a(f) = 10*f*a(.1) if f < .1

or

a(f) = (a(.1)/.9) * (1-f) if f > .1


So for example, taking ROI 20%, so a(.1)=.032, and you have half the chips, f=.5, your advantage is:
.032/.9 * (1-.5) = .0178

An average player, in the ICM, having half the chips and five opponents with equal stacks, has a TEV of .3611. So the skilled player in the same scenario has TEV .3611 + .0178 = .3789

[/ QUOTE ]

eastbay
10-03-2004, 03:10 PM
[ QUOTE ]
[ QUOTE ]

Linus keeps asking about a "formula" and "how do you calculate" and "are the numbers accurate."

All of those word selections indicate he doesn't understand the nature of the question he's asking. So yes, I'm inferring it, but not without repeated evidence.

I wouldn't say he "completely forgot" it. I would say he didn't really think about it in the first place, or didn't think about it enough to understand it. That's all. I don't think any of this is enough to get bent out of shape over.

eastbay


[/ QUOTE ]

Eastbay, I'm still curious where you got your numbers from.

If I missed your answer somewhere, I apologize in advance.

As to what I was thinking about originally... well, I already know the answer to that.

[/ QUOTE ]

Simulation. But I'm not sure how such a pat, essentially meaningless answer is helpful.

eastbay

dethgrind
10-03-2004, 03:33 PM
I don't know that it is theoretically sound. It does meet a few minimum conditions though:

- it passes through (0,0), (.1,a), and (1,0)
- it doesn't allow for a TEV of less than 0 or more than .5 (I think)

I also like the fact that it is always positive for positive a, and always negative for negative a, which should be true in the majority of cases.

CrisBrown
10-03-2004, 04:48 PM
Hi codewarrior,

Okay, I'm going to be picking some numbers out of the air here to illustrate the point.

Situation #1: Your skill level is 12, and the nine other players each has a skill level of 10. The mean skill level of your opponents is 10, so you have a 20% edge vs. the average opponent. If you bust an opponent on the first hand, you still have a 20% skill edge vs. the average opponent remaining, because the mean skill level of the remaining opponents is still 10.

Situation #2: You have a skill level of 12. Eight of your nine opponents have a skill level of 11. The other has a skill level of 2. The mean skill level of your opponents is 10, so at the beginning you have a 20% edge on the field as a whole. Now you bust that very bad player on the very first hand. The remaining players all have a skill level of 11, so the mean is 11. You have only a 9% skill edge over those who are left.

In the first, you have a double stack and a 20% skill edge over the remaining players. In the second, you have a double stack and only a 9% skill edge over the remaining players. I'm not going to do the math (perhaps eastbay can? please please? /images/graemlins/smile.gif ) but I suspect your $EV in these two situations is significantly different. That difference reflects the portion of your original skill edge, in the second scenario, which has now been "realized" in busting the bad player.

Linus' original question didn't specify the relative strengths of the other players. Without knowing that, it's impossible to know how much of your skill edge vs. the field has changed by your busting this very bad player. And if you're trying to derive a $EV based on both your skill edge and your current chip stack, then you need both numbers. That's why I said that the problem, as proposed, was insoluble.

Cris

codewarrior
10-03-2004, 05:23 PM
Hi, Cris.

I was being a smartass, but I bet your reply helped some folks, so thanks.

You still remind me of a girl I dated in high school, and for that I will find you forever endearing /images/graemlins/laugh.gif

Even when I, <shudder> disagree... /images/graemlins/tongue.gif

LinusKS
10-03-2004, 11:21 PM
Cris, that's no different than saying ROI is meaningless if you're playing at a table full of better players.

It's true, but we're talking averages.

CrisBrown
10-03-2004, 11:58 PM
Hi Linus,

For many players, your former statement does reflect the averages. /images/graemlins/wink.gif

Seriously, yes, you can discuss this in terms of averages (the first scenario in my post), so long as you remember that your answer may well be subject to the GIGO principle (garbage in, garbage out). That is, the less certain you are of your "averages," the more likely that your answer to this question will be statistically insignificant. (Think of significant digits here.) You can't get more precision in your answer than there is in your data.

Cris