Two Plus Two Older Archives > General Poker Discussion > Poker Theory
  #1  
Old 09-07-2003, 11:07 AM
David Sklansky
Senior Member
 
Join Date: Aug 2002
Posts: 241
Default Tough, Important, General Case, Game Theory Problem

I'm not sure how hard this question is, nor whether it has ever been solved in print. Probably it has. But if it hasn't, the first one to do it would get a feather in his cap.

It involves an obvious variation of regular poker. One round of betting. Except the bets are simultaneous rather than sequential. Everybody antes a certain amount. Then after looking at their cards (or to make it more precise, a real number from zero to one) they SECRETLY press a button indicating whether they are in for a second bet. Those who indicated that they stayed in now compare hands (numbers) and the winner takes the whole pot. If no one has made the second bet there is no action.

In real life this game, while interesting, could never be played seriously because of the collusion risk. Partners would never play more than one hand.

Notice that the correct strategy is the same for everybody. Given a certain number of players, a certain ante per player, and a certain size bet per player, the optimum strategy would be to bet some fixed proportion of your best hands.
For example, if three players each anted one dollar and then had to bet two dollars secretly to reach a showdown, the optimum strategy might be to bet something like the top 30% of your hands (in other words, .7 or higher). If so, about 35% of hands would end with no action.

So here is the question. "n" players are dealt a real number between zero and one. Each player antes "a" dollars. Each player secretly and independently chooses whether to enter a showdown with a bet of "b" dollars. No action if no one bets. What proportion of your hands should you bet?

The answer will be in terms of n, a, and b. (It would be helpful if you translated it into the answer for my three player example.) This problem might require calculus.

Remember this answer assumes that you have no "read" on the other players. You are assuming that they are also playing their optimum strategy. If so you will break even. I would expect that if they deviated from optimum (without colluding), your playing the optimum strategy would now probably win and certainly not lose. But that may be wrong. I have not thought about or attempted to find a solution myself. That's your job.
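(Not part of the original post.) The mechanics of the game can be sketched in a few lines of Python; the function and parameter names here are illustrative, and the threshold strategy is the "bet your top hands" form Sklansky describes:

```python
import random

def play_hand(n, a, b, threshold):
    """One hand of the simultaneous-bet game: everyone antes `a`, then
    secretly bets `b` iff their uniform(0,1) hand is >= `threshold`.
    Returns each player's net profit as a list. If no one bets, there
    is no action and all antes are returned."""
    hands = [random.random() for _ in range(n)]
    bettors = [i for i in range(n) if hands[i] >= threshold]
    profits = [0.0] * n
    if not bettors:
        return profits  # no action
    pot = n * a + len(bettors) * b  # all antes plus the second bets
    for i in range(n):
        profits[i] -= a             # antes go into the pot
    for i in bettors:
        profits[i] -= b             # bettors also put in the bet
    winner = max(bettors, key=lambda i: hands[i])
    profits[winner] += pot          # best hand among bettors takes it all
    return profits
```

Since the pot redistributes exactly what the players put in, the profits always sum to zero.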
  #2  
Old 09-07-2003, 08:32 PM
emanon
Junior Member
 
Join Date: Aug 2003
Posts: 8
Default Re: Tough, Important, General Case, Game Theory Problem

Let's see what everyone thinks of this answer:
Assumptions:
1. Your "hand" is ~ U(0,1).
That is, each hand is drawn from a uniform distribution on (0,1).
2. All players pursue an optimal strategy.
3. Unlike poker, the hands are independent of one another.

My steps:
1. Determine the probability of victory.
2. Determine the EV of a bet.
   i. Determine the minimum optimal hand to bet.
3. Solve for the minimum victory value.

1. Probability of a victory:
Given you have a hand X, what is the probability that your hand is better than every other player's?

P(X>Xi) = probability that hand X beats the hand of player i. Since the hands are independent and uniform on (0,1), P(X>Xi) = X for any single opponent.

Expanding P(victory|X) for n = 2 through 4:
For n=2, P(victory|X) = X
For n=3, P(victory|X) = X * X = X^2
For n=4, P(victory|X) = X^3

Ok, so we now know how to calculate the probability that a hand X will win against the other n-1 players.

2. Determine the EV.
We want to determine the minimum X that we need to bet.

To determine this, the following variables/equations are needed:
P(min_opt) -> The minimum P(victory) needed to bet.
EV_of_bet -> The expected value for the hands we do bet.
obv -> other bet value, the E[amt of other bets won when i win]
EV -> The overall expected value.

EV = P(min_opt)*EV_of_bet - {1-P(min_opt)}*ante

EV_of_bet = P(victory)*((a*n)+b+obv) - {1-P(victory)}*(a+b)
Since, when I win, I win all the antes (a*n) plus my bet (b) plus obv; and when I lose, I lose my ante and my bet.

obv = (n-1) * {X - P(min_opt)} * b
Since: There are n-1 other players who might bet.
Remember, the hands are independent.
For each player, the chance that he will get a hand better than P(min opt) (and therefore bet) but less than my hand (x), is simply the difference between the two.

3. Solve for min victory value:
i. At the indifference point, where EV=0, P(min_opt) == P(victory).

Doing most of the algebra results in:
0 = (na+2b)*P(min_opt)^2 - P(min_opt)*b - a
Once P(min_opt) has been solved for, need to back out the value X for the min hand to bet.

For a=1, b=2, n=3 I solve for an X=.707

Thoughts, comments, errors, clarifications?
emanon
-------
What about when your opponents are playing suboptimally?
The optimal strategy will no longer maximize your return against them; however, it will still have a higher EV.
  #3  
Old 09-07-2003, 08:55 PM
cbloom
Junior Member
 
Join Date: Jul 2003
Posts: 19
Default Re: Tough, Important, General Case, Game Theory Problem

The solution is : bet all your hands X > c with

c = ( b / (n*a + b) ) ^ (1/(n-1))

For your example case, a = 1, b = 2, n = 3 this is

c = 0.63246

How do you get this? Principle of indifference. When your hand is X = c, you should have the same EV for betting or folding. Consider your ante as already being in the pot. If you fold, your EV is zero.

If you bet, I assume that all other players are using the same strategy. They will only call if they have hands X >= c. The chance of a tie is zero, so if you have any callers, they will beat you. Thus, you only win the pot if no one calls, and your chance of winning is just the chance that everyone else has a hand < c:

P = c ^ (n-1)

And your EV is :

P * n*a + (1-P) * (-b)

That is, when you win (P chance) you win the antes (n*a). When you lose (1-P chance) you lose your bet (-b). This EV should be equal to the EV of folding, which is zero.

P * n*a + (1-P) * (-b) = 0

P * (n*a + b) - b = 0

P = b / (n*a + b)
c^(n-1) = b / (n*a + b)

c = ( b / (n*a + b) ) ^ (1/(n-1))

And really this game's not very interesting at all.

For the case of a=1,b=2, n = 2 (heads up) , this is:

c = 0.4

it's correct to bet slightly more than half your hands.
  #4  
Old 09-07-2003, 11:59 PM
BB King's
Senior Member
 
Join Date: Sep 2002
Posts: 244
Default Great ! but ...

Great job ! Very good explanation !

But a small error here:

[ QUOTE ]
For the case of a=1,b=2, n = 2 (heads up) , this is:
c = 0.4
[/ QUOTE ]

It should be c = 0.5.
  #5  
Old 09-08-2003, 12:49 AM
Bozeman
Senior Member
 
Join Date: Sep 2002
Location: On the road again
Posts: 1,213
Default Re: Great ! but ...

Yup, it looks like he used the same value (.4) as in the three-player example (where c = .4^.5), neglecting the n that appears in the formula.
  #6  
Old 09-08-2003, 03:35 AM
M.B.E.
Senior Member
 
Join Date: Sep 2002
Location: Vancouver, B.C.
Posts: 1,552
Default Re: Great ! but ...

[ QUOTE ]
Great job ! Very good explanation !

But a small error here:

For the case of a=1,b=2, n = 2 (heads up) , this is:
c = 0.4

c=0.5

[/ QUOTE ]

No, that's still wrong. If there's only $2 in the pot, you don't want to be betting $2 half the time, because then your opponent could cream you by tightening up a little bit, so that when you both call he is the favourite to win $3 of your money, but the times he folds and you call you'll only win $1 of his.

For example, suppose you're playing heads up against me with a=1 and b=2. You bet when you're dealt anything greater than 1/2, but I play tighter and only bet with something greater than 5/8. Let's calculate my EV.

The probability of a push (we both fold) is (1/2)(5/8) = 5/16.
The probability that you bet and I fold is (1/2)(5/8) = 5/16: I lose $1.
The probability that you fold and I bet is (1/2)(3/8) = 3/16: I win $1.
The probability we both bet is (1/2)(3/8).
The probability we both bet and I win is (1/2)(3/8)(5/8) = 15/128: I win $3.
The probability we both bet and I lose is (1/2)(3/8)(3/8) = 9/128: I lose $3.

So my total EV on each hand is

(5/16)(-$1) + (3/16)($1) + (15/128)($3) + (9/128)(-$3) = $(1/64).

But if you're playing optimally, how can I have positive EV of about 1.5 cents per hand?

Actually the correct answer is c = 2/3.
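(Editorial aside.) The arithmetic above can be checked with exact fractions; this sketch mirrors the example's thresholds of 1/2 and 5/8, and the variable names are mine:

```python
from fractions import Fraction as F

a, b = 1, 2
you_c = F(1, 2)   # you bet with a hand above 1/2
me_c  = F(5, 8)   # I bet only with a hand above 5/8

# Outcome probabilities for independent uniform(0,1) hands.
p_you_bet_me_fold = (1 - you_c) * me_c       # I forfeit my ante: -$1
p_you_fold_me_bet = you_c * (1 - me_c)       # I collect your ante: +$1
# Both bet: integrate P(you bet and X < y) over my hands y in (5/8, 1).
p_showdown_win  = (1 - me_c**2) / 2 - you_c * (1 - me_c)   # = 15/128
p_showdown_lose = (1 - me_c) - (1 - me_c**2) / 2           # = 9/128

ev = (p_you_bet_me_fold * (-a) + p_you_fold_me_bet * a
      + p_showdown_win * (a + b) + p_showdown_lose * (-(a + b)))
print(ev)  # 1/64
```

The tighter player's edge comes out to exactly $1/64 per hand, as computed above.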
  #7  
Old 09-08-2003, 04:01 AM
M.B.E.
Senior Member
 
Join Date: Sep 2002
Location: Vancouver, B.C.
Posts: 1,552
Default Re: Tough, Important, General Case, Game Theory Problem

[ QUOTE ]
The solution is : bet all your hands X > c with

c = ( b / (n*a + b) ) ^ (1/(n-1))

[/ QUOTE ]
Not quite. The correct answer is: bet all your hands X > c with

c = ( b/ ((n-1)*a + b) ) ^ (1/(n-1))

It looks nicer if you say m is the number of your opponents = n-1. Then

c = (b/(am+b))^(1/m).

The error in cbloom's proof is in the assumption that if you fold your EV is 0. You actually have positive EV when you fold, because if everyone else also folds it's a push and you get your ante back. The probability of that happening is c^m = P.

Then, using cbloom's reasoning:

P*(n*a) + (1-P)*(-b) = P*a
P*(n*a + b) - b = P*a
P*(n*a + b - a) - b = 0
P = b/(n*a + b - a)
P = b/((n-1)*a + b)= b/(am+b)

So if we take DS's 3-player game with a=1, b=2, m=2, then c is the square root of 2/(2+2), or sqrt(0.5) = 0.7071. Incidentally that's the same answer posted by emanon.

For the case of a=1, b=2, m=1 (heads up), we get:
c = 2/(1+2) = 0.66667.
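(Editorial aside.) The corrected formula is easy to evaluate for both of the worked cases; this is a small sketch and the function name is mine:

```python
def cutoff(n, a, b):
    """Optimal betting cutoff c = (b / ((n-1)*a + b)) ** (1/(n-1))
    for n players, ante a, and bet b (M.B.E.'s corrected formula)."""
    m = n - 1  # number of opponents
    return (b / (m * a + b)) ** (1 / m)

print(cutoff(3, 1, 2))  # sqrt(1/2) = 0.7071..., Sklansky's three-player example
print(cutoff(2, 1, 2))  # 2/3 = 0.6666..., the heads-up case
```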
  #8  
Old 09-08-2003, 08:27 AM
BB King's
Senior Member
 
Join Date: Sep 2002
Posts: 244
Default Also a Nice Job ! n/t

*****
  #9  
Old 09-08-2003, 01:43 PM
cbloom
Junior Member
 
Join Date: Jul 2003
Posts: 19
Default Re: Tough, Important, General Case, Game Theory Problem

Ah yes, good catch.

An interesting thing about this game is that if your opponent is playing "optimally" you can't beat him just by betting every time.

Consider heads up, n=2, m=1. I have card X, my opponent has card Y. He will bet only if Y > c, with c = b/(a+b)

for this EV computation I consider that the antes are already part of the pot.

if Y < c
you win 2a
if Y > c
if X < c
you lose b
if X > c
you win (2a+b) half the time
and you lose b the other half

So your EV is :

c * (2a) + (1-c) * ( c * (-b) + (1-c) * ( 1/2 * (2a+b) - 1/2 b ) )
c * (2a) + (1-c) * ( c * (-b) + (1-c) * a )

plug in for c and simplify -

b/(a+b) * (2a) + (a/(a+b)) * ( b * (-b) + a * a )/(a+b)

(2ab*(a+b) + a * ( a^2 - b^2 ))/(a+b)^2
(2ab + a * ( a - b ))/(a+b)
(2ab + a^2 - ab )/(a+b)
(ab + a^2 )/(a+b)
a

your EV is exactly the ante, so his is too, and this is just a break-even game. That is, you can't beat him if he's playing optimally, the best you can do is break even.

In fact, if your opponent is playing optimally, you MUST bet if your card is X > c, but if it is less it doesn't matter whether you bet or fold. c has been set so that your EV is the same whether you bet or fold when X < c.

Let's solve for what c needs to be to make this indifference -

fold -
if Y > c
EV is 0
if Y < c he folds too
EV is a (push)

bet -
if Y > c
EV is -b
if Y < c
EV is 2a

EV of fold is :

c*a

EV of betting is :

c*(2a) - (1-c)*b


c*(2a) - (1-c)*b = c*a

c*(a) + c*b = b
c = b/(a+b)

which is just what 'c' is.
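(Editorial aside.) The indifference at the cutoff can be checked with exact arithmetic; this sketch follows the fold/bet EVs derived above, with variable names of my choosing:

```python
from fractions import Fraction as F

def indifference_check(a, b):
    """At the heads-up cutoff c = b/(a+b), a hand of exactly c should be
    indifferent between betting and folding (antes counted as in the pot)."""
    c = F(b, a + b)
    ev_fold = c * a                      # opponent folds too: push, ante back
    ev_bet  = c * (2 * a) - (1 - c) * b  # win both antes, or lose the bet
    return ev_fold, ev_bet

ev_fold, ev_bet = indifference_check(1, 2)
# Both EVs come out equal (2/3 of the ante when a=1, b=2), confirming c = b/(a+b).
```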
  #10  
Old 09-09-2003, 07:47 PM
George Rice
Senior Member
 
Join Date: Oct 2002
Location: Staten Island, NY
Posts: 403
Default Please explain

Why is the probability of everyone folding c^m, where m = n-1? I would think it would be c^n. Please explain this.