blackjack


blortch1
12-08-2003, 03:54 AM
Has anyone worked out the formula for the probability of winning x hands before losing 2x hands in any number of deals (assuming an equal chance of winning or losing)? For example, what is the probability one would win 4 hands before losing 8 hands in 12 deals, 24 deals, x deals?

SevenStuda
12-08-2003, 01:10 PM
I haven't.

bigpooch
12-08-2003, 07:55 PM
To determine the chance of winning at least 1/3rd of the total hands, it would be just 1 - P(winning less than 1/3rd). There is an approximation to the binomial distribution using the standard normal curve, which gets better as the number of trials becomes very large. For the exact answer, the chance of not winning at least 1/3rd of the hands in 3m trials (m an integer >= 1) is found by summing the combinatorial numbers C(3m,k) for k from 0 to m-1, then dividing that sum by 2^(3m).

For example, with m=4 or 12 deals, you simply sum

C(12,0) = 1
C(12,1) = 12
C(12,2) = 66
C(12,3) = 220

giving the sum of 299.

2^12 = 4096, so the chance of NOT winning at least 1/3rd of the hands in this case is 299/4096, which leaves a 3797/4096 chance of winning at least 1/3rd of the hands.
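For anyone who wants to plug in other values of m, here is a minimal Python sketch of that calculation (the function name is just for illustration); for m = 4 it reproduces the 299/4096 and 3797/4096 figures above, assuming fair 50/50 hands with pushes ignored.

from math import comb

def prob_win_at_least_third(m):
    # P(at least m wins in 3m hands), each hand a fair coin flip
    deals = 3 * m
    fewer_than_m_wins = sum(comb(deals, k) for k in range(m))
    return 1 - fewer_than_m_wins / 2**deals

print(prob_win_at_least_third(4))   # 3797/4096 ~ 0.927 for 12 deals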

ChipWrecked
12-09-2003, 02:10 PM
I'm a math idiot (business major). Is the 1/3 because of the possibility of pushes?

BruceZ
12-09-2003, 03:44 PM
Pooch, you're answering a different question. He wants to know the probability of winning 4 before losing 8. That's not the same as winning at least 4 out of 12, because it depends on which games you win. For example, there are not C(12,n) ways to win n games, since once you lose 8 or win 4, it's over, and it doesn't matter what happens in the remaining games, so there are more ways to do this out of 2^12.

Here is the correct solution.

I'm ignoring pushes and taking P(win) = P(lose) = 1/2.

Count the successful cases out of 11 games. Note that if we get the 4th win on the nth game, the nth game must be a win, so we just have to count the number of ways to choose the 3 winning games out of the first n-1, and multiply by 2^(11-n) possible results for the remaining 11-n games, which don't matter.

4 of 4: C(3,3)*2^7
4 of 5: C(4,3)*2^6
4 of 6: C(5,3)*2^5
4 of 7: C(6,3)*2^4
4 of 8: C(7,3)*2^3
4 of 9: C(8,3)*2^2
4 of 10: C(9,3)*2^1
4 of 11: C(10,3)*1
--------------------------
total = 1816.

1816/2^11 = 88.7%. Note that this is the same regardless of the number of deals if more than 11, since we either win 4 or lose 8 out of the first 11.

It would have been easier to compute the probability of losing 8 before winning 4 and subtract from 1, since this would only have 4 terms to sum. We can do that as a check to make sure these account for all the remaining cases out of 2^11 possibilities. Note that this is the same as the last 4 terms above with 3 replaced by 7:

8 of 8: C(7,7)*2^3
8 of 9: C(8,7)*2^2
8 of 10: C(9,7)*2^1
8 of 11: C(10,7)*1
--------------------------
total = 232.

232 + 1816 = 2048 = 2^11, so this accounts for all the cases.

In general, the probability of winning x games before losing 2x games is:

sum{n = x to 3x-1} C(n-1,x-1)*2^(3x-1-n) / 2^(3x-1).
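Here is the same thing as a short Python sketch of that general formula (the function name is mine, just for illustration); for x = 4 it returns 1816/2048 = 0.88671875, matching the count above.

from math import comb

def prob_win_x_before_losing_2x(x):
    # P(reach x wins before 2x losses), each hand a fair coin flip
    games = 3 * x - 1                  # the outcome is always decided within 3x-1 hands
    favorable = sum(comb(n - 1, x - 1) * 2**(games - n) for n in range(x, games + 1))
    return favorable / 2**games

print(prob_win_x_before_losing_2x(4))  # 0.88671875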

squiffy
12-09-2003, 04:03 PM
Serious question. How is it helpful to know the answer to this probability question? Are there any practical consequences, e.g., how much money you should wager, how large your bankroll should be, or how many hands you should play to clear a bonus?

I assume that without a bonus, you should never play even a single hand of blackjack, which I have heard is negative EV.

BruceZ
12-10-2003, 03:48 AM
I think the poster might be trying to ask "what is the probability that if a person plays this game forever, that his number of losses never exceeds twice his number of wins?" If this was the question, the example wasn't worded properly. This is a classic problem, and it is related to random walks, and the gambler's ruin problem.

Here is the solution to that problem.

Suppose he starts with 1 dollar. Suppose that when he wins, he wins 2 dollars, and when he loses, he loses 1 dollar. Then if he ever has at least twice as many losses as wins, he will be broke. Call this probability p. Then we can write:

p = 1/2 + 1/2 * p^3

This is the standard trick for gambler's ruin problems. It says he can go broke on the first play with probability 1/2, OR he can win on the first play with probability 1/2 and go broke later. If he wins on the first play he will have 3 dollars, and the probability of losing 3 dollars is p^3, since he has to lose 1 dollar 3 times. The solution to the above equation is

p = [sqrt(5)-1] / 2 = 61.8%.

You can verify that this is the solution: since p = [sqrt(5)-1]/2 satisfies p^2 = 1 - p, we have p^3 = p - p^2 = 2p - 1, and 1/2 + (1/2)(2p - 1) = p. This is the probability that he will eventually have twice as many losses as wins, so the probability that he never has twice as many losses as wins is:

1 - [sqrt(5) - 1] / 2 =

[3 - sqrt(5)] / 2 = 38.2%
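As a quick numerical cross-check (a minimal sketch, nothing more), iterating the equation above as a fixed point converges to the same root:

from math import sqrt

p = 0.0
for _ in range(200):
    p = 0.5 + 0.5 * p**3        # the gambler's ruin equation p = 1/2 + (1/2)*p^3

print(p)                        # ~0.6180339887
print((sqrt(5) - 1) / 2)        # exact value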

bigpooch
12-10-2003, 04:35 PM
Right. It is interesting that the golden ratio (or its inverse) appears here! I see that for a player to have at some time not won at least 1/k of the hands, where k is an integer >= 2, the probability is the solution to

p = 1/2 + (1/2)*p^k

but is that also true for any real k > 2?

Nevertheless, good answer to the post, as I agree that you answered the question that was probably asked.

MicroBob
12-10-2003, 06:34 PM
fwiw - BJ is not quite a 50-50 game....i believe these numbers are appropriate to a basic strategy player....

42% - win
49% - loss
9% - push

the player closes the gap towards a 50-50 game with 3:2 BJ payout and double-downs.

i know this isn't really relevant to the question at hand....but i just thought i would nit-pick here and point out that BJ is not exactly a coin-flip game.

otherwise, i can only barely understand what in the world you guys are talking about.

BruceZ
12-11-2003, 05:16 AM
Suppose he starts with 1 dollar. Suppose that when he wins, he wins 2 dollars, and when he loses, he loses 1 dollar. Then if he ever has at least twice as many losses as wins, he will be broke.

He will be broke if he has more than twice as many losses as wins, which is what we want. This is the same as saying that he always wins x times before he loses 2x times as stated in the original problem. He can have x wins and 2x losses as long as he got the x wins first.


p = [sqrt(5)-1] / 2 = 61.8%.

You can verify that this is the solution. This is the probability that he will eventually have twice as many losses as wins, so the probabilty that he never has twice as many losses as wins is:

1 - [sqrt(5) - 1] / 2 =

[3 - sqrt(5)] / 2 = 38.2%

The first probability of 68.2% is that he will never have more than twice as many losses as wins, and so it is 32.8% that he will never have more than twice as many losses as wins.

bigpooch
12-11-2003, 10:33 AM
BruceZ:

You must have been quite tired to have typed the following
in your last post:


"The first probability of 68.2% is that he will never have
more than twice as many losses as wins, and so it is 32.8%
that he will never have more than twice as many losses as
wins."


The percentages are 61.8% and 38.2% respectively; also, the first "never" above should be replaced by "at some time".

If we were to generalize to winning at least 1/k of the hands at all times, where k >= 2 (so that the ratio of losses to wins is never greater than (k-1) to 1), this probability is q, where q = 1-p and p is the nontrivial solution (the trivial root being 1) to

p^k - 2p + 1 = 0.

Clearly, 1 is a root of the above; dividing by p-1 (say, for k an integer), the nontrivial root is the p for which

p^(k-1) + p^(k-2) + ... + p - 1 = 0.

It is also clear from the above that as k becomes larger and larger, the root of the equation becomes smaller and smaller, but that root stays greater than 1/2, since the geometric series 1/2 + 1/4 + ... sums to just 1. Hence, q = 1-p approaches 1/2 from below as k goes to infinity.

The result is well known when k=2 (in a symmetric random walk, the origin is actually reached infinitely often), and for k=3, p is just (sqrt(5)-1)/2, the inverse of the golden ratio.
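Out of curiosity, here is a minimal bisection sketch (the function name is mine) that finds the nontrivial root for a few values of k; for k = 3 it gives (sqrt(5)-1)/2 ~ 0.618, and you can watch q = 1-p creep up toward 1/2 as k grows.

def ruin_root(k, tol=1e-12):
    # nontrivial root of p^(k-1) + p^(k-2) + ... + p - 1 = 0
    f = lambda p: sum(p**i for i in range(1, k)) - 1
    lo, hi = 0.5, 1.0               # the root lies strictly between 1/2 and 1
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for k in (3, 4, 5, 10, 50):
    p = ruin_root(k)
    print(k, round(p, 6), round(1 - p, 6))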

Now, I don't know much about blackjack, but suppose we were to ignore pushes and look at basic strategy. Then there are three possible results (forget about splits, insurance, and doubling down for now!) if the bet size is exactly 1. For the steps of the equivalent random walk:

Loss: -1 with probability a
Blackjack: +3 with probability b
Normal win: +2 with probability c

It cannot be that difficult to solve this once we know a, b, and c. The above would be more complicated if everything for splitting, doubling down, and insurance were also covered, but you get the idea!

Then I suppose you would probably answer the question even a bit more realistically! For the more general k (some integer greater than 3), with r = k-1, the above would be changed by starting with 2 units instead, and the step sizes for the random walk would now be -2, 3r, and 2r respectively.

In any case, I wonder if someone has already solved this problem for BJ, as it cannot be that difficult, and thereby be able to determine BR requirements more accurately. The more complex task would be to incorporate some count system, but then it might be worthwhile to consider a Monte Carlo simulation.
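In that spirit, here is a minimal Monte Carlo sketch of the three-outcome walk; the values plugged in for a, b, and c below are made-up placeholders just to show the mechanics (they are not measured blackjack frequencies), and truncating each trial at a fixed number of steps slightly underestimates the ruin probability.

import random

LOSS, BLACKJACK, WIN = 0.52, 0.045, 0.435   # placeholder guesses for a, b, c; must sum to 1

def ever_ruined(start=1, max_steps=5000):
    # one trial of the walk: -1 on a loss, +3 on a blackjack, +2 on a normal win
    bankroll = start
    for _ in range(max_steps):
        u = random.random()
        if u < LOSS:
            bankroll -= 1
        elif u < LOSS + BLACKJACK:
            bankroll += 3
        else:
            bankroll += 2
        if bankroll <= 0:
            return True
    return False

trials = 50000
print(sum(ever_ruined() for _ in range(trials)) / trials)   # estimate of the ruin probability p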

I recall years ago, some casinos would give limited 7-for-5 coupons for pit games, and if that were the case, k = 2.4; of course, one could only use 5 or 10 coupons per casino and the maximum bet was quite small!

Thoughts?

CCass
12-12-2003, 11:08 AM
If you understand basic strategy (most do not) including doubling down and splitting, the odds with a 3:2 BJ payout are more like:

Win - 48%
Lose - 47%
Push - 5%

Add to that a person that can count cards and varies their bets to correspond with favorable / unfavorable situations, and the percentages are more like:

Win - 52%
Lose - 45%
Push - 3%

I base all of these numbers on the writings of Edwin Silberstang, whose card counting system has served me well.

Chris

Cyrus
12-17-2003, 04:42 PM
"If you understand basic strategy (most do not) including doubling down and splitting, the odds with a 3:2 BJ payout are more like:

Win - 48%
Lose - 47%
Push - 5%"

Even if the Blackjack (natural) pay-outs are 6:5 or even 3:1, the above percentages are not affected. Natural pay-outs have nothing to do with those percentages.

"Add to that a person that can count cards and varies their bets to correspond with favorable / unfavorable situations, and the percentages are more like:

Win - 52%
Lose - 45%
Push - 3%"

I'm sorry but how can the percentages change on the basis of how much I'm betting?? Winning a round is winning a round, period. If I'm playing in every round, the percentages of wins/losses/pushes are not affected.

I think that what you meant to say is that by changing our playing strategy according to the count, we can affect those percentages, which is true enough (though not to the extent mentioned in your post; the overall percentages are nearer to BS).

The counter’s true gain in shoes comes from altering his bet size, as you wrote, according to the count (EV). While the win/loss/push percentages remain on average roughly the same as if the player was following BS, it's knowing when he is the favorite, and therefore when to bet big, that gives the card counter his advantage.

--Cyrus

MicroBob
12-17-2003, 07:07 PM
i don't know whether or not these percentages are correct. i posted the percentages as i knew them.

but, fwiw, card-counting is a little more than just varying one's bets. if you are counting cards then you know when it might be better to not double-down on your 11 vs. 10 or hit on your 16 vs. 10, etc etc. there are strategy adjustments to be made as the deck composition changes but, granted, it will only change your win percentage slightly.

still, 52% wins for a card-counter doesn't make a lot of sense to me. if this were the case, wouldn't the card counter be betting more than a minimum on more than half of the hands??
the house is still at an advantage on a majority of the hands....the card-counter reaps an advantage by stepping up his bet when the deck-composition changes to be in his favor. on a 6-deck shoe he needs a bet spread of 1-to-12 or greater to truly take advantage of the benefits because the percentage of hands in which the player is at a true advantage is too small to be effective at a 1-to-2 or 1-to-3 ratio.
the player being at the advantage on 52% of the hands simply doesn't make sense to me.

again, the numbers i provided are the best as i can remember them....i would not be surprised if they were wrong. i am traveling now but when i return home i will see if i can look up the appropriate percentages in either Stanford Wong's 'Professional Blackjack' or Peter Griffin's 'Theory of Blackjack' and will try getting back to this forum with percentages from the appropriate source.

BruceZ
12-18-2003, 03:27 AM
The probability of winning a hand in BJ is about 43%, and losing is about 48%, and this is almost independent of whether the player is playing basic strategy or counting, and it is even independent of the count, for most realistic counts. See this simulation result (http://www.bjmath.com/bjmath/conseq/streak.htm). Many counters believe that they are more likely to win a hand when the count is high, and this is wrong. Their advantage in EV for high counts comes primarily from the increased chance of blackjacks, doubles, and pair splits, combined with the ability to vary their bet when they have a positive EV. Of course they don't win over 50% of their hands; if they did, they would have an edge even without doubling down, splitting pairs, or varying their bets, and that is nonsense.

BruceZ
12-18-2003, 03:49 AM
You must have been quite tired to have typed the following
in your last post:

"The first probability of 68.2% is that he will never have
more than twice as many losses as wins, and so it is 32.8%
that he will never have more than twice as many losses as
wins."


The percentages are 61.8% and 38.2% respectively; also, the
first "never" above should be replaced by "at some time".


Oops. LOL. Let's try that again:

The first probability of 61.8% is that he will at some point have
more than twice as many losses as wins, and so it is 38.2%
that he will never have more than twice as many losses as
wins.

I agree that we can set up a random walk model of blackjack for basic strategy, but since we know this is a negative expectation game, we really want the bankroll requirements for card counting, and that is difficult to model this way due to variable bet sizes which depend on the count. The bankroll formulas I have given were first derived by the blackjack community, and they are based on a coin flip model which has the same average win and standard deviation as a blackjack game (or poker). Any departure from this model due to the actual nature of the game should be quite minimal. See this article (http://www.bjmath.com/bjmath/sileo/sileo.pdf) for a derivation.
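For what it's worth, that coin-flip model leads to one common form of the risk-of-ruin approximation: treat every hand as winning or losing one standard deviation, with the win probability shifted just enough to give the right EV, and apply gambler's ruin. Here is a minimal sketch of that idea; the EV and SD figures below are placeholders rather than measured blackjack numbers, and the linked article is the place to go for the actual derivation.

def risk_of_ruin(ev_per_hand, sd_per_hand, bankroll):
    # gambler's ruin for a coin flip that wins or loses sd_per_hand with edge ev_per_hand
    edge = ev_per_hand / sd_per_hand
    return ((1 - edge) / (1 + edge)) ** (bankroll / sd_per_hand)

# placeholder example: +0.01 units/hand EV, 1.15 units/hand SD
print(risk_of_ruin(0.01, 1.15, 100))   # roughly 22% with a 100-unit bankroll
print(risk_of_ruin(0.01, 1.15, 400))   # well under 1% with 400 units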

MicroBob
12-18-2003, 04:47 AM
thanks for the info. at least my numbers were pretty close to the mark (and may in fact be correct depending on number of decks).
did your percentages come solely from the simulation (i did not click the link) or is there another source for this info you turned to??