Odds of AA running into KK at full table?


arod4276
07-18-2004, 11:58 PM
Could anyone please tell me the odds of this happening, along with how you figured it out? Thank you very much, arod4276

tylerdurden
07-19-2004, 01:14 AM
((4/52*3/51) * 10) * ((4/50*3/49) * 9)

Chances of any one player getting AA = 4/52 * 3/51 = 0.0045
Chances of any of ten players getting AA = 0.0045 * 10 = 0.0045

Chances of any of nine remaining players getting KK after AA is already out of the deck = (4/50 * 3/49) * 9 = 0.0041

Chances of both AA and KK in same round = 0.0045 * 0.0041 = 0.00198 = 0.19%

tylerdurden
07-19-2004, 01:17 AM
BTW, in that 0.19% of hands, sometimes two different players will have AA in addition to the player with KK, sometimes two different players will have KK in addition to the player with AA, and sometimes two players will have AA and two players will have KK.

DMBFan23
07-19-2004, 01:37 AM
I see this as two separate problems.

Problem 1: I have KK (or AA) and I want to know how often someone else has AA (or KK).

Problem 2: What are the odds in general that this will happen to two players?

The math for problem 1 sets up problem 2, so I'll try that one first.

Answer to Problem 1: I'll solve assuming I have KK and I'm curious if someone else has AA (the converse has the same solution)

If I have KK, the odds of one opponent having AA are roughly 1/204 (I'll justify this upon request). To know how often one of multiple opponents has AA, it is easier to find the odds that NO ONE has it. The rest of the time, someone will.

If p(Bullets) = 1/204, then p(Anything Else) = 203/204. So for 9 opponents to all have (Not Bullets), just multiply the odds of each having (Not Bullets). We get (203/204)^9 = 95.7% chance that no one has it, which means that 4.3% of the time we have KK, one of our nine opponents will have AA.

Answer to Problem 2: I thought about a ton of ways to solve this, and I'd love to hear Mason's or others' input on this solution.

Conditional probability says: P(Player 1 has KK and Player 2 has AA) = P(KK given AA) * P(AA).

We know that P (AA) = 1/221.
We also know that P(KK given AA) = 1/204 (from problem 1).

So P(player 1 KK, player 2 AA) is 1/45,084.

Now we need to account for the fact that players 1 and 2 could be any two players at the table. Choosing 2 players from 10 (notice that order matters, since we specify that player 1 has KK and player 2 has AA) can be done in 10*9 = 90 ways (10 candidates for player 1, 9 remaining candidates for player 2). So it seems that at a 10-player table, the probability of KK running into AA is 90/45,084, or about 1/500.
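If you want to check the arithmetic, here's a quick Python sketch of both problems (it just reproduces the approximations above, treating the 9 opponents as independent in problem 1):

from fractions import Fraction

# Problem 1: I hold KK; how often does at least one of 9 opponents hold AA?
p_aa_vs_kk = Fraction(6, 1225)        # 6 AA combos out of C(50,2) remaining hands, ~1/204
p_nobody = (1 - p_aa_vs_kk) ** 9      # approximation: treat the 9 opponents as independent
print(float(1 - p_nobody))            # ~0.043

# Problem 2: some ordered pair (player 1 with KK, player 2 with AA)
p_aa = Fraction(6, 1326)              # 1/221
print(float(10 * 9 * p_aa * p_aa_vs_kk))   # 90 ordered pairs of players, ~0.0020 (about 1/500)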

Fire away, let me know if I made a mistake.

DMBFan23
07-19-2004, 01:38 AM
Dammit, pvn beat me to it while I was making my response look pretty. Our numbers match though.

tylerdurden
07-19-2004, 10:53 AM
Yeah, this morning I realized he might be asking: if he looks at his hole cards and sees AA, what are the chances of someone else having KK? But you covered that possibility as well.

tylerdurden
07-19-2004, 11:02 AM
I had some typos in my numbers:

((4/52*3/51) * 10) * ((4/50*3/49) * 9)

Chances of any one player getting AA = 4/52 * 3/51 = 0.0045
Chances of any of ten players getting AA = 0.0045 * 10 = 0.045

Chances of any of nine remaining players getting KK after AA is already out of the deck = (4/50 * 3/49) * 9 = 0.044 (BTW, this is also the chance that someone else has KK once you look at your cards and see that you have anything other than a king.)

Chances of both AA and KK in same round = 0.045 * 0.044 = 0.00198 = 0.198%
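Same numbers in Python, for anyone who wants to plug them in (this is still the rough version, it just multiplies the two chances together):

p_any_aa = (4/52 * 3/51) * 10    # ~0.045: chance any of ten players gets AA (approximate)
p_any_kk = (4/50 * 3/49) * 9     # ~0.044: chance any of nine others gets KK with two aces gone
print(p_any_aa * p_any_kk)       # ~0.0020, i.e. roughly 0.2%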

newcool
07-19-2004, 12:38 PM
Chance that one person has A,A = [ (4/52) * (3/51)] *10 = 4.5%
Chance that another person has K,K = [ (4/52) * (3/51) ] * 9 =4%

Chance that someone has A,A and K,K = .045 * .04 = .0018 or 18/10,000 = roughly 1/555

tylerdurden
07-19-2004, 03:53 PM
[ QUOTE ]

Chance that another person has K,K = [ (4/52) * (3/51) ] * 9 =4%

[/ QUOTE ]

Once you've already established that one person has AA, there are only 50 unseen cards left, four of which are kings, so it's 4/50 * 3/49.

David Sklansky
07-19-2004, 06:13 PM
Chances of any one player getting AA = 4/52 * 3/51 = 0.0045
Chances of any of ten players getting AA = 0.0045 * 10 = 0.0045

Just for the record this isn't perfectly accurate. See why?

BarronVangorToth
07-19-2004, 07:10 PM
[ QUOTE ]
Chances of any one player getting AA = 4/52 * 3/51 = 0.0045
Chances of any of ten players getting AA = 0.0045 * 10 = 0.0045

Just for the record this isn't perfectly accurate. See why?

[/ QUOTE ]


Don't question the math, David -- ever.

Barron Vangor Toth
www.BarronVangorToth.com
"Hates It When 9 out of 10 have Pocket Aces but he's the 1 out of 10"

George Rice
07-19-2004, 08:58 PM
But the chances of holding AA, an opponent holding KK, and a king flopping seem so much higher.

Cosimo
07-19-2004, 10:06 PM
[ QUOTE ]
Chance that one person has A,A = [ (4/52) * (3/51)] *10 = 4.5%

[/ QUOTE ]

The error that David refers to is here.

The chance that the SB has at least one ace: approx 14%.
Chance that someone at the table has an ace: 14%*10 = 140%.

Cosimo
07-19-2004, 10:34 PM
I tried tackling this a bit earlier, but there's a ton of terms. Since I'm not that concerned about it, I'll just start with easier problems.

Problem 1

Player 1 has aces: 4/52*3/51. Another way to calculate this is that there are 6 sets of paired aces and 1326 starting hands. This gives us the common ratio of 1 in 221 -- how often you're dealt aces.

Problem 2

Say you're at a table with two players. What's the chance that at least one of them has aces?

This is equal to the chance that the first does, plus the chance that the first doesn't but the second does. Note that if the first person does not have an ace, it is more likely that the second one does; the second player isn't drawing out of 52 cards but only 50! For example, say the first player has no aces. The second player has a 4/50*3/49 chance to draw two aces, which is about one in 204.

Chance player 1 has AA: 6/1326 (or 1 in 221)
Chance player 1 has exactly one ace: 2 * 4/52 * 48/51 = A
but player 2 has AA: 3/50*2/49 = B
Chance player 1 has no aces: 48/52*47/51 = C
but player 2 has AA: 4/50*3/49 = D

Total: 6/1326 + A*B + C*D
= .905%

Problem 3

Chance that at least someone at the table has at least one ace: 1-X, where X = chance that no one has an ace. The reason I'm expressing it this way is that it's easier to calculate.

X = chance first person doesn't have an ace, * chance second person doesn't, etc
= 48/52*47/51 * 46/50*45/49 * ... * 30/34*29/33
= (32*31*30*29) / (52*51*50*49)
= 13.3%

Hence, the chance that at least one player has an ace is 86.7%.
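A quick Python check of problems 2 and 3, using exact fractions and the same breakdown as above:

from fractions import Fraction

# Problem 2: at least one of two players has AA
p2 = (Fraction(6, 1326)
      + Fraction(2 * 4 * 48, 52 * 51) * Fraction(3 * 2, 50 * 49)
      + Fraction(48 * 47, 52 * 51) * Fraction(4 * 3, 50 * 49))
print(float(p2))                 # ~0.00905

# Problem 3: at least one of ten players holds an ace
no_ace = Fraction(32 * 31 * 30 * 29, 52 * 51 * 50 * 49)   # the telescoped product above
print(float(1 - no_ace))         # ~0.867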

Prevaricator
07-20-2004, 01:16 AM
If player 1 doesn't get dealt AA, then player 2 is slightly more likely to get dealt AA.

tylerdurden
07-20-2004, 01:49 AM
[ QUOTE ]

Player 1 has aces: 4/52*3/51. Another way to calculate this is that there are 6 sets of paired aces and 1326 starting hands. This gives us the common ratio of 1 in 221 -- how often you're dealt aces.

[/ QUOTE ]

OK, that's what I need. If there are 6 sets of AA, and 1326 total starting hands, the chance that a given player will NOT get AA is 1320/1326 = 0.995475. Therefore, the chance that all ten players will not get AA is 0.995475^10 = 0.95566. So the chance that at least one gets AA is 1 - 0.95566 = 0.04434.

OK. There are 6 sets of KK. There are 1326 starting hands, but one of them is already taken (AA). The chance that a given player will NOT get KK if an AA is already dealt is 1319/1325 = 0.99547. The chances that all nine remaining players will not get KK is 0.99547^9 = 0.95998. So the chance that at least one gets it is 0.04002.

The chances that AA and KK get dealt on the same hand are then 0.04434 * 0.04002 = 0.00177.

Better?

EDIT: BTW, this is pretty much how DMBFan23 figured it, so his answer was better than my original answer.

tylerdurden
07-20-2004, 08:56 AM
[ QUOTE ]
There are 6 sets of KK. There are 1326 starting hands, but one of them is already taken (AA). The chance that a given player will NOT get KK if an AA is already dealt is 1319/1325 = 0.99547.

[/ QUOTE ]

Hmm, actually, there's only 1225 possible hands left once we take AA out ((50*49)/2). Of those, six are KK, so 1219/1225 = 0.99510. (1219/1225)^9 = 0.95677. So the chance one gets KK is 0.04322

0.04434 * 0.04322 = 0.001916.
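In Python, for reference:

p_aa_none = (1320/1326) ** 10    # all ten players miss AA
p_kk_none = (1219/1225) ** 9     # all nine other players miss KK once two aces are removed
print((1 - p_aa_none) * (1 - p_kk_none))   # ~0.00192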

aloiz
07-20-2004, 01:07 PM
OK, that's what I need. If there are 6 sets of AA, and 1326 total starting hands, the chance that a given player will NOT get AA is 1320/1326 = 0.995475. Therefore, the chance that all ten players will not get AA is 0.995475^10 = 0.95566. So the chance that at least one gets AA is 1 - 0.95566 = 0.04434.

The event that each person is not dealt AA is not independent of the event that the previous person is dealt AA, therefore you cannot just multiply. To figure out an exact answer for the odds that at least one person at a 10 person table gets AA, you need to use the inclusion exclusion principle, or else you will double count the times that two players get dealt AA.

The odds that at least one player at a ten person table gets dealt AA is 10 * C(4,2)/C(52,2) - C(10,2)/C(52,4) = .045082
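In Python, with math.comb:

from math import comb

# inclusion-exclusion: add the ten single-player terms,
# then subtract the double-counted deals where two players both get AA
p = 10 * comb(4, 2) / comb(52, 2) - comb(10, 2) / comb(52, 4)
print(p)   # ~0.045082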

aloiz

Cosimo
07-20-2004, 03:36 PM
Yeah, what aloiz said.

[ QUOTE ]
Hmm, actually, there's only 1225 possible hands left once we take AA out ((50*49)/2). Of those, six are KK, so 1219/1225 = 0.99510.

[/ QUOTE ]

Correct. This means there is about a 1 in 204 chance that a given player has KK once you know that two aces have been removed from the deck.

[ QUOTE ]
(1219/1225)^9 = 0.95677. So the chance one gets KK is 0.04322

[/ QUOTE ]

Nope, this doesn't work. If someone has AA, then the chance that one particular other player does NOT have KK is 1219/1225. But if that player doesn't have KK, then the chance that the next person does goes up, so you can't just raise it to the ninth power. Do you see why?

There's probably a simple way of determining whether an AA and a KK are at the table, but it requires some statistics formula that I don't remember any more. Alternatively, you can just write out the billion forms:

Person 1 has AA: 100%
-Person 2 has KK: x
-Person 2 has one king: y
--Person 3 has KK: z
--Person 3 has one king: w
---Person 4 has KK: v
---Person 4 has no kings: u
----Person 5 has KK: t
----Person 5 has no kings: o
-----Person 6 has KK: n
-----Person 6 has no kings: m
------Person 7 has KK: l
------Person 7 has no kings: k
-------Person 8 has KK: j
-------Person 8 has no kings: i
--------Person 9 has KK: h
--------Person 9 has no kings: g
---------Person 10 has KK: f
--Person 3 has no kings: e
---Person 4 has KK: d
---Person 4 has one king: c
----Person 5 has KK: b
----Person 5 has no kings: a
etc.

Keep going! Note that I stopped giving away one king after two had been found because if three have been given away, then no one can have two. Multiply and add all those up!
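Or, if you'd rather not multiply and add all those branches by hand, here's a rough Monte Carlo sketch in Python that estimates the same thing by dealing a few hundred thousand random rounds (an estimate only, not an exact figure):

import random

RANK_A, RANK_K = 12, 11

def deal_has_aa_and_kk(players=10):
    deck = [(rank, suit) for rank in range(13) for suit in range(4)]
    random.shuffle(deck)
    hands = [deck[2 * i:2 * i + 2] for i in range(players)]
    has_aa = any(a[0] == RANK_A and b[0] == RANK_A for a, b in hands)
    has_kk = any(a[0] == RANK_K and b[0] == RANK_K for a, b in hands)
    return has_aa and has_kk

trials = 200_000
hits = sum(deal_has_aa_and_kk() for _ in range(trials))
print(hits / trials)   # should land somewhere near 0.00198 (roughly 1 in 505)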

aloiz
07-20-2004, 05:06 PM
At a ten-person table, the odds that we have KK up against AA preflop = (odds that at least 1 person has AA) * (odds that at least one person has KK given that at least one person has AA). However, notice that the two events are not totally independent, as the number of people holding AA (1 or 2) affects the odds that at least one player holds KK.

As previously mentioned the odds that at least one person has AA = 10 * C(4,2)/C(52,2) - C(10,2)/C(52,4) ~= .045082

Given that we know that at least 1 person has AA we can figure out the odds that someone has KK.

case 1: only 1 person has AA
odds of at least one person having KK = 9*C(4,2)/C(50,2) - C(9,2)/C(50,4) ~= .0439

case 2: two people have AA
odds of at least one person having KK = 8*C(4,2)/C(48,2) - C(8,2)/C(48,4) = .0424

So we know that at least one person has AA. Given that fact, we can figure out how often each of the previous two cases will occur.
Case 1 ~= .993
Case 2 ~= .007 ( 9/C(50,2) )

So multiplying and adding everything up gives us .045082 * ( (.993 * .0439) + (.007 * .0424) ) = .001980
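The same thing in Python, if you want to check it:

from math import comb

p_aa = 10 * comb(4, 2) / comb(52, 2) - comb(10, 2) / comb(52, 4)   # at least one AA
p_kk_1 = 9 * comb(4, 2) / comb(50, 2) - comb(9, 2) / comb(50, 4)   # case 1: one AA removed
p_kk_2 = 8 * comb(4, 2) / comb(48, 2) - comb(8, 2) / comb(48, 4)   # case 2: two AA removed
w2 = 9 / comb(50, 2)                                               # weight given to case 2
print(p_aa * ((1 - w2) * p_kk_1 + w2 * p_kk_2))                    # ~0.001980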

aloiz

tylerdurden
07-20-2004, 05:32 PM
[ QUOTE ]
you need to use the inclusion exclusion principle, or else you will double count the times that two players get dealt AA.

[/ QUOTE ]

If I had double-counted some events, my number would have been *higher* than yours, but mine is lower.

tylerdurden
07-20-2004, 05:33 PM
[ QUOTE ]
Nope, this doesn't work. If someone has AA, then the chance that one particular other player does NOT have KK is 1219/1225. But if that player doesn't have KK, then the chance that the next person does goes up, so you can't just raise it to the ninth power. Do you see why?

[/ QUOTE ]

No, I don't see it. If we were looking at each player's hand as it came out, then dealing the next one, yeah, this would be right. But I'm assuming we're dealing all ten hands simultaneously, like at a normal table.

aloiz
07-20-2004, 05:57 PM
Yeah, the number you got was lower because you overestimated the odds of not getting AA. To figure out the exact odds that all the people at the table do not get AA you need to take P(p1 not have AA) * P(p2 not have AA|p1 not have AA) * P(p3 not have AA|p2 and p1 not have AA) etc...

As you continue, conditioning on the fact that x people don't have AA decreases the odds that the current player does not have AA. Thus your number was slightly high, making the probability that at least one player has AA slightly low.

Note that you could also use inclusion/exclusion instead of conditional probability to figure out the exact odds. P(at least one person has AA) = 10 * P(a given person has AA) - C(10,2) * P(two given people both have AA); the higher-order terms are zero because at most two players can hold AA. P(no one has AA) is then just 1 minus that.

aloiz

tylerdurden
07-20-2004, 06:34 PM
[ QUOTE ]
As you continue, conditioning on the fact that x people don't have AA decreases the odds that the current player does not have AA.

[/ QUOTE ]

I still think this is the wrong way to figure this. All hands are dealt (effectively) simultaneously. I don't know what the guy in front of me has before I pick up my cards, so I can't calculate based on what he has or doesn't have, so for computational purposes, his hole cards are effectively the same as the ones still in the deck (i.e. unknown).

A given player's chances of getting AA in the hole on any given hand are always 6/((52*51)/2), no matter how many people are playing. It doesn't matter what position he's in, or how many people get their cards before or after him.

aloiz
07-20-2004, 08:44 PM
You're right in the sense that if you take any specific player at a ten person table the odds that that person is dealt AA is 1/221. However if you now multiply that number by 10 to calculate the odds that at least one player at a ten person table gets AA you are double counting the times that 2 people get AA.

Here's a simplified version. Say you have a four-card deck, As Ah 2s 2h, and you deal 1 card to each of two players. The odds that any given player gets an ace would be 1/2, but the odds that at least one player gets an ace is not 2 * 1/2 = 1, because both twos can be dealt. To figure out the odds that at least one player gets dealt an ace, we take 2 * the odds that a given player is dealt an ace and subtract the odds that both players get an ace. The odds that both players get an ace is 1/6, so 2 * 1/2 - 1/6 = 5/6. Now we can check this because we know the only time we don't deal an ace is when both twos are dealt. The odds that both twos are dealt would be 1/2 * 1/3 = 1/6.
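Here's a brute-force check of that toy example in Python:

from itertools import permutations

deck = ['As', 'Ah', '2s', '2h']
deals = list(permutations(deck, 2))             # ordered deals of one card to each of 2 players
p_at_least_one_ace = sum('A' in a or 'A' in b for a, b in deals) / len(deals)
p_both_aces = sum('A' in a and 'A' in b for a, b in deals) / len(deals)
print(p_at_least_one_ace)   # 5/6
print(p_both_aces)          # 1/6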

hope that makes sense.

aloiz

Cosimo
07-20-2004, 09:02 PM
The chance that you have an ace is 14%. The chance that the next guy has an ace is 14%. The chance that someone at the table has an ace: is it 10x14%? What does it mean for there to be a 140% chance that someone has an ace? That's meaningless. That means that the chance that no-one has an ace is -40%, which is again absurd. You can't multiply probabilities for dependent events; only for independent events.

Once you've given a card to one person you can't give it to anyone else. The cards that you hold are not independent of what the next guy holds, or else it would be possible for the entire table to have aces. That's 20 aces. It doesn't matter if you simultaneously distribute the cards, because you still can't give the same card to two people.

One of the tricks of statistics is to take a bunch of simultaneous but dependent events and serialize them. This is not an approximation--it's in fact the recommended way of breaking down complex problems. So whether we say that these events are simultaneous or not isn't really the point. More appropriately, if the dealer gave a pair of kings to one player, it is less likely that he "at the same time" gave a pair of kings to another player.

Take all the bazillions of combinations of hands that can be dealt to ten people. What fraction contains at least one pair of kings? That's X.

Take the subset of those bazillions that includes at least one pair of kings. What fraction contains a second pair of kings? This fraction will be smaller than X--there's only two more kings left, yet every other card is still possible.

tylerdurden
07-20-2004, 10:50 PM
[ QUOTE ]
However if you now multiply that number by 10 to calculate the odds that at least one player at a ten person table gets AA you are double counting the times that 2 people get AA.

[/ QUOTE ]

Oh, you're talking about the first calculation I did? I already admitted that was wrong and submitted a different one. And even then, in my first calculation, I didn't count anything double, I just plain did it wrong (see Cosimo's explanation of why it was wrong).

tylerdurden
07-20-2004, 10:50 PM
[ QUOTE ]
The chance that you have an ace is 14%. The chance that the next guy has an ace is 14%. The chance that someone at the table has an ace: is it 10x14%? What does it mean for there to be a 140% chance that someone has an ace? That's meaningless.

[/ QUOTE ]

Uh, yeah. I got that part, that's why I re-did my answer. Didn't you read my 2nd calculation?? At least you have the right reason why my 1st attempt was wrong.

aloiz
07-21-2004, 12:52 AM
I thought I answered your latest calculation. However, the example in my previous post should explain why both methods, whether you calculate the odds that someone gets AA and multiply by 10, or you calculate the odds that no one gets AA and raise it to the tenth power, are approximations and not exact. If you use 10 * 1/221 as the odds that at least one person gets AA preflop, you're double counting the times two people get AA. If you use (1320/1326)^10 as a way to calculate the odds that no one gets AA preflop, this is also wrong, as you do not have 10 independent events.

If you go back to the four-card deck example, your way of calculating the odds that no one is dealt an ace preflop would be (1/2)^2 = 1/4. However, the actual odds that no one gets an ace are 1/2 * 1/3 = 1/6.

If I'm still missing your calculation then I apologize, but none of your calculations that I’ve seen are exact.

aloiz

DMBFan23
07-21-2004, 01:12 AM
I would think that the probability of being dealt (not AA) IS independent, since the cards are being dealt simultaneously and assessed "instantaneously" if that makes sense.

IMO, it's the same principle as not counting the expected value of outs that are dead in other people's hands when calculating pot odds...you just ignore their cards, since we cannot know what they have.

So returning to our example, we can only know, given the two cards we are considering, the odds that they are not AA. Only when we assume something as a stipulation does conditional probability apply (i.e. given that x, y happens so often...). From there, finding p', where p' = the odds no one has AA, 1 - p' gives us the odds of someone having AA. This also takes care of all double counting, since we are finding the following probability: "it is not the case that no one has AA." This could be one or two players with AA.

Cosimo
07-21-2004, 01:28 AM
[ QUOTE ]
I would think that the probability of being dealt (not AA) IS independent, since the cards are being dealt simultaneously and assessed "instantaneously" if that makes sense.

[/ QUOTE ]

This is one of the things that statistics teachers try to drill out of you. If what happens in one event affects what is possible in another event (even if that second event is 'simultaneous'), then the two events are dependent.

Strictly, there's no way to deal a card to one player but still have the card in the deck to give to the second player. If you give a non-A to one player, you can't give the same card to the next one.

That's the definition of dependence.

aloiz
07-21-2004, 01:59 AM
[ QUOTE ]
I would think that the probability of being dealt (not AA) IS independent, since the cards are being dealt simultaneously and assessed "instantaneously" if that makes sense.

[/ QUOTE ]

As Cosimo stated, this is incorrect. Again, look at my previous example with the four-card deck if you need further clarification.

[ QUOTE ]

IMO, it's the same principle as not counting the expected value of outs that are dead in other people's hands when calculating pot odds...you just ignore their cards, since we cannot know what they have.

[/ QUOTE ]

The reason why you ignore burn cards and other people's hole cards when you deal with outs is that your outs are just as likely to be the burn card or in someone's hole cards as they are to be in the remaining deck. Again, to illustrate, let's take a simple example. A deck of As, Ah, 2s, 2h. You and your opponent are dealt a single card and a single card is turned up. What is the probability, given that your down card is the As, that the card turned up makes you a pair? Well, you have 1 out among the three unseen cards, so you calculate your odds as being 1/3. We can figure this out another way by separating into two cases. The first case is that your opponent has the other ace, leaving you with 0 outs. The second case is that your opponent does not have the other ace. Case 1 occurs 1/3 of the time. Case 2 occurs 2/3 of the time. So 1/3 * 0 = 0 (probability that our opponent has the other ace times the probability that we make our hand) and 2/3 * 1/2 = 1/3 (probability that the ace is still in the deck times the probability that we make our hand). Adding the two together we get 1/3.
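Again, easy to check by brute force:

from itertools import permutations

remaining = ['Ah', '2s', '2h']               # your down card is the As; three cards are unseen
deals = list(permutations(remaining, 2))     # (opponent's down card, up card)
p_pair = sum(up == 'Ah' for opp, up in deals) / len(deals)
print(p_pair)   # 1/3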

[ QUOTE ]

So returning to our example, we can only know, given the two cards we are considering, the odds that they are not AA. only when we assume something as a stipulation does conditional probability apply (ie given that x, y happens so often...). from there finding p', p' = odds no one has AA, 1-p gives us the odds of someone having AA. this also takes care of all double counting, since we are finding the following probability: "it is not the case that no one has AA." this could be one or two players with AA.

[/ QUOTE ]

Not sure if I completely understand you here, but you can figure out the odds that no one has AA preflop, or that at least one person has AA preflop, using conditional probability. To figure out the odds that no one has AA: P(p1 not have AA) * P(p2 not have AA | p1 not have AA) * P(p3 not have AA | p1 and p2 not have AA) etc...

Note that P(p1 not have AA) = 220/221
P(p2 not have AA | p1 not have AA) = P(p1 has exactly one ace | p1 not AA) * P(p2 not AA | p1 has one ace) + P(p1 has no aces | p1 not AA) * P(p2 not AA | p1 has no aces)

This gets messy very quickly, but as you can see we are accounting for all possibilities, and each event (p1 has one ace, p1 doesn't have an ace) affects the odds that the next player has AA. If that doesn't make sense, go back and work through simpler examples using smaller decks.

aloiz

pzhon
07-21-2004, 07:30 AM
Let P11 be the probability that a particular pair of players has AA and KK.

Let P21 be the probability that a particular triple of players has AA, AA, and KK or AA, KK, and KK.

Let P22 be the probability that four particular players have AA, AA, KK, and KK.

The probability that some 4 players have AA, AA, KK, and KK is (10 choose 4)*P22 = Q22.

The average number of triples of players with 3 high pocket pairs is (10 choose 3)*P21. However, every time 4 players have high pocket pairs, there are 4 triples of players with 3 high pairs. The probability that exactly 3 players have high pocket pairs is (10 choose 3)*P21 - 4 Q22 = Q21.

The average number of pairs of players with AA and KK (not including AA vs. AA or KK vs. KK) is (10 choose 2)*P11. Every time there are exactly 4 players with high pocket pairs, there are 4 pairs of players with AA vs. KK. Every time there are exactly 3 players with high pairs, there are 2 pairs of players with AA vs. KK. The probability that exactly one pair of players has AA vs. KK is (10 choose 2)*P11 - 4 Q22 - 2 Q21 = Q11.

The probability that at least one pair of players has AA vs. KK is Q11 + Q21 + Q22.

Evaluating:

P11 = (2)(4C2)(4C2)/(52C2,2,48) = 12/270,725
Choose who has the aces, which aces, and which kings.

P21 = (2)(3)(4C2)(4C2)/(52C2,2,2,46) = 3/25,448,150
Choose whether aces or kings are odd, choose who gets the odd pair, choose the odd pair, and choose the division of the other 4.

P22 = (4C2)(4C2)(4C2)/(52C2,2,2,2,44) = 1/8,779,611,750
Choose which players get the aces, choose the division of aces, and choose the division of kings.

The multinomial coefficient (A choose B, C, D, ...) means A!/(B!C!D!...).

Q22 = 1/41,807,675
Q21 = 4,112 / 292,653,725
Q11 = 575488 / 292,653,725

Q11 + Q21 + Q22 = 82,801 / 41,807,675 = 0.0019805215190751458913... = 1/504.91751307351360491...

This calculation is related to something called Möbius inversion, a generalization of inclusion-exclusion.
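Here is the same computation in exact arithmetic in Python (the multinomial helper is written out by hand, since the standard library only provides comb):

from fractions import Fraction
from math import comb, factorial

def multinomial(*parts):
    # (sum of parts) choose parts[0], parts[1], ...  =  n! / (parts[0]! * parts[1]! * ...)
    n = sum(parts)
    out = factorial(n)
    for p in parts:
        out //= factorial(p)
    return out

P11 = Fraction(2 * comb(4, 2) * comb(4, 2), multinomial(2, 2, 48))
P21 = Fraction(2 * 3 * comb(4, 2) * comb(4, 2), multinomial(2, 2, 2, 46))
P22 = Fraction(comb(4, 2) ** 3, multinomial(2, 2, 2, 2, 44))

Q22 = comb(10, 4) * P22
Q21 = comb(10, 3) * P21 - 4 * Q22
Q11 = comb(10, 2) * P11 - 4 * Q22 - 2 * Q21

total = Q11 + Q21 + Q22
print(total, float(total))   # 82801/41807675, ~0.0019805 (about 1 in 505)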

pzhon
07-21-2004, 07:54 AM
[ QUOTE ]

case 1: only 1 person has AA
...

case 2: two people have AA
...

So we know that at least one person has AA. Given that fact, we can figure out how often each of the previous two cases will occur.
Case 1 ~= .993
Case 2 ~= .007 ( 9/C(50,2) )


[/ QUOTE ]

Your overall method is OK, but this part is wrong.

P(Case 2) = 45/270725
P(Case 1) = 10/221 - 90/270725

P(Case 1 or Case 2) = 10/221 - 45/270725
So, the conditional probability P(Case 2 given (Case 1 or Case 2)) = 45/270725 / (10/221 - 45/270725) = 0.00368701...

This is not 9/(50C2)=.007. The conditional probability that two players have aces given that you have aces is not the conditional probability that two players have aces given that someone has aces.

A more concrete example: Suppose you are one of 6 friends who visit a restaurant on two days each week. Everyone comes on Sunday, and then each friend comes on a different day of the week. The probability that everyone is there given that you are there is 1/2. The probability that everyone is there given that at least one friend is there is 1/7, even though you are no different from any other friend.

Why is this? Conditioned on Case 2, the probability that you have aces is 2/10. Conditioned on Case 1, the probability that you have aces is 1/10. You are more likely to have aces if 2 people have aces than if just 1 person has aces, so using the condition that you have AA makes Case 2 more likely.

The reason this didn't make much of a difference in your answer is that you were taking a weighted average over two terms that were very close anyway, so the fact that one of your weights was off by .00366 caused an error smaller than 1 part in 1000.

DMBFan23
07-21-2004, 08:56 AM
Yeah, thinking about it more, you would have to factor in the lack of replacement inherent in dealing cards. So there would be about a million conditional probabilities involved here. This is fun...

Although for practical purposes, my original answer is about 1/501, and pzhon's was about 1/505, so I think 1/500 is a sufficient estimate.

tylerdurden
07-21-2004, 09:48 AM
[ QUOTE ]

This is one of the things that statistics teachers try to drill out of you. If what happens in one event affects what is possible in another event (even if that second event is 'simultaneous'), then the two events are dependent.

Strictly, there's no way to deal a card to one player but still have the card in the deck to give to the second player. If you give a non-A to one player, you can't give the same card to the next one.

That's the definition of dependence.

[/ QUOTE ]

OK, so you're saying that my chances, before any cards are dealt, of getting AA are different if I'm facing two other players than if I'm facing nine other players? From my perspective, it doesn't matter if the cards that are not in my hand are in the deck or on the table. Either they're in my hand or they're not.

Now, if you're watching each card come out and recalculating after each card, then you're right, the events are dependent. But that wasn't what the original question asked.

aloiz
07-21-2004, 11:04 AM
Thanks, that makes sense, and I understand your method as well. However, I'm still not getting your answer exactly.

x = 45/270725 / (10/221 - 45/270725)
y = 8*C(4,2)/C(48,2) - C(8,2)/C(48,4)
z = 9*C(4,2)/C(50,2) - C(9,2)/C(50,4)
10/221-90/270725 * (x * y + (1-x) * z) = .001980017...

aloiz

aloiz
07-21-2004, 11:27 AM
[ QUOTE ]
[ QUOTE ]

This is one of the things that statistics teachers try to drill out of you. If what happens in one event affects what is possible in another event (even if that second event is 'simultaneous'), then the two events are dependent.

Strictly, there's no way to deal a card to one player but still have the card in the deck to give to the second player. If you give a non-A to one player, you can't give the same card to the next one.

That's the definition of dependence.

[/ QUOTE ]

OK, so you're saying that my chances, before any cards are dealt, of getting AA are different if I'm facing two other players than if I'm facing nine other players? From my perspective, it doesn't matter if the cards that are not in my hand are in the deck or on the table. Either they're in my hand or they're not.

[/ QUOTE ]

This is not what he's saying. You keep worrying about the odds from a given person's perspective. We are not concerned with ten individual perspectives, but with the group as a whole. Since the question is concerned with the entire table, you need to be assessing all ten individual events (a player being dealt, or not being dealt, AA). However, these individual events MUST be treated as dependent on each other, since all ten hands are dealt from the same deck.

Here's a simple example of the principle that he's trying to illustrate. You have an urn with 2 red and 3 blue balls, for a total of five balls. You select two balls at random simultaneously from the urn. What is the probability that you select two red balls? If we were to select a single ball, the probability that it is red would be 2/5. However, the probability of selecting two red balls is not (2/5)^2, because the events are not independent even though they occur at the same time. We have C(5,2) = 10 total ways to choose two balls, and C(2,2) = 1 way to choose 2 red balls, so our answer is (# of ways to choose 2 reds) / (total number of ways to choose two balls) = 1/10.
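The urn example, verified by just listing every possible pair:

from itertools import combinations

balls = ['R', 'R', 'B', 'B', 'B']
pairs = list(combinations(range(len(balls)), 2))     # the 10 equally likely pairs of positions
both_red = sum(balls[i] == 'R' and balls[j] == 'R' for i, j in pairs)
print(both_red / len(pairs))   # 1/10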

aloiz

BruceZ
07-21-2004, 01:18 PM
[ QUOTE ]
If player 1 doesn't get dealt AA, then player 2 is slightly more likely to get dealt AA.

[/ QUOTE ]

You are saying that the events of players being dealt AA are not independent. That is true, but it is NOT the reason why multiplying by 10 is not exact. The reason is that more than one player can have AA. In other words, the events of players being dealt AA are not mutually exclusive. When we multiply by 10, we double count all the hands where 2 players hold AA.

To be exact, we have to subtract off the small probability that 2 players hold AA. The probability that two specific players hold AA is 1/C(52,4). There are C(10,2) = 45 ways to pick 2 players out of 10. Now notice that these 45 pairs are mutually exclusive since only 1 pair can have the 4 aces. Therefore, the probability that 2 players have AA is exactly 45/C(52,4). The probability of at least 1 player having AA is exactly 10*6/C(52,2) - 45/C(52,4) =~ 4.51%.

Your answer would correctly explain why the approximation 1 - (220/221)^10 = 4.43% is not exact. Multiplying these probabilities requires that the hands be independent.

It is possible for events to be mutually exclusive without being independent. For example, if you hold AA, the probability that a specific opponent holds AA with you is 1/1225, and the probability that one of your 9 opponents holds AA is exactly 9/1225. This is exact because only 1 opponent can have AA. The events of different opponents holding AA are mutually exclusive, even though they are not independent.

pzhon
07-21-2004, 02:31 PM
There was a 90 that should have been a 45, but correcting that didn't fix the problem.

I think the problem is as follows: Given that precisely one player has AA, you need the conditional probability that at least one player has KK. This is not the same as the conditional probability that at least one player has KK given that one particular player has AA, but the latter is what you have computed. You haven't used the information that there isn't a second AA. It might not seem to make any difference, but it does.

P(at least one KK | BB has AA) = 9*(4C2)/(50C2) - (9C2)/(50C4) = 0.0439253

P(at least one KK | exactly one AA) = (Q11+Q21/2)/P(exactly one AA) = 0.0439365

(with Q11 and Q21 as defined in my solution)

With the latter value, I get 82,801 / 41,807,675 again. I don't know of a good way to get the necessary conditional probability directly, without computing Q22, Q21, and Q11.
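For reference, both conditional probabilities in Python, plugging in the Q values from my earlier solution:

from fractions import Fraction
from math import comb

# P(at least one KK | one particular player has AA)
p_given_particular = Fraction(9 * comb(4, 2), comb(50, 2)) - Fraction(comb(9, 2), comb(50, 4))
print(float(p_given_particular))   # ~0.0439253

# P(at least one KK | exactly one AA), built from Q11 and Q21 above
Q11 = Fraction(575488, 292653725)
Q21 = Fraction(4112, 292653725)
p_exactly_one_aa = Fraction(10, 221) - Fraction(90, 270725)
print(float((Q11 + Q21 / 2) / p_exactly_one_aa))   # ~0.0439365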

aloiz
07-21-2004, 03:54 PM
Yeah, the 90 was a typo. And I understand how including the possibility of a second AA when computing P(at least 1 KK | only 1 AA) screws things up. I'm sure there's a way to figure it out using conditional probability, but now that I see how complicated it is I like your solution more and more, which by the way was very clear and easy to understand.

thanks,
aloiz