Shoot-out at the OK Corral

LinusKS
11-09-2004, 03:02 AM
Suppose four gun-fighters rob a bank, and decide to split the booty three ways.

The only problem is, they can't decide who gets left out. To settle the matter, Abe suggests they have a shoot-out.

Now these four gun-fighters are good friends, and they all happen to know Abe is the best shot. Bill and Corby are mediocre, and Dan is truly sucky.

Assume they take up positions an equal distance apart, none of them shoots any faster or slower than any of the others, and they all act rationally once the shooting starts.

If Abe can hit a target at that distance with 80% accuracy, Bill and Corby are 60% and 40% accurate respectively, and poor Dan misses 80% of the time, who is most likely to be eliminated? Who is most likely to survive?

11-09-2004, 07:18 AM
"...and they all act rationally once the shooting starts."

Danny-boy. Cuz if he's acting rationally he'll run like hell.

irchans
11-09-2004, 09:23 AM
Assuming:
1) The booty is equally divided among the survivors.
2) The shooting stops as soon as one robber gets shot.
3) "act rationally" means use a Nash equilibrium strategy for the game where the only payoff is percentage of the booty.

Then I think there is only one Nash Equilibrium for the game. Each robber randomly chooses an opponent every round and shoots. Abe is the most likely to survive. Poor Dan is the most likely to die.

If we replace Assumption 2) with

2) The shooting stops when at least 3 robbers are dead.

Then, the problem becomes difficult.
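For what it's worth, the claim for the stop-at-first-blood game is easy to sanity-check by simulation. A minimal Monte Carlo sketch in Python (the names and 80/60/40/20 accuracies are from the original post; uniform-random targeting per assumption 3):

```python
import random

# Accuracies from the original post.
ACC = {"Abe": 0.8, "Bill": 0.6, "Corby": 0.4, "Dan": 0.2}

def one_round(rng):
    """Everyone fires once at a uniformly random opponent.
    Returns the set of shooters hit this round."""
    names = sorted(ACC)
    hit = set()
    for shooter in names:
        target = rng.choice([n for n in names if n != shooter])
        if rng.random() < ACC[shooter]:
            hit.add(target)
    return hit

def simulate(trials=200_000, seed=1):
    """Estimate each robber's chance of dying, with shooting stopping
    the first round anyone is hit (assumption 2)."""
    rng = random.Random(seed)
    deaths = {n: 0 for n in ACC}
    for _ in range(trials):
        hit = one_round(rng)
        while not hit:          # nobody shot: fire another volley
            hit = one_round(rng)
        for n in hit:
            deaths[n] += 1
    return {n: d / trials for n, d in deaths.items()}

p = simulate()
```

With 200,000 trials this gives roughly a 37% death chance for Abe and 51% for Dan, so Abe is indeed the most likely to survive and poor Dan the most likely to die.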

jason1990
11-09-2004, 09:27 AM
[ QUOTE ]
Assume they take up positions an equal distance apart

[/ QUOTE ]

So I guess they're not all on the ground.

LinusKS
11-09-2004, 01:27 PM
[ QUOTE ]
[ QUOTE ]
Assume they take up positions an equal distance apart

[/ QUOTE ]

So I guess they're not all on the ground.

[/ QUOTE ]

No. They're wearing gravity boots.

LinusKS
11-09-2004, 01:32 PM
[ QUOTE ]
Assuming:
1) The booty is equally divided among the survivors.
2) The shooting stops as soon as one robber gets shot.
3) "act rationally" means use a Nash equilibrium strategy for the game where the only payoff is percentage of the booty.

Then I think there is only one Nash Equilibrium for the game. Each robber randomly chooses an opponent every round and shoots. Abe is the most likely to survive. Poor Dan is the most likely to die.

If we replace Assumption 2) with

2) The shooting stops when at least 3 robbers are dead.

Then, the problem becomes difficult.

[/ QUOTE ]

I'm not sure what the "Nash Equilibrium Theory" is - I'll look it up in a moment.

But, isn't there a strategy each of the robbers could use that would maximize his chances of surviving (as opposed to shooting randomly)?

Mike Haven
11-09-2004, 02:49 PM
i think that because they are good friends, greed will not take over, and they will stop shooting after at least one is killed on any volley of shots

because Abe is the best shot, everyone will try to take him out. so on each volley that has to be fired he has a 60% chance of being killed by Bill; in the 40% of the time he's not killed by Bill, he has a 40% x 40% = 16% chance of being killed by Corby, and a 24% x 20% = 4.8% chance of being killed by Dan, for a total 80.8% chance of being killed

Abe will be worried most about being killed by Bill, so he will aim at Bill and will kill Bill 80% of the time

therefore Abe has the most chance of being killed first and will hang up his boots 80.8% of the time

Bill has almost as much chance of being killed first, at 80% of the time

Abe and Bill will both be killed 80% x 80.8% = 64.64% of the time

imo, good old Corby and Dan will survive 100% of the time

jason1990
11-09-2004, 03:20 PM
Well, suppose you're Abe. Whoever you shoot at, if you hit them, it's over. So you don't gain anything by trying to take out one person over another. You can't try to take out Bill or Corby, thinking that after they're dead, you have a better chance of surviving. After they're dead, it's over.

Either someone's dead and it's over or everyone survives and the shooting continues. But if everyone survives, who cares who you shot at originally?

BettyBoopAA
11-09-2004, 09:45 PM
"Assume they take up positions an equal distance apart, none of them shoots any faster or slower than any of the others, and they all act rationally once the shooting starts"

Each person wants to maximize his survival and, being greedy, wants the most people dead. This means the correct game-theory solution for all 4 is to shoot at once. Abe and Bill will shoot at each other at the same time, since each wants to take out the player most able to eliminate him. Since Corby and Dan understand this, their best action is to shoot at each other, which makes Corby the least likely to die, at 20%.

KungFuSandwich
11-09-2004, 09:50 PM
60 will have 80 then 40 and 20 then 20
80 will have 20,40,and 60 then 20,40 then 20
40 will have no one the first round then at worst 80 then 20
20 will have no one the first round no one the second round and then at worst 80 the third

Ah screw it, my head hurts

elwoodblues
11-10-2004, 12:17 PM
[ QUOTE ]
Assume they take up positions an equal distance apart,

[/ QUOTE ]

Is this possible with 4 people? If they arrange themselves in a square:
a b
c d

A is closer to B than he is to D.

TomCollins
11-10-2004, 12:19 PM
One has gravity boots, so they are forming a pyramid.

TomCollins
11-10-2004, 12:21 PM
I'm assuming if two people die, then the loot is split 2 ways instead of 3. And if 3 people die, then one guy gets all the loot. As soon as one person dies, it's over. So there is no real advantage in targeting anyone, since whether you shoot at A and hit or shoot at B and hit, the game is over, and you don't have to worry about being hit.

I can't see how there is a better strategy than shooting randomly.

I thought the problem was shoot until everyone is dead but 1. That is far more interesting unless I am missing something obvious.

elwoodblues
11-10-2004, 12:44 PM
Excellent. I'll take the guy with gravity boots any day of the week.

BettyBoopAA
11-10-2004, 01:39 PM
[ QUOTE ]
I can't see how there is a better strategy than shooting randomly.

[/ QUOTE ]

Put yourself in Abe's shoes: the worst scenario for him is to be shot by someone he could have shot first. With that line of thinking, he will choose to shoot at the best shooter, Bill.

TomCollins
11-10-2004, 02:09 PM

B kills A 60%*33% = 20%
B kills C 60%*33% = 20%
B kills D 60%*33% = 20%

C kills A 40%*33% = 13.3%
C kills B 40%*33% = 13.3%
C kills D 40%*33% = 13.3%

D kills A 20%*33% = 6.7%
D kills B 20%*33% = 6.7%
D kills C 20%*33% = 6.7%

In this case, A should shoot the opponent most likely to live, because that way more people end up dead. The opponent least likely to die is B, since he doesn't have B shooting at him. So A should always shoot B.

But now C figures this out, so he realizes that if he doesn't shoot at B (since A should take care of him 80% of the time), he's better off shooting at just A or D.

This is actually becoming quite interesting. I will have to look at it later.

TomCollins
11-10-2004, 04:33 PM
So A can do better than randomly shooting.

It looks like the maximum EV for every player is reached when each of his opponents has an equal chance of dying. So the overall equilibrium is when everyone dies equally often. This happens when each gunfighter shoots randomly (but not uniformly) at each other fighter. This comes out to roughly a 57% chance of survival for each fighter. However, the EV is about 24% of the loot, because occasionally every fighter will die.

Lesson learned: Share the damn money!
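The ~57% and ~24% figures can be ballparked with a simulation. This sketch uses uniform-random targeting rather than the exact (non-uniform) equilibrium mix, so the per-player numbers differ a little, but the averages land close to the quoted figures:

```python
import random

ACC = [0.8, 0.6, 0.4, 0.2]        # Abe, Bill, Corby, Dan

def shootout(rng):
    """Uniform-random targeting; stop the first round anyone is hit.
    Returns the set of surviving player indices (may be empty)."""
    while True:
        hit = set()
        for i, acc in enumerate(ACC):
            target = rng.choice([j for j in range(4) if j != i])
            if rng.random() < acc:
                hit.add(target)
        if hit:
            return set(range(4)) - hit

def booty_ev(trials=200_000, seed=7):
    rng = random.Random(seed)
    surv = [0] * 4
    share = [0.0] * 4
    for _ in range(trials):
        alive = shootout(rng)
        for i in alive:                  # survivors split the whole booty
            surv[i] += 1
            share[i] += 1 / len(alive)   # if everyone dies, nobody collects
    return [s / trials for s in surv], [s / trials for s in share]

surv, share = booty_ev()
```

Average survival comes out near 56% and the average booty share near 25%, consistent with the rough 57%/24% quoted above.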

HigherAce
11-10-2004, 07:22 PM
Well here's what I'm thinking. If someone shot at me, I don't care how good they are, I would be shooting at them next. Therefore I would think they all would shoot at Dan. That gives the best probability of ending the game there, since they're all good shots, and even if they miss, he's not going to hit any of them. Think about it: if you were holding a gun against 3 people and you knew one of the 3 was a horrible shot, but you only needed one dead to win, what would give you the best chance of ending in the money, so to speak? If you fire at a good shooter and miss, he's not going to ignore a good shot and focus on a crappy one. Why find out which one is a little bit better out of the three, instead of focusing on the one everyone knows is blind? Hence, Dan is dead.

TomCollins
11-10-2004, 07:35 PM
Remember all of our players are playing with perfect strategy, so they don't need to worry about pissing everyone off. They just want to maximize their profit. They maximize their profit when the most people die, so the higher the expected number of deaths, the better. I have not proved this, but I believe your maximum EV happens when all three of your opponents have an equal chance of dying on the next round. Since everyone is attempting to reach this equilibrium, the optimum play happens when everyone has an equal chance of dying. A shoots B x% of the time, C y% of the time, and D z% of the time.

BettyBoopAA
11-10-2004, 08:35 PM
"but I believe your maximum EV happens when all three of your opponents have an equal chance of dying on the next round"
How is that possible? It's not. I believe this problem is a variation of the prisoner's dilemma. The best solution would appear to be for A, B and C to all shoot at D at the same time, which would most likely eliminate D, with a 20% chance that one of the 3 would be killed by D.
The dilemma here is that everyone shoots at the same time, which gets player A thinking that B is better off shooting A, because there would be fewer people to split the money. If everyone shoots at someone different, we can expect 2 people to die on average (with a small chance that all are dead). Since no one trusts anybody else, I don't see how A wouldn't shoot at B. B understands this, and his best chance of survival is to shoot at A, since his chance of killing A is higher than the other players' chances, and his best hope is that A misses him. B can't risk shooting at someone else in case everyone misses; he doesn't want A to have another chance at him.
If you believe that, then C shoots at D and D shoots at C.

TomCollins
11-10-2004, 10:28 PM
You aren't accounting for the fact that when two people die, you are much better off. Suppose B and C always shoot D, and D shoots at C, since this is what your strategy is.

So in this strategy (without accounting for A's actions), D will live .4 x .6 = 24% of the time, and C will die 20% of the time.

Compare my strategy to yours.
You always shoot at D.
So you will have the following results:
All Live: 3.8%
C Dies Only: 1.0%
D Dies Only: 76.2%
Both Die: 19.0%

So by the time the game ends, the EV of the players are as follows:

A: 36.6% of booty
B: 36.6% of booty
C: 26.4% of booty
D: .3% of booty.

So lets just see if A can do any better.
Suppose he decides to try to shoot B instead.

My EVs end up as:
A: 51.4%
B: 6.1%
C: 34.8%
D: 7.7%

So A stands to take a HUGE improvement by shooting B instead of D. D will still die 76% of the time, instead of 80% of the time. But now B dies 80% of the time as well, instead of 0%.

Since B, C, D use optimal strategy, they will adjust to this by trying to shoot A more often as well.

The equilibrium is almost certainly when everyone is equally likely to die.

You also state that it is impossible to reach that.
It is not. You do something like A shoots B 23% of the time, C 37% of the time, and D the remainder. I don't have the exact numbers calculated, but it's very possible to reach that.
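The EV numbers quoted above (36.6% vs. 51.4% for A, and so on) can be verified exactly by enumerating the 16 hit/miss outcomes of one volley and conditioning on the round that actually ends the game. A sketch, pure-strategy targets only (letters and accuracies as in the thread):

```python
from itertools import product

ACC = {"A": 0.8, "B": 0.6, "C": 0.4, "D": 0.2}

def evs(targets):
    """Exact booty EVs when each man fires at a fixed target every
    round and the shooting stops the first round anyone is hit.
    `targets` maps shooter -> target."""
    ev = {p: 0.0 for p in ACC}
    p_end = 0.0
    # enumerate every hit/miss combination of the four simultaneous shots
    for hits in product([False, True], repeat=4):
        prob = 1.0
        dead = set()
        for (shooter, acc), hit in zip(ACC.items(), hits):
            prob *= acc if hit else 1 - acc
            if hit:
                dead.add(targets[shooter])
        if not dead:
            continue          # nobody hit: the next round looks identical
        p_end += prob
        alive = set(ACC) - dead
        for p in alive:
            ev[p] += prob / len(alive)   # survivors split the booty
    # condition on the round that actually ends the game
    return {p: v / p_end for p, v in ev.items()}

gang_up = evs({"A": "D", "B": "D", "C": "D", "D": "C"})   # all pile on D
defect  = evs({"A": "B", "B": "D", "C": "D", "D": "C"})   # A shoots B instead
```

The exact values match the figures in the posts above: A's EV jumps from about 36.6% of the booty to about 51.4% when he defects from the gang-up.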

BettyBoopAA
11-10-2004, 10:55 PM
Tom,
you misunderstood my solution. What the players should do is shoot at D; then they will each get approx 1/3 of the booty. However, this is not what they will do.

A and B are both afraid that each other will not shoot at D and shoot at the other, thus lowering their EV.
Thus A and B aim at each other. Since C and D know this, they both aim at each other.
How do you shoot at someone 23% of the time if you have one shot? You have to choose someone.

TomCollins
11-10-2004, 11:23 PM
You roll a 100-sided die, and if it's 23 or lower, you shoot him. It is very simple.
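In code, the die roll is just sampling from a mixed strategy. A minimal sketch (the 23/37/40 split is the illustrative one floated earlier in the thread, not a computed equilibrium):

```python
import random

def mixed_pick(weights, rng=random):
    """Sample a target from a mixed strategy. `weights` maps each
    target to the probability of shooting at him; the '100-sided die'
    rule is exactly this, with 100 equal slices."""
    targets = list(weights)
    return rng.choices(targets, weights=[weights[t] for t in targets], k=1)[0]

# Illustrative mix: shoot B 23% of the time, C 37%, D the remaining 40%.
choice = mixed_pick({"B": 0.23, "C": 0.37, "D": 0.40})
```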

tek
11-11-2004, 04:15 PM
If they were smart, all three would shoot Dan (preferably in the back while he's walking out of the saloon to Main Street).

Since he is sucky, they wouldn't want him to get lucky and take out one of the other three fair to good shooters.

Nor would it be logical for the other three to risk taking each other out.

BettyBoopAA
11-11-2004, 04:37 PM
[ QUOTE ]
You flip a 100-sided dice and if its 23 or lower, you shoot him. It is very simple.

[/ QUOTE ]

Yes, I understand, but your solution involves trust. You are player A; you roll your die, and you are not worried that player B's best solution is now to shoot at you, since that would maximize his value?
A, B and C should all shoot at D, but between not attacking the weak and a lack of trust (each player is better off if he, and he alone, breaks from the ideal solution), the 2 weakest shooters will have the best chance to survive.

TomCollins
11-11-2004, 04:51 PM
My solution has nothing to do with trust. In fact, my solution is quite the opposite.

If all tried to gang up on D, then A would realize he would fare better by trying to shoot B or C instead. That way, D is most likely going to die, but now the possibility of 2 deaths or even 3 deaths increases tremendously.

As long as my assumption is true, that a player maximizes his EV when each of the other players is equally likely to die, there is a Nash Equilibrium when all 4 players are equally likely to die. No player can improve his chances beyond this.

It cannot be an ideal strategy for an individual if he can break from it and do better, or if another individual can break from it and cause him to do worse.

TomCollins
11-11-2004, 05:02 PM
In fact, since all of these people are perfectly logical, they would realize there is no way to win, and decide that it is better to split the money equally than to risk the small chance that everyone dies.

So my FINAL ANSWER is they all realize this and decide that a gunfight is a terrible idea among expert game theorists.

BettyBoopAA
11-11-2004, 06:21 PM
"As long as my assumption is true, that a player maximizes his EV when each of the other players is equally likely to die, there is a Nash Equilibrium when all 4 players are equally likely to die. No player can improve his chances beyond this"

I don't think this is true at all in this problem. A, B and C are all better off if they shoot D: 95.2% of the time they will kill D in the first round, and each of them dies only 1/3 x 20% = 6.7% of the time.
Players A-C, faced with the choice "1/3 of the booty nearly all of the time, or 1/2 of it 50 percent of the time," would take the former, so your assumption is not correct.

The trust part is whether A, B and C will trust each other to do the correct thing. It's like the 2 prisoners who must remain silent and not testify that the other one is guilty.

TomCollins
11-11-2004, 06:46 PM
This is a fact. I have proved the math here. Whether or not you believe me is another story.

Suppose the following percentages of dying: A = 6.6%, B = 6.6%, C = 6.6%, D = 1-(.2)(.4)(.6) = 95.2%.

Now compare another scenario, where A shoots B every time:
A = 6.6%, B = 76.52%, C = 6.6%, and D = 76%.

Of these two scenarios, A fares much better in the second. The odds that B, C and D die have risen tremendously (B dying was impossible before). Also, 2 players die nearly 3/5 of the time. A does fantastically better here. So why would A not take this chance?

Of course, if they could all agree to do this and trust each other, there would not be a free-for-all, and A, B, and C could put themselves in a situation where D cannot improve, and all three do better than if they could not trust each other.

If an alliance could be formed, there are 4 possible alliances. Each player prefers being in an alliance to not being in one. Also, each player fares best when his "alliance" wins more often. Every player would then want to be in an alliance with A. A would prefer to be in one with B and C.

So this is where the prisoner's dilemma comes in, since each of the three (A, B, and C) is better off cooperating. But each is rewarded for deviating.

The key here is the words "and they all act rationally once the shooting starts." Rationality in terms of the prisoner's dilemma means deviating from the "optimum" outcome for the group. So I stand by my point that the gunfight is cancelled, and everyone splits it equally.

However, the smartest thing for A, B, and C to do collectively is to trade their shares of the loot with each other, so that no one can deviate from the plan. If A, B, and C each agree to take 1/2 of each of the other two's shares if all three live, and all of it if one dies, while keeping none of their own share, they could act completely rationally and do better than 25%.

So my NEW solution is that A, B, and C agree to take 1/2 of each of the other two's share at the end of the gunfight if they all live. Otherwise, they get the other survivors loot. If they are the only survivor in the alliance, they get to keep their own money.

In this case, there is much less an incentive to kill someone in your alliance, since the only way you get money is if they survive, or you kill both of them.

jimdmcevoy
11-12-2004, 03:09 AM
Well put Tom

Senor Choppy
11-12-2004, 11:19 PM
I've never seen someone miss the point of the original post in so many different ways.

There are no alliances, no hidden trust issues, etc. They have to make a decision on the spot with no deal-making going on beforehand, what should they do?

The answer, as has been stated before, is they all shoot at the best shooter of the remaining 3, which effectively means that the worst 2 are guaranteed survival.

jason1990
11-13-2004, 12:19 AM
What remaining three? There are four of them. If one or more dies, the "game" is over. Whenever the "game" is on, there are always four shooters. There is never a "remaining three" that factors into anyone's decision making. The only relevant factor is how to maximize the likelihood that more than one shooter dies so that the survivors' share of the booty is larger.

BettyBoopAA
11-13-2004, 12:28 AM
What remaining three?
I don't think anyone will shoot himself; that leaves 3 targets for each shooter, and all 4 will start shooting at the same time.

Senor Choppy
11-13-2004, 07:02 AM
Putting yourself in the shoes of the shooter, there are 3 remaining players to choose from when deciding who the best shot is because you can't choose yourself.

MickeyHoldem
11-13-2004, 10:14 AM
Let's assume you want to increase your share of the loot... The only ways to do this are:
1) shooting at someone who may not be targeted by someone else, thereby maximizing the number of people who may die
2) if there is no target who isn't already being shot at, shooting at the target chosen by the worst shooter, again increasing the likelihood that more will die
3) praying that you do not get shot!

Since you don't have any control over 3), let's look at 1) and 2).

Suppose you're Abe: given that you don't have any info about how the others will choose their targets, Abe constructs the following table of variations:

Bill Corby Dan
Abe Abe Abe shoot anyone .33B .33C .33D
Abe Abe Bill shoot Corby or Dan... score .5C and .5D
Abe Abe Corby shoot Bill or Dan... score .5B and .5D
Abe Bill Abe .5C .5D
Abe Bill Bill .5C .5D
Abe Bill Corby shoot Dan... score 1D
Abe Dan Abe .5B .5C
Abe Dan Bill 1C
Abe Dan Corby 1B
Corby Abe Abe .5B .5D
Corby Abe Bill 1D
Corby Abe Corby .5B .5D
Corby Bill Abe 1D
Corby Bill Bill 1D
Corby Bill Corby 1D
Corby Dan Abe 1B
Corby Dan Bill shoot Bill... he has the least chance of dying...score 1B
Corby Dan Corby 1B
Dan Abe Abe .5B .5C
Dan Abe Bill 1C
Dan Abe Corby 1B
Dan Bill Abe 1C
Dan Bill Bill 1C
Dan Bill Corby shoot Corby 1C
Dan Dan Abe .5B .5C
Dan Dan Bill 1C
Dan Dan Corby 1B

add up all this and you get 9.33B 9.33C 8.33D, so if you're Abe you shoot either Bill or Corby... at this point it doesn't matter. But wait. Let's now consider Bill's options... Bill, who has spent more time studying (than practicing with his pistol), figures that Abe will not shoot Dan, for the same reasons. He now considers this table:

Abe Corby Dan
Bill Abe Abe .5C .5D
Bill Abe Bill .5C .5D
Bill Abe Corby 1D
Bill Bill Abe .5C .5D
Bill Bill Bill .33A .33C .33D
Bill Bill Corby .5A .5D
Bill Dan Abe 1C
Bill Dan Bill .5A .5C
Bill Dan Corby 1A
Corby Abe Abe 1D
Corby Abe Bill 1D
Corby Abe Corby 1D
Corby Bill Abe 1D
Corby Bill Bill .5A .5D
Corby Bill Corby .5A .5D
Corby Dan Abe 1A
Corby Dan Bill 1A
Corby Dan Corby 1A

Total 6.33A 3.33C 8.33D, so clearly Bill decides to shoot Dan. Now Corby, who has studied even more than Abe and Bill, quickly comes to the same conclusion, and constructs this table in his head:

Abe Bill Dan
Bill Dan Abe 1A
Bill Dan Bill 1A
Bill Dan Corby 1A
Corby Dan Abe 1B
Corby Dan Bill 1A
Corby Dan Corby .5A .5B

and decides to shoot Abe.

Meanwhile Dan has been at the saloon drinking heavily. He knows he was a fool for getting involved in this shootout. The only thing more pathetic than his shooting skill is his math ability. Dan was always goofing off in school, going fishing with his buddies, chasing the girls. Ah, the girls! He'd always been lucky with the ladies... until he met Sally, his true love. He knew the moment he saw her that she was the one... and if he'd married her, oh how his life might have been different. Instead of falling into this drifting, criminal life, he might now have a family, gainful employment, and a future. If only it hadn't been for that prick Bill. After another couple drinks, Dan is going to shoot Bill!

Alright... this has been a long post. Dan could use the table method to arrive at shooting Bill, and then Abe, if he can follow this mess, will choose to shoot Corby... so we have
A <--> C
B <--> D
I don't think this is better than
A <--> B
C <--> D
but having A choose to shoot B because he is the next best shot is faulty logic. There is no logical reason to pick a target based on shooting ability.

I think Bill survives most often and Corby bites it the most.

I may try and write some code for a 4 player game...
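In the spirit of that last line, here is a sketch of such a 4-player, last-man-standing simulator. The two targeting rules are illustrative (shoot the best remaining shooter vs. shoot at random); plug in whatever strategy you like:

```python
import random

ACC = {"Abe": 0.8, "Bill": 0.6, "Corby": 0.4, "Dan": 0.2}

def fight(strategy, rng):
    """Last-man-standing variant: simultaneous volleys repeat until at
    most one shooter is left. Returns the set of survivors (empty if
    the final volley kills everyone)."""
    alive = set(ACC)
    while len(alive) > 1:
        hit = set()
        for s in sorted(alive):
            target = strategy(s, alive, rng)
            if rng.random() < ACC[s]:
                hit.add(target)
        alive -= hit
    return alive

def shoot_best(s, alive, rng):
    # aim at the most accurate other shooter still standing
    return max((p for p in alive if p != s), key=ACC.get)

def shoot_random(s, alive, rng):
    return rng.choice(sorted(p for p in alive if p != s))

def survival(strategy, trials=20_000, seed=3):
    rng = random.Random(seed)
    wins = {p: 0 for p in ACC}
    for _ in range(trials):
        for p in fight(strategy, rng):
            wins[p] += 1
    return {p: w / trials for p, w in wins.items()}

s = survival(shoot_best)
```

Under `shoot_best`, Abe rarely survives the opening volleys (everyone targets him), while Dan, whom nobody targets until heads-up, does far better.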

TomCollins
11-13-2004, 11:46 AM
If there are no deals that can be made, then all will realize it's a bad deal and not agree to fight.

If that is not possible, they will all have an equal shot at dying.

Your strategy is wrong, as I have shown several times. Read the earlier posts for the math.

jason1990
11-13-2004, 11:51 AM
Let me put it this way. According to the assumptions of the problem, everyone will shoot simultaneously. And the bullets will all fly at the same speed. Those that hit their target will hit them at the same time. If anyone dies, the shooting stops. If no one dies, everyone again fires simultaneously.

So suppose I'm Bill and I want to shoot at Abe because he's the best shot and I think that shooting at him will increase my chance of survival. Well, since we're all shooting simultaneously, shooting at Abe will not protect me in the first round. It can only protect me in later rounds if I happen to hit him. But if I hit him, there won't be any later rounds. So I cannot protect myself at all by my choice of who to shoot at. The only thing I can do is try to maximize the number of people who will be shot, and thereby maximize my share of the loot.

So it's not at all clear that the optimal solution is to shoot at the best shooter. In fact, it seems quite obvious that this is suboptimal, since it means 3 people are all shooting at the same person (Abe) which makes the expected number of deaths lower than it needs to be.
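The point that concentrating three shots on one man wastes expected deaths checks out with a one-line calculation. A sketch comparing a concentrated volley against a spread one (the target assignments are illustrative):

```python
ACC = {"Abe": 0.8, "Bill": 0.6, "Corby": 0.4, "Dan": 0.2}

def expected_deaths(targets):
    """Expected number of distinct players killed in one simultaneous
    volley: P(i dies) = 1 - product of the miss probabilities of
    everyone aiming at i."""
    survive = {p: 1.0 for p in ACC}
    for shooter, target in targets.items():
        survive[target] *= 1 - ACC[shooter]
    return sum(1 - s for s in survive.values())

# three men pile on Abe, Abe shoots Bill
concentrated = expected_deaths(
    {"Bill": "Abe", "Corby": "Abe", "Dan": "Abe", "Abe": "Bill"})
# everyone has a distinct target
spread = expected_deaths(
    {"Abe": "Bill", "Bill": "Abe", "Corby": "Dan", "Dan": "Corby"})
```

Ganging up on Abe yields 1.608 expected deaths per volley, while pairing off yields exactly 2.0, so spreading fire does raise the expected body count.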

TomCollins
11-13-2004, 01:00 PM
Jason, as resident Math God, can you confirm my answer so that we can put an end to this thread?

jason1990
11-13-2004, 05:08 PM
I would like to describe my understanding of the difference between these two things. I am not an expert in this field, so if someone else is, please correct me if I am mistaken.

Consider an n-player game. Each player chooses a strategy S_j. We have n functions f_1, ..., f_n such that

f_j(S_1, ..., S_n)

is player j's EV under these strategies.

A Nash Equilibrium is a set of strategies S_1, ..., S_n such that, for each j, the function

x -> f_j(S_1, ..., S_{j-1}, x, S_{j+1}, ..., S_n)

has a local maximum at x = S_j.

An optimal strategy for player j is a strategy x = S_j that is the global maximum of the function

x -> min f_j(S_1, ..., S_{j-1}, x, S_{j+1}, ..., S_n),

where the min is taken over all possible sets of strategies (S_1, ..., S_{j-1}, S_{j+1}, ..., S_n).

Can anyone tell me whether this is correct and describe the relationship between these objects?

jason1990
11-13-2004, 06:05 PM
Here's a quote from http://www.haverford.edu/math/lbutler/maths-illustrated.html

"Nash proved that every general sum (noncooperative) game has an equilibrium: a collection of (mixed) strategies, one for each player, such that no player can improve his (expected) payoff by changing his (mixed) strategy unilaterally. Examples like the Prisoner's Dilemma show that equilibrium strategies should not be called optimal, contrary to Keith Devlin's presentation on National Public Radio and Akiva Goldsman's scene for A Beautiful Mind. A Nash equilibrium is not necessarily a "best solution" nor does it necessarily give the "best result". (The links above are to the 7 minute audio clip and 90 second video clip from which I quote.) Nevertheless, at such an equilibrium, no player is motivated to change his (mixed) strategy since he cannot force other players to change theirs. In the Prisoner's Dilemma, what's best for the two players is for them both to refuse to testify (and each spend 1 year in jail), but this is not a Nash equilibrium since if one were to testify he would be rewarded with a reduced sentence (0 years). At the Nash equilibrium they both testify (and each spend 2 years in jail), since if only one were to refuse to testify he would be punished with a longer jail sentence (3 years) based on the other's testimony."

I have a strong hunch that this Shoot Out problem is an example like this. There are equilibria, but no "optimal" strategies.
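The Prisoner's Dilemma numbers in that quote can be checked mechanically: enumerate the four pure outcomes and keep only those where neither player can shorten his own sentence by deviating alone. A sketch using the 0/1/2/3-year sentences from the quote:

```python
SILENT, TESTIFY = 0, 1

# Sentences (years in jail, lower is better) for (player 1, player 2).
JAIL = {
    (SILENT, SILENT):   (1, 1),
    (SILENT, TESTIFY):  (3, 0),
    (TESTIFY, SILENT):  (0, 3),
    (TESTIFY, TESTIFY): (2, 2),
}

def is_nash(a1, a2):
    """True if neither player can reduce his own sentence by changing
    only his own action."""
    y1, y2 = JAIL[(a1, a2)]
    best1 = min(JAIL[(x, a2)][0] for x in (SILENT, TESTIFY))
    best2 = min(JAIL[(a1, x)][1] for x in (SILENT, TESTIFY))
    return y1 == best1 and y2 == best2

nash = [(a1, a2) for a1 in (SILENT, TESTIFY)
                 for a2 in (SILENT, TESTIFY) if is_nash(a1, a2)]
```

The only equilibrium found is (testify, testify), even though (silent, silent) gives both players less jail time: exactly the sense in which a Nash equilibrium need not be "optimal".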

BettyBoopAA
11-13-2004, 06:12 PM
Abe and Corby are smart and will not like this solution; they are both worse off than in the solution where A, B and C all shoot D. A will go back to the matrix and decide not to shoot Corby. This will start the process of looking out for yourself.

Abe doesn't want Bill to shoot him, so by shooting the next best shooter, each player thinks he increases his chance of survival.
I don't think there's a better math solution than A, B and C shooting D, but I think if you really did this problem in real life, using paintball guns, you would find A <--> B and C <--> D.

Gabe
11-14-2004, 04:06 PM
I haven't looked at the other answers yet, but off the top of my head I would think that the others would shoot at Abe, because if Abe were gunning for one of them, that person would only have a 20% chance of surviving. So if the others all go for Abe, he only has a 19% chance of surviving, and since whoever Abe is gunning for has a 20% chance of making it, Abe is most likely to be eliminated.

If my logic is correct, Corby and Dan would have an equal chance of surviving: the others will gun for Abe, and Abe will try to pick off the next best shot, Bill, so Corby and Dan should make it.

Mike Haven
11-18-2004, 11:34 PM

they all aim at Dan knowing he can only shoot at one of them AND he is likely to miss

Dan will die first eventually probably before he is able to take anyone out

then they share the booty

of course if Dan gets a lucky hit off first whoever he shoots dies but this is less likely than him getting shot

the chance of Dan surviving AND killing one of them on any particular volley is miss x miss x miss x Dan hits

this is 0.2 x 0.4 x 0.6 x 0.2 = 0.0096, less than 1% (that is, if they gang up on Dan and he shoots at any one of the others)

dogsballs
11-19-2004, 07:56 PM
Each player wants to maximise the chance that someone... anyone except themselves... dies on the first round. Assuming they cannot influence how the others aim, they simply want to maximise the chance that someone else bites it, and just hope that anyone shooting at them misses.

From Mr20's point of view it doesn't really matter who he shoots at if they all shoot at him.

A) From Mr80's point of view:

If he shoots at Mr20 and the others also do, then someone else bites it 95.2% of the time...plus Mr20's share (20% of the time they all miss multiplied by the two thirds of the time on average he's shooting at one of the others and not Mr80). I'll ignore Mr20's contribution for these comparisons.

If the other 2 go for Mr20 and Mr80 goes for Mr60, then someone else dies 95.2% (76% Mr20 + 19.2% Mr60 when Mr20 is missed) of the time. The same.

There's no extra incentive for Mr80 to shoot anyone except Mr20. Going for Mr60 doesn't change his outcome.

B) From Mr60's point of view:

If Mr80 and Mr40 go for Mr20 and he "defects" by aiming for Mr80, then someone else will die 88% + (60% of 12%) = 95.2%. Same result.

1) If he suspects Mr80 will go for him and if he still goes for Mr20, then someone else (apart from Mr60) will die 1-(0.4*0.6) = 76% plus two thirds of 20% of 24% (Mr20's attempt, assuming he chooses randomly). Total 79.2%.

2) If he suspects Mr80 will go for him and he goes for Mr80 instead, then someone else will die 60% plus (40% of 40%) = 76% of the time, plus two thirds of 20% of 24% (Mr20's attempt). Total 79.2%. Same thing. Plus he himself dies the same proportion of the time as in situation B1.

Defecting by going for the other best shot has no extra incentive for either Mr80 and Mr60; it has only downside.

From Mr40's point of view:
If the other 2 go for Mr20, then going for Mr80 means someone else dies 1-(0.2*0.4) = 92% plus 40% of 8%. Total 95.2% again...bit of a trend, huh...

There's no incentive for the better shooters to go for anybody except Mr20 and hope their buddies also know that, ensuring that their only risk is from Mr20 himself.

I haven't checked this, so I may be wrong.

dogsballs
11-19-2004, 08:01 PM
[ QUOTE ]
I've never seen someone miss the point of the original post in so many different ways.

There are no alliances, no hidden trust issues, etc. They have to make a decision on the spot with no deal-making going on beforehand, what should they do?

The answer, as has been stated before, is they all shoot at the best shooter of the remaining 3, which effectively means that the worst 2 are guaranteed survival.

[/ QUOTE ]

it's been mentioned already - shooting the next best shot is irrelevant. If you miss, he's still there the next round. If you hit, the game's over, so it didn't really matter who you picked - you hit the target and it's finished.

dogsballs
11-19-2004, 08:17 PM
[ QUOTE ]
So suppose I'm Bill and I want to shoot at Abe because he's the best shot and I think that shooting at him will increase my chance of survival. Well, since we're all shooting simultaneously, shooting at Abe will not protect me in the first round. It can only protect me in later rounds if I happen to hit him. But if I hit him, there won't be any later rounds. So I cannot protect myself at all by my choice of who to shoot at. The only thing I can do is try to maximize the number of people who will be shot, and thereby maximize my share of the loot.

So it's not at all clear that the optimal solution is to shoot at the best shooter. In fact, it seems quite obvious that this is suboptimal, since it means 3 people are all shooting at the same person (Abe) which makes the expected number of deaths lower than it needs to be.

[/ QUOTE ]

I think jason's hit on the nugget here: it makes sense for all the better shots to realise that they should all aim for the worst shooter, thus ensuring the maximum opportunity for someone other than themselves to die.

TomCollins
11-19-2004, 11:42 PM
As it has been discussed many times, defecting does have a benefit, since it is much more likely that 2 people die.

Izaak_Walton
11-20-2004, 06:14 PM
Great response!

dogsballs
11-21-2004, 08:47 PM
[ QUOTE ]
As it has been discussed many times, defecting does have a benefit, since it is much more likely that 2 people die.

[/ QUOTE ]

I didn't bother factoring that in. All I want to do is survive the shootout first. (I'll get the rest off 'em later anyway.)