A theoretical game


SkyRocker
06-14-2005, 07:33 AM
Todd and Mike are playing a game. Todd says to Mike:

"I'll give you 5 bucks if you flip a coin and do the following; if it's heads the bet is over. If it's tails you flip again and if it's heads you give me 1$. But if it's tails the second time you have to flip again and give me 2$ if it's heads the third time. If it's tails 3 times in a row and then heads you give me 4$ and so on."

Mike always keeps the $5.
Examples:
T: Mike gives 0$ to Todd
HT: Mike gives 1$ to Todd
HHHT: Mike gives 4$ to Todd
HHHHT: Mike gives 8$ to Todd


Should Mike take the bet for $5? If not, how much should Mike demand to be paid to take the bet?
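
If it helps, here is one play of the game in code (just a sketch of the rules as written above):

import random

def one_play():
    # Flip until the first heads. Mike pays nothing if heads comes up
    # immediately, otherwise $2^(t-2) when the first heads is on flip t.
    t = 1
    while random.random() < 0.5:        # tails: keep flipping
        t += 1
    return 0 if t == 1 else 2 ** (t - 2)

# Mike's net result for one $5 bet:
print(5 - one_play())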

well
06-14-2005, 08:19 AM
So, if the first H comes on flip t, Mike pays $2^(t-2), for t > 1.

Note that you switched H and T in your examples.

Let S be the random variable denoting the number of dollars Mike has to pay back.

Then E[S] (the expected value of S) is Sum_{t=2..inf} 2^(-t) * 2^(t-2) = Sum_{t=2..inf} 1/4 = inf.

So no price is good enough.
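
A quick check in code (just a sketch; the cutoffs are arbitrary): every term of the sum is exactly 1/4, so the partial sums grow without bound.

# Partial sums of E[S]: payout 2^(t-2) with probability 2^(-t), t >= 2.
# Every term equals 2^(-t) * 2^(t-2) = 1/4, so the partial sum up to T is (T-1)/4.
for T in (10, 20, 40, 80):
    partial = sum(0.5 ** t * 2 ** (t - 2) for t in range(2, T + 1))
    print(T, partial)        # 2.25, 4.75, 9.75, 19.75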

Regards.

Jazza
06-14-2005, 09:00 AM
If Mike's utility function is concave (like most people's) or linear, he should not take the bet.

Its EV is -infinity, as Well said.

moomoocow
06-14-2005, 10:58 AM
A friend of mine actually offered me the other side of the bet (I would get +inf EV). She wanted $8 and I said done!

We were playing Monopoly at the time, so we used odds and evens on the dice instead. After I won the first 4 rolls (netting $7: $1 + $2 + $4 + $8 - $8), I was almost tempted to ask her how much she'd pay to buy me out... but being the nice person I was, I won one more round and then lost, netting $23.

This girl also uses the keep-on-doubling strategy at roulette in AC - typical prospect theory behaviour.

probman
06-14-2005, 01:26 PM
This game is called the Saint Petersburg Paradox. If you only look at EV, you are left with an apparent paradox. One needs to look at the expected log of the value of a portfolio that has some money in hand and some money bet on the game. Assume that a fraction b of one's money is placed in the game and 1-b is kept in hand. One then maximizes E[log(V(b))] over all b in [0,1], where V(b) is the value of the portfolio with the given makeup. Let b* be the maximizing value of b. This is called the log-optimal portfolio. The log-optimal portfolio has some very nice properties; see Thomas Cover's 1980 paper Competitive Optimality of Logarithmic Investment. Using this analysis, a fair price can be found. In particular, the fair price for the game is the largest price for which b* > 0.
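
To make this concrete, here is a rough numerical sketch of the idea (my own truncation and brute-force search, not Cover's analysis): truncate the game from the first post at 30 flips, and for a given price search a grid of b values in [0, 1) for the one that maximizes E[log(V(b))].

import math

# Truncate the game at TMAX flips so everything is finite; payouts follow the
# first post ($0 on an immediate heads, else $2^(t-2) when the first heads is
# on flip t). Leftover probability is lumped with the $0 payout.
TMAX = 30
outcomes = [(0.5, 0.0)]
outcomes += [(2.0 ** -t, 2.0 ** (t - 2)) for t in range(2, TMAX + 1)]
outcomes += [(2.0 ** -TMAX, 0.0)]

def growth_rate(b, price):
    # E[log(V(b))]: a fraction b buys one play at 'price', 1-b stays in cash.
    return sum(p * math.log(1.0 - b + b * x / price) for p, x in outcomes)

def b_star(price):
    # Brute-force search over a geometric grid of b values in [0, 1).
    candidates = [0.0] + [2.0 ** -k for k in range(1, 60)]
    return max(candidates, key=lambda b: growth_rate(b, price))

ev = sum(p * x for p, x in outcomes)    # truncated EV: $7.25 with TMAX = 30
for price in (2.0, 5.0, 7.0, 7.5, 8.0):
    print(price, b_star(price))         # b* > 0 exactly when price < EV

With this truncation, b* comes out positive exactly when the price is below the truncated game's EV (about $7.25 here), and even then the optimal fraction is tiny, because the variance is enormous.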

For a little intuition, the reason one maximizes the expected log of a portfolio is that such a maximization is equivalent to maximizing the exponential growth rate of your wealth.

As a slight aside, the log-optimal portfolio analysis can be used to justify the Black-Scholes (no-arbitrage) price for options. Here the portfolio would consist of keeping 1-b of your wealth in the bank and investing the remaining b in options. If you examine the largest option price where b* > 0, you see that it agrees with the Black-Scholes (no-arbitrage) price.

AaronBrown
06-14-2005, 08:15 PM
The pro and con arguments to this approach are discussed nicely in a new book by William Poundstone, Fortune's Formula (http://www.amazon.com/exec/obidos/ASIN/0809046377/qid=1118794349/sr=2-1/ref=pd_bbs_b_2_1/104-0309909-8700705). It turns out to have some very interesting history, with a lot of famous people.

An important point is that to make this game worth more than $5, you have to assume your friend would really pay more than $1,000,000. That's why paying $8, much less infinity, is foolish in practice.
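
A quick check of that point (a rough sketch, using the payout scheme from the first post and assuming any payout beyond the cap is simply never paid):

# EV of the game from the first post when payouts above a cap are never paid:
# payout 2^(t-2) dollars with probability 2^(-t), each term worth $0.25.
def capped_ev(cap):
    ev, t = 0.0, 2
    while 2 ** (t - 2) <= cap:
        ev += 0.5 ** t * 2 ** (t - 2)
        t += 1
    return ev

for cap in (1000, 100000, 500000, 1000000, 2000000):
    print(cap, capped_ev(cap))    # 2.5, 4.25, 4.75, 5.0, 5.25

Under this all-or-nothing cap, the EV only gets past $5 once the friend can actually cover a $1,048,576 payout.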

moomoocow
06-14-2005, 09:33 PM
We were college students back then, so that's pretty relevant. I think even among (degenerate gambling) college students, a $127 debt would be honoured, so -

1/2*1+
1/4*2+
1/8*4+
1/16*8+
1/32*16+
1/64*32+
1/128*64
= $3.5

so, in retrospect, ... I think I got the raw end of the deal - but got lucky :)