Statistics Question


08-25-2005, 03:46 PM
This was posted in the Science, Math and Philosophy forum, but it doesn't look like I'm going to get an answer there, so I'm posting it here (apologies to the wrong-forum nitpickers).

So I'm reading one of my textbooks, and I read something that if I understand correctly basically says:

Suppose that you have $1. You bet someone on a coin flip, even money. If you lose, you stop. If you win, 1/2 a second later you bet $2. If you lose, you stop. If you win, 1/4 of a second later you bet $4. If you lose, you stop. If you win, 1/8 of a second later you bet $8. Etc...

So, the authors of the book conclude that the probability that you will go broke after 1 second is 1. I agree.

However, they conclude that your EV after 1 second is -$1. I disagree; I say it's $0.

What do you guys think? It may just be me misunderstanding a definition somewhere, I dunno.
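For what it's worth, here's a little sketch of the game (my own code, with made-up function names, not from the book): truncate the game at n flips and compute the EV exactly. Every finite truncation has EV exactly 0, even though the probability of ruin tends to 1.

```python
from fractions import Fraction

def truncated_ev(n):
    """EV of the OP's doubling game stopped after at most n flips.

    Losing at flip k (prob 2^-k) wipes out the whole stake: profit -1.
    Winning all n flips (prob 2^-n) leaves wealth 2^n: profit 2^n - 1.
    """
    ev = sum(Fraction(1, 2**k) * (-1) for k in range(1, n + 1))
    ev += Fraction(1, 2**n) * (2**n - 1)
    return ev

for n in (1, 5, 20):
    ruin = 1 - Fraction(1, 2**n)
    print(n, truncated_ev(n), float(ruin))  # EV stays 0, ruin prob -> 1
```

The rare huge win exactly balances the near-certain loss of the dollar, which is where the disagreement with the book comes from.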

Anyways, here's some of the book:


Quote:
--------------------------------------------------------------------------------

Example 3.1 The following example of a suicide strategy is borrowed from Harrison and Pliska (1981). It can be modified easily to provide an example of an arbitrage opportunity in an unconstrained Black-Scholes setting. For simplicity, we take r = 0, T = 1, and S0 = 1. For a strictly positive constant b > 0, we consider the following trading strategy:

y1(t) = { -b   if 0 <= t <= r(b),  0 otherwise }

y2(t) = { 1+b  if 0 <= t <= r(b),  0 otherwise }

where

r(b) = inf{ t : St = 1 + 1/b } = inf{ t : V(y,t) = 0 }


--------------------------------------------------------------------------------



here V(y,t) is defined as:

V(y,t) = y1(t)*St + y2(t)*Bt

St = stock price
Bt = bond price (a risk-free investment, like money in a savings account (Bt > 0) or a loan from a bank (Bt < 0))


Quote:
--------------------------------------------------------------------------------

In the financial interpretation, an investor starts with one dollar of wealth, sells b shares of stock short, and buys 1+b bonds. He then holds the portfolio until the terminal date T = 1 or until he goes bankrupt, whichever comes first. The probability of bankruptcy under this strategy equals p(b) = P{r(b) < 1}, which increases from zero to one as b increases from zero to infinity. By selling short a very large amount of stock, the investor makes his failure almost certain, but he will probably make a great deal of money if he survives. The chance of survival can be eliminated completely, however, by escalating the amount of stock sold short in the following way.

To show this, we shall modify the strategy as follows. On the time interval [0,1/2], we follow the strategy above with b = 1. The probability of bankruptcy during [0,1/2] thus equals p = P{r(1) <= 1/2}. If r(1) > 1/2, the amount of stock sold short is adjusted to a new level b1 at time 1/2. Simultaneously, the number of bonds held is revised in a self-financing fashion. The number b1 is chosen so as to make the conditional probability of ruin during the time interval (1/2,3/4], given that we have survived up to time 1/2, equal to p again.

In general, if at any time t(n) = 1 - (1/2)^n we still have positive wealth, then we readjust (typically increase) the amount of stock sold short, so that the conditional probability of bankruptcy during (t(n), t(n+1)] is always p. To keep the strategy self-financing, the amount of bonds held must be adjusted at each time t(n) as well. The probability of survival until time t(n) is then (1-p)^n, and it vanishes as n tends to infinity (so that t(n) tends to 1). We thus have an example of a piecewise constant, self-financing strategy, (y1,y2) say, with V(y,0) = y1(0)S0 + y2(0) = 1,

V(y,t) = y1(t)*St + y2(t) >= 0, for any t in [0,1)


--------------------------------------------------------------------------------



By S0 they mean St at time t = 0.


Quote:
--------------------------------------------------------------------------------

and V(y,1) = 0. To get a reliable model of a security market we need, of course, to exclude such examples of doubling strategies from the market model.


--------------------------------------------------------------------------------



So basically I say V(y,1) = 1, not 0.

This is from "Martingale Methods in Financial Modelling" by Marek Musiela and Marek Rutkowski

I typed this in (yes, the whole damn thing), so there may or may not be some errors. Also, I changed a few symbols (like y instead of psi).
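To make the book's survival arithmetic concrete (a quick sketch of my own, not the book's code or notation): once each readjustment interval carries the same conditional ruin probability p, surviving the first n intervals has probability (1-p)^n, which vanishes as n grows, so bankruptcy by time 1 is certain.

```python
def survival_probability(p, n):
    """Chance of surviving the first n readjustment intervals.

    The book's construction picks the short position b_n so that each
    interval's conditional ruin probability is the same p; surviving
    all n intervals therefore has probability (1-p)^n.
    """
    return (1 - p) ** n

# with p = 1/2 (the b = 1 case), survival dies off geometrically
print([survival_probability(0.5, n) for n in (1, 5, 10, 20)])
```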

LetYouDown
08-25-2005, 03:52 PM
Well, you're betting $1...you're going to lose that dollar. I guess it's a matter of semantics whether it's -1 or 0. I'd side with -1. How would you qualify the EV of a bet where you bet $1 and get $1 returned to you no matter what?

08-25-2005, 04:05 PM
Basically I am thinking that there is an infinitesimal chance that you win an infinite amount of money, which is why on average you will not lose or win any money.

I think a similar question is the doubling system:

You bet $1; if you win, you stop. If you lose, you bet $2; if you win, you stop. If you lose, you bet $4, etc...

What do you think the EV is of this game?

And does the answer change if you are playing red on roulette (house edge involved)?

Anyways, I probably picked a bad example, as my discrepancy with the book is definitely not a matter of semantics.

LetYouDown
08-25-2005, 04:12 PM
No, I think I misunderstood what you meant. You're saying that the amount you win will still be inversely proportional to the chance that you win, regardless of the fact that the chance you will win approaches 0 for all practical intents and purposes?

08-25-2005, 04:17 PM
[ QUOTE ]
No, I think I misunderstood what you meant. You're saying that the amount you win will still be inversely proportional to the chance that you win, regardless of the fact that the chance you will win approaches 0 for all practical intents and purposes?

[/ QUOTE ]

Yeah pretty much

But to tell the truth I don't know the exact mathematical definition of EV, so that might be a factor

So what do you reckon the EV of the situation in the OP is? 0 (what I think) or -1 (what the authors of the book think)?

LetYouDown
08-25-2005, 04:22 PM
Practically, -1. In reality, 0. Although I'd say this qualifies as a paradox...and then again, I'm not certain that it does. All the more reason to think it's a paradox, lol.

mosdef
08-25-2005, 05:06 PM
Interesting question.

You're right by the way. The expected value of your winnings is 0, not -1.

The reason for the apparent paradox is that just because the probability of your winnings equalling -1 approaches 100%, that doesn't make the EV -1. Consider the following thought exercise. Play the exact same game, but every time you win the coin toss you actually win nothing. Now, according to the authors' logic, since the probability of your winnings equalling -1 approaches 100%, the EV is -1. But this new game can't have the same EV, can it? Of course not.

Return to the original example. The reason their logic breaks down is that they are making an incorrect inference. In particular, let X_n be the r.v. representing your winnings in the game, and let Y be the r.v. equal to -1 with probability 1. Then Prob(X_n = -1) -> 1 as n -> infinity. However, the r.v. X_n does NOT approach the r.v. Y as n -> infinity. For that to be true, the values of X_n and Y would need to fit within an arbitrarily small distance of each other over the whole range of possible outcomes. This is clearly not the case, since some of the values of X_n become extremely large as n -> infinity.
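One numeric way to see this (a sketch I'm adding, function name is mine): even though P(X_n = -1) -> 1, the expected distance between X_n and the constant -1 never shrinks, because the rare big win exactly compensates for its tiny probability.

```python
from fractions import Fraction

def l1_distance_to_minus_one(n):
    """Compute E|X_n - (-1)| for the game truncated at n flips.

    X_n = -1 with probability 1 - 2^-n (distance 0 from -1), and
    X_n = 2^n - 1 with probability 2^-n (distance 2^n from -1).
    """
    p_big = Fraction(1, 2**n)
    return (1 - p_big) * 0 + p_big * 2**n

print([l1_distance_to_minus_one(n) for n in (1, 10, 30)])  # stays 1
```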

This is very hard to explain without a chalk board and pictures. Please fly to Toronto with a chalk board and I will explain it to you in person.

08-25-2005, 05:23 PM
[ QUOTE ]
Please fly to Toronto with a chalk board and I will explain it to you in person.

[/ QUOTE ]


What does r.v. stand for?

mosdef
08-25-2005, 05:27 PM
random variable

08-25-2005, 05:36 PM
[ QUOTE ]
random variable

[/ QUOTE ]

OK, it makes sense now, thanks. I was beginning to think I'd seriously misunderstood a very basic concept somewhere, since this was in a textbook.

In the past, when I have had disagreements with textbooks, it was profitable to bet on the textbook, not me

AaronBrown
08-26-2005, 02:20 PM
This is one of those weird things about infinity that defies common sense. I understand how uncomfortable people are with saying you can have negative expected value from making fair bets, but it is true. A simpler example is the standard martingale strategy of doubling your bet every time you lose in roulette. If you can bet unlimited amounts and guarantee to finish in finite time (as by making each spin take half as long as the previous one) you can have a positive expected value for the strategy although each bet individually is negative expected value. That's no argument for actually playing the thing, it's just a consequence of the mathematical definitions.

If you like this sort of thing, remember the infinite hotel. It's all full when the infinite bus pulls up with an infinite number of guests. "No problem," says the clerk, "I'll just tell all the existing guests to move to the room twice the number of their current room. That will free up an infinite number of odd-numbered rooms, so we can accommodate everyone." It's no use arguing that the hotel was full before so it can't accommodate new guests; with infinity that kind of argument doesn't work.

Another fun one is there are more irrational numbers than rational numbers, but there are an infinite number of rational numbers between any two irrational numbers.

mosdef
08-26-2005, 02:37 PM
[ QUOTE ]
Another fun one is there are more irrational numbers than rational numbers, but there are an infinite number of rational numbers between any two irrational numbers.

[/ QUOTE ]

well, more mind-boggling is how many MORE irrational numbers there are than rational numbers.

also, the number of rational numbers in the interval (0,1) is the same as the number of rational numbers in the interval (0,2) and is the same as the number of integers.

also, most of the irrational numbers are transcendental, but only a handful of familiar constants have actually been proven transcendental.

these things provide great food for thought up to the point where you go insane. after that you get tenure and no one ever sees you again so it doesn't matter.

by the way, do you think the EV of the game is 0 or -1? Initially I thought it was 0 (see my previous post); however, you've got me doubting myself. Still, I think if you write the winnings as a function of n, where n is the last toss, then sum all of the winnings over all n weighted by the probability of n being the last toss, you get 0, not -1.

AaronBrown
08-26-2005, 02:50 PM
[ QUOTE ]
by the way, do you think the EV of the game is 0 or -1? initially i thought it was 0 (see my previous post), however you've got me doubting myself. However, i think if you write the winnings as a function of n, where n is the last toss, then sum all of the winning over all n weighted by the probability of n being the last toss, you get 0, not -1.

[/ QUOTE ]
It's -1, just as the roulette strategy is +1.

The problem with your logic is that the probability of n being the last toss is not independent of whether the result is a win or a loss. You're computing the unconditional distribution of wealth at points in time and averaging them; the strategy determines a distribution of wealth conditional on stopping.

mosdef
08-26-2005, 03:03 PM
Hmm... I think I see it now.

Let N be the random variable equal to the number of the last toss. N will take on values 1,2,3,... with probabilities 1/2, 1/4, etc.

Let X be your profit when the game stops. X is a random variable; written as a function of the random variable N, it is X(n) = -1 for all n. So if you take E[X] you just get -1. Correct?
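That conditioning argument can be checked with a tiny sketch (my own, hypothetical function name): in the OP's game the entire bankroll rides on each flip, so the profit at the stopping time is -1 no matter which flip turns out to be the loss.

```python
def profit_if_bust_at(n):
    """Profit in the OP's game when the first loss comes on flip n.

    Wealth doubles on each of the first n-1 wins, but the whole
    stake is always bet, so the loss on flip n wipes wealth to 0.
    """
    wealth = 1
    for _ in range(n - 1):   # n-1 straight wins
        wealth *= 2
    wealth = 0               # flip n loses the entire stake
    return wealth - 1        # relative to the starting $1

print([profit_if_bust_at(n) for n in (1, 2, 3, 10)])  # all -1
```

Since X = -1 on every outcome of the stopped game, E[X] = -1 follows immediately.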

08-27-2005, 08:49 AM
[ QUOTE ]
you can have a positive expected value for the strategy although each bet individually is negative expected value.

[/ QUOTE ]

You are right, to suggest this makes me very uncomfortable.

I have to ask you another question:

You bet $1 on a coinflip. If you lose, you stop. If you win, you bet again, but on your next bet you get 2:1 odds (+EV), so you bet $2. If you lose, you stop, but if you win, you now have a profit of $5 ($1 from the first bet, $4 from the second bet).

So your next bet is $6, and now you get 4:1. If you lose, you stop; if you win, you now have $29 profit.

And on your next bet you get 8:1 odds. You bet $30. If you lose, you stop; if you win, you have a profit of $269.

On your next bet you get 16:1 odds, etc...

Same as before, each bet takes half as long, so it's over in a second. Do you still think the EV is -$1? I think it is now positive infinity, quite a big discrepancy.
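Here's a sketch of that escalating-odds game (my own arithmetic, hypothetical function name): the bankroll after k straight wins satisfies b_k = b_(k-1) * (1 + 2^(k-1)), which reproduces the $2, $6, $30, $270 figures above, and the EV of the game capped at n flips is -1 + b_n / 2^n, which blows up.

```python
def escalating_odds_ev(n):
    """EV of the escalating-odds game, capped at n flips.

    Each flip is 50/50; flip k pays 2^(k-1)-to-1 and the whole
    bankroll rides. Losing any flip costs the original $1.
    """
    bankroll = 1
    for k in range(1, n + 1):
        bankroll *= 1 + 2**(k - 1)   # b_k = b_(k-1) * (1 + odds)
    # bust somewhere (prob 1 - 2^-n): profit -1; win all n: bankroll - 1
    return -(1 - 0.5**n) + 0.5**n * (bankroll - 1)

print([escalating_odds_ev(n) for n in (1, 3, 5, 10)])  # grows rapidly
```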

alThor
08-27-2005, 01:35 PM
[ QUOTE ]
I think it is now positive infinity, quite a big discrepancy.

[/ QUOTE ]

Yes; the coin flip problem as you described it is just the St. Petersburg Paradox, but with an (irrelevant) "time" story. The EV involves an infinite sum, and by the standard way of calculating an infinite sum, the EV is infinite.

Yet in addition, the probability of going broke is also 100%; that's the paradox! It's just yet another "infinity" paradox.

Now, regarding the book (which I have not read), I don't know what assumptions they are using. For instance, if you bound the wealth of your opponent in the St. Petersburg game, you cannot get infinite EV. The value of the world's capital is finite, so one cannot apply the coin example directly to the stock market. Hence I cannot say whether there is a mistake in the book.

But in summary, you are correct that there would be a flaw in any reasoning that said, in your coin example, that "since P(ruin) = 1, EV must be -1". Again, I can't tell if that's what the book is trying to say.

alThor
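The infinite sum alThor refers to is easy to make explicit (standard St. Petersburg accounting, my own sketch): the payoff 2^k occurs with probability 2^-k, so each term of the EV sum contributes exactly 1 and the partial sums diverge.

```python
def st_petersburg_partial_ev(n):
    """Partial EV of the St. Petersburg game over its first n terms.

    Payoff 2^k with probability 2^-k: every term contributes
    2^-k * 2^k = 1, so the partial sum is just n and grows forever.
    """
    return sum((0.5**k) * (2**k) for k in range(1, n + 1))

print([st_petersburg_partial_ev(n) for n in (1, 10, 100)])  # equals n
```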

krubban
08-27-2005, 01:42 PM
First visit to this part of the forums for me. Feels like being in math class. :)
I don't agree with your last example, though, that the expected value is positive infinity. If you do an infinite number of coin flips you will lose eventually, no matter what odds you get, unless you meant that the profit from each flip is somehow saved up and not used to bet on your next flip.

08-27-2005, 03:50 PM
[ QUOTE ]
[ QUOTE ]
you can have a positive expected value for the strategy although each bet individually is negative expected value.

[/ QUOTE ]

You are right, to suggest this makes me very uncomfortable.

I have to ask you another question:

You bet $1 on a coinflip. If you lose, you stop. If you win, you bet again, but on your next bet you get 2:1 odds (+EV), so you bet $2. If you lose, you stop, but if you win, you now have a profit of $5 ($1 from the first bet, $4 from the second bet).

So your next bet is $6, and now you get 4:1. If you lose, you stop; if you win, you now have $29 profit.

And on your next bet you get 8:1 odds. You bet $30. If you lose, you stop; if you win, you have a profit of $269.

On your next bet you get 16:1 odds, etc...

Same as before, each bet takes half as long, so it's over in a second. Do you still think the EV is -$1? I think it is now positive infinity, quite a big discrepancy.

[/ QUOTE ]

I would like to change it a bit more.

Repeat this new process (which took no more than 1 second), but do it in no more than 1/2 a second, then do the whole process again in no more than 1/4 of a second, etc.

Now, according to your logic, the EV should be negative infinity, and I say it's positive infinity: an even bigger discrepancy.