MarkD
02-12-2004, 05:39 PM
Let's take a hypothetical player for whom we have accurate data covering a total of 1,000,000 hours. It doesn't matter how we got this data; let's just assume it exists.
Over these 1,000,000 hours the player has a WR of 1 BB/hr and an SD of 10 BB/hr^(1/2).
From this we know his true win rate, to within three standard errors (roughly 99% confidence), is:
WR = 1 +/- 0.03 BB/hr.
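A quick sketch of where that +/- 0.03 comes from (variable names are mine, and I'm using the usual three-standard-error multiplier, which is really ~99.7% rather than exactly 99%):

```python
from math import sqrt

hours = 1_000_000  # total hours of accurate data
wr = 1.0           # observed win rate, BB/hr
sd = 10.0          # standard deviation, BB/sqrt(hr)

# The standard error of the win-rate estimate shrinks with sqrt(hours).
se = sd / sqrt(hours)  # 10 / 1000 = 0.01 BB/hr
margin = 3 * se        # three standard errors

print(f"WR = {wr} +/- {margin:.2f} BB/hr")  # WR = 1.0 +/- 0.03 BB/hr
```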
But in any given 8-hour session, how much can he expect to win or lose?
My guess at the moment would be:
Win = WR*t +/- (3*SD/sqrt(t))*t
    = WR*t +/- 3*SD*sqrt(t)
where:
WR = win rate
t = time in hours
SD = standard deviation (BB / hr^(1/2))
which would give us a range for win/loss per 8-hour stretch of:
Win/loss = 8 BB +/- 84.85 BB.
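A quick numerical check of that range (just a sketch; variable names are mine):

```python
from math import sqrt

wr = 1.0   # win rate, BB/hr
sd = 10.0  # standard deviation, BB/sqrt(hr)
t = 8      # session length, hours

expected = wr * t          # 1 * 8 = 8 BB
spread = 3 * sd * sqrt(t)  # 3 * 10 * sqrt(8) ~ 84.85 BB

print(f"Win/loss = {expected} +/- {spread:.2f} BB")  # 8.0 +/- 84.85 BB
```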
Is this correct or am I way off?