View Full Version : What does standard deviation really mean?

11-03-2002, 12:51 AM
I have been tracking my play faithfully for the last 6 months. I have played approximately 850 hours (over 110,000 hands online). My win rate is about $4.75 an hour (this is mostly at 1/2 and 2/4). I have moved up in limits as my bankroll has allowed; I now play some 3/6 and have even tested the waters at 5/10. I just computed my standard deviation at $101.82 (that is using each of my 204 sessions as a data point).

I'm not really a statistics guy, so what does this standard deviation mean? How can I use this number to analyze my play? Any other stats advice is appreciated.


11-03-2002, 02:30 AM
Standard deviation is a measure of how much your results vary from the average. Your average hourly rate will be within 1 standard deviation of your true hourly rate with a probability of 68%, within 2 standard deviations with probability 95.4%, and within 3 standard deviations with probability 99.7%. Use a table of the standard normal distribution to find other percentages. You will be more than 1 standard deviation below average with probability 16%, more than 2 standard deviations below average with probability 2.3%, and more than 3 standard deviations below average with probability 0.13%. The same percentages apply for being above the average. This will give you some level of confidence in what your true average rate is.
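The percentages above can be checked against the standard normal distribution directly. A minimal sketch (the function name `phi` is just illustrative):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Probability of landing within k standard deviations of the mean,
# and of being more than k standard deviations below it:
for k in (1, 2, 3):
    within = phi(k) - phi(-k)
    below = phi(-k)
    print(f"within {k} SD: {within:.1%}   more than {k} SD below: {below:.2%}")
```

Running this reproduces the 68% / 95.4% / 99.7% figures, and the 16% / 2.3% / 0.13% one-sided tails.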

Your standard deviation seems too high. It should be somewhere around 10 times your hourly rate, lower if you play tight, and perhaps higher if you play in loose aggressive games or shorthanded games. You may want to track it separately for different limits.

If your standard deviation is too high, then you may be making too many marginal decisions, or playing in games that are very aggressive. This will cause you to take much longer to get to the "long run", the time it takes before you will be ahead with a certain probability. For example, say you wanted to know how long it would take before you had an 84% chance of being ahead (84% is the probability of being no more than 1 standard deviation below the mean) if you won 1 bb/hr on average. If your standard deviation were 10 bb for 1 hour, this would take (10/1)^2 = 100 hours. If your standard deviation were 20, it would take (20/1)^2 = 400 hours, 4 times as long.
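The calculation above can be written out as a small helper. This is a sketch of the reasoning in the example, not anything from the original posts; the function name is illustrative:

```python
def hours_to_be_ahead(win_rate_bb, sd_bb_per_sqrt_hr, z=1.0):
    """Hours needed so the chance of being ahead reaches the level
    implied by z standard deviations (z=1 -> roughly 84%).

    After n hours: mean = win_rate * n, SD = sd * sqrt(n).
    Being ahead with that probability requires mean >= z * SD,
    i.e. n >= (z * sd / win_rate) ** 2.
    """
    return (z * sd_bb_per_sqrt_hr / win_rate_bb) ** 2

print(hours_to_be_ahead(1, 10))  # 100.0 hours
print(hours_to_be_ahead(1, 20))  # 400.0 hours
```

Because the required time grows with the square of the standard deviation, doubling the SD quadruples the time to the "long run".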

Another consequence of a standard deviation that is too high is that you have a greater probability of going broke for a given hourly rate and bankroll size.
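The bankroll point can be quantified with the well-known diffusion (Brownian-motion) approximation for risk of ruin. This formula is not given in the thread; it is added here as an illustration, and the numbers in the example are made up:

```python
from math import exp

def risk_of_ruin(bankroll, win_rate, sd_per_sqrt_hr):
    """Classic diffusion approximation: probability of ever losing the
    entire bankroll, given a positive hourly win rate and a standard
    deviation for one hour. All arguments in the same money units."""
    return exp(-2.0 * win_rate * bankroll / sd_per_sqrt_hr ** 2)

# Hypothetical example: $4.75/hr win rate, $50 SD for one hour, $1000 bankroll.
print(f"{risk_of_ruin(1000, 4.75, 50):.1%}")
```

Note that the standard deviation enters squared, so a higher SD raises the risk of ruin much faster than a lower win rate does.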

11-03-2002, 03:15 AM

I used each session as a data point; each session averages 4.25 hours. Would this make a difference? My average session is a win of almost $20, which is about $4.64 an hour. My standard deviation is $101.79, or about 25 big bets, but would that be per session or per hour? Since each session is 4.25 hours, would that mean it is closer to $25 an hour, or about 6 big bets?

Sorry if these are silly questions; I had a stats class in college, but that was a long time ago.

Thanks for your help,

Phat Mack
11-04-2002, 01:23 AM
How did you calculate your standard deviation?

11-04-2002, 07:27 AM
I used the STDEV function in Excel; I used my win or loss for each session as my data elements.

11-04-2002, 09:43 AM
This would give you standard deviation per session. If your standard deviation per session is 25 big bets, and EVERY session were 4.25 hours, then your hourly standard deviation would be 25/sqrt(4.25) = 12.1 big bets. You have to divide the standard deviation by the square root of the number of hours because it is the variance that you divide by the number of hours. BTW, people often say that their standard deviation is so many bb/hr. Strictly speaking this is incorrect. Standard deviation doesn't have units of bb/hr; it has units of bb/hr^0.5. Variance has units of bb^2/hr. What they really mean is that this is their standard deviation for exactly 1 hour.
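The per-session-to-hourly conversion above is a one-liner. A minimal sketch, assuming equal-length sessions as stated:

```python
from math import sqrt

def hourly_sd(session_sd, session_hours):
    """Convert a per-session standard deviation to a 1-hour standard
    deviation, assuming all sessions have the same length: variance
    scales linearly with time, so SD scales with sqrt(time)."""
    return session_sd / sqrt(session_hours)

print(round(hourly_sd(25.0, 4.25), 1))  # 12.1 big bets for one hour
```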

Since your session lengths vary, this will only be an approximation. The formula given by Mason Malmuth in "Computing Your Standard Deviation" in the essay section of this site takes the length of each session into account. Try that formula instead.
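One common way to weight sessions by their length follows from modeling each session's result as Normal(rate*hours, variance*hours), so that each standardized residual has the same variance regardless of session length. The sketch below uses that model with made-up sample data; it is not a transcription of Malmuth's formula, so compare it against the essay before relying on it:

```python
from math import sqrt

def weighted_sd(sessions):
    """Session-length-weighted estimate of the 1-hour standard deviation.

    sessions: list of (win, hours) pairs.
    Model: win_i ~ Normal(rate * hours_i, var * hours_i), so
    (win_i - rate * hours_i)**2 / hours_i estimates var for any length.
    Returns (hourly win rate, SD for one hour).
    """
    total_win = sum(w for w, h in sessions)
    total_hrs = sum(h for w, h in sessions)
    rate = total_win / total_hrs
    n = len(sessions)
    var = sum((w - rate * h) ** 2 / h for w, h in sessions) / (n - 1)
    return rate, sqrt(var)

# Hypothetical session log: (result in $, length in hours)
rate, sd = weighted_sd([(120, 4), (-80, 6), (35, 3), (-10, 5), (60, 4)])
print(f"win rate: {rate:.2f}/hr, SD: {sd:.2f} for one hour")
```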

11-06-2002, 03:17 AM
If your win rate is in $/hr, shouldn't your SD be in $/hr too? I've never understood why it is measured as $/sqrt(hr); could you explain that?

11-06-2002, 03:05 PM
Standard deviation cannot have units of $/hr. If it did, that would imply that if your standard deviation were sigma $/hr, then after playing n hours it would be sigma*n $. That's wrong; it depends on the square root of time, so after n hours it is sigma*sqrt(n). Thus it has units of $/sqrt(hours). The reason for this comes from the fact that the variance of a sum of independent random variables is the sum of the variances. So if your variance for 1 hour is sigma^2 and you play for n hours, your variance is n*sigma^2. Standard deviation is the square root of the variance, or sqrt(n)*sigma.

When people say their standard deviation is so many bb/hr, they really mean that it is this many bb for exactly 1 hour, or this many bb/hr^0.5.