Two Plus Two Older Archives  

  #1  
12-04-2005, 11:46 PM
shadow29
Senior Member
 
Join Date: Jun 2004
Location: ATL
Posts: 178
Mean Value Theorem Question

This post is pretty much worthless, but I was wondering about something today.

For those who haven't taken Calc recently, the Mean Value Theorem states (from Google):

for two points (a, f(a)) and (b, f(b)) on a continuous curve, there is a point c in between where the slope f'(c) is the same as the slope, m, of the line joining the two points.
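In the standard textbook form (the one-line version above glosses over differentiability, which the theorem also needs): if f is continuous on [a, b] and differentiable on (a, b), then

\[
f'(c) = \frac{f(b) - f(a)}{b - a} \quad \text{for some } c \in (a, b).
\]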

In plain terms: if you average some rate over an interval, you have to be going at exactly that rate at some point inside the interval.

Applying this concept to poker: we all know that we go through downswings and upswings. From the Mean Value Theorem we can deduce that if our true winrate is 3 BB/100, then for some period of time we have to be winning at exactly 3 BB/100. That's about as far as my math skills go. So, to all the math majors out there: how long does one actually win at that rate? Is it just an instant? Or does it depend?
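To make the question concrete, here is a rough simulation sketch (everything in it, from the 3 BB/100 winrate and variance to the window size, is a made-up assumption for illustration, not real data). It treats results in 100-hand blocks, compares a sliding-window "local" winrate to the overall average, and counts where the local rate crosses that average:

# Toy illustration: simulate per-100-hand results and look at where a
# sliding-window winrate crosses the overall average winrate. The MVT
# analogy: the cumulative-winnings "curve" must, somewhere, have a local
# slope equal to its average slope over the whole stretch.
import random

random.seed(29)

TRUE_WINRATE = 3.0   # BB per 100 hands (assumed for the simulation)
STDEV = 15.0         # BB per 100 hands (assumed; limit variance is large)
N_BLOCKS = 1000      # 1000 blocks of 100 hands = 100k hands
WINDOW = 50          # "local" rate = mean over 50 blocks (5k hands)

# Result of each 100-hand block.
blocks = [random.gauss(TRUE_WINRATE, STDEV) for _ in range(N_BLOCKS)]

# Average winrate over the whole sample.
overall = sum(blocks) / N_BLOCKS

# Sliding-window winrate at each position.
local = [sum(blocks[i:i + WINDOW]) / WINDOW
         for i in range(N_BLOCKS - WINDOW + 1)]

# Count sign changes of (local - overall), i.e. crossings of the average.
crossings = sum(1 for a, b in zip(local, local[1:])
                if (a - overall) * (b - overall) < 0)

print(f"overall winrate: {overall:.2f} BB/100")
print(f"local rate crosses the average {crossings} times")

With a smooth curve, the MVT only promises at least one point where the local slope equals the average slope; whether that is a single instant or a longer stretch depends on the shape of the curve, which is basically what the question above is asking.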