#1
EV
Can someone please give me a good idea of what EV means in non-math terms? Thanks
#2
Re: EV
[ QUOTE ]
Can someone please give me a good idea of what EV means in non-math terms? Thanks
[/ QUOTE ]

OK, you go on a date that costs you $90 for dinner and drinks, and there is a 75% chance that you will get laid afterwards. Your other choice is to buy a hooker who costs $100, but with a 99% chance of getting laid. In this example the hooker has the higher EV. Now, if you could get by with a $50 dinner and still have the 75% chance of getting laid, the date would have the higher EV. Sorry that I still needed to use some math, but does this help?
#3
not quite
Thanks for your reply, but you just gave an example of EV, not a definition of what it means.
#4
Re: EV
[ QUOTE ]
OK, you go on a date that costs you $90 for dinner and drinks and there is a 75% chance that you will get laid afterwards. Your other choice is to buy a hooker that costs $100 but you have a 99% chance to get laid. In this example the hooker has a higher EV.
[/ QUOTE ]

In general it depends on how much getting laid is worth to you. In the date scenario you have a 25% chance of not getting laid and losing $90, and a 75% chance of getting laid and still losing $90. Your EV would be:

.25*(-90) + .75*(-90 + laid) = .75*laid - 90

So your EV is three-quarters of a lay minus 90 bucks. With the hooker, you have a 1% chance of losing $100 and not getting any, and a 99% chance of getting laid and losing $100, so your EV is:

.01*(-100) + .99*(-100 + laid) = .99*laid - 100

So the hooker is a better deal only if:

.99*laid - 100 > .75*laid - 90

which holds only if laid > $41.67. So if getting laid is worth more than $41.67 to you, meaning you would pay that much for a 100% chance of getting laid, then you should go with the hooker; otherwise you should go on the date, assuming you somehow must make one of these two choices. Of course, if getting laid isn't worth $41.67 to you, then you certainly won't pay $90 or $100 for just a chance of getting laid, so in that case you would take the date or nothing.

The next case is more interesting.

[ QUOTE ]
Now if you could get by for a $50 dinner and still have the 75% chance of getting laid the date would have the higher EV.
[/ QUOTE ]

Now your EV would be:

.25*(-50) + .75*(-50 + laid) = .75*laid - 50

This is better than the hooker if:

.75*laid - 50 > .99*laid - 100, that is, if laid < $208.33

So now getting laid has to be worth more than $208.33 to you in order to prefer the hooker, all other things being equal of course, and if they aren't, you can do a different calculation.

As for the mathematical definition, EV is a weighted sum of the values assigned to the possible random outcomes, each weighted by the probability of obtaining that outcome. So if X(i) is a vector of values for N possible outcomes indexed by i, and P(i) is the probability of outcome i, then:

EV = sum{i = 1 to N} P(i)*X(i)

If X is a continuous random variable, such as a number chosen at random from 0 to 1, then the sum becomes an integral.
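The weighted-sum definition above can be sketched in a few lines of Python. The dollar value assigned to getting laid (`LAID` below) is an assumption for illustration; plug in whatever number reflects your own preferences.

```python
def expected_value(outcomes):
    """EV = sum over i of P(i) * X(i), for (probability, value) pairs
    whose probabilities sum to 1."""
    return sum(p * x for p, x in outcomes)

LAID = 60.0  # assumed value of getting laid; anything above $41.67 favors the hooker

# Date: lose $90 either way, get laid 75% of the time.
date_ev = expected_value([(0.25, -90.0), (0.75, -90.0 + LAID)])

# Hooker: lose $100 either way, get laid 99% of the time.
hooker_ev = expected_value([(0.01, -100.0), (0.99, -100.0 + LAID)])

print(date_ev, hooker_ev)  # -45.0 -40.6: the hooker wins at this valuation
```

Lower `LAID` below $41.67 and the comparison flips, matching the algebra above.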
#5
Re: EV
Imagine you got to make this same decision many times. The average amount you expect to make (or lose) per decision is the EV. As the number of trials gets very large, the observed average gets closer and closer to the EV, which is actually defined in terms of probabilities rather than averages.
|
#6
Re: EV
Oh, he said "non-math". EV stands for expected value. Think of it as an average: it is what you will make on average if you take a certain action a large enough number of times. If something has an EV of $1, you might win $2 with some probability, win $.50 with some probability, or lose $3 with some probability, but on average you will win $1. If you played a thousand times, you would win around $1,000, and your win per play would be close to $1; if you played a million times, your win per play would be even closer to $1, even though the dollar amount by which your total differs from $1 million may actually be greater than the amount by which you differed from $1,000 after 1,000 plays. The average will still be closer to $1.
The amount by which you differ from $1,000 or $1 million is determined by the standard deviation, which is a measure of the spread of your results. Your standard deviation for 1 million plays is greater than your standard deviation for 1,000 plays, which is why your total spread is bigger after 1 million plays; but your standard deviation per play is actually smaller after 1 million plays, which is why your average per play will be closer to $1. This is because when you go from 1,000 plays to 1 million plays, the standard deviation of your total does not grow as fast as your total average win.

Now a lottery may sometimes have a positive EV if the jackpot is large enough. That means you would make money if you played under those conditions a large enough number of times, but since you only make money when you hit the jackpot, and since the odds against that are millions to 1, you can't really ever play enough times to make money. So you don't really "expect" to make money when you play, even though your expected value is positive.
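The convergence described above is easy to see in a quick simulation. This sketch uses a made-up game with EV = $1 (win $2 half the time, $0 otherwise), not anything from the original posts:

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def play():
    # Toy game with EV = $1: win $2 with probability 0.5, else $0.
    return 2.0 if random.random() < 0.5 else 0.0

averages = {}
for n in (1_000, 1_000_000):
    results = [play() for _ in range(n)]
    averages[n] = sum(results) / n

# The per-play average hugs $1 more tightly as n grows, even though
# the total-dollar spread around $n keeps growing (roughly like sqrt(n)).
print(averages)
```

After a million plays the average lands within about a penny of $1; after a thousand plays it can wander several cents away.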
#7
Wow
Great post, Bruce! I'm still laughing.
#8
Re: EV
"Oh, he said "non-math". "
And everyone knows hooking is all about math.
#9
Re: not quite
[ QUOTE ]
Thanks for your reply. But you just gave an example of ev, not a definition of what it means.
[/ QUOTE ]

If you can recognize an example of EV, why do you need a non-math definition?
#10
Re: EV
How much money you can expect to make, on average, by playing a certain hand in a certain manner consistently.