#63 | 10-21-2005, 02:02 AM
jason1990 (Senior Member; Join Date: Sep 2004; Posts: 205)
Re: Classic Type Game Theory Problem

[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
Suppose m is a measure on [0,1] such that m([0,1]) = 1 and m({x}) = m({y}) for any x, y in [0,1]. That's what we mean by a uniform distribution.

[/ QUOTE ]

No it's not. That would allow too many measures. For example, any probability measure on [0,1] which is absolutely continuous with respect to Lebesgue measure has this property.

[/ QUOTE ]

No. What I mean by "uniform distribution" is a measure that assigns nonzero equal measure to each of the singletons in the space. This is what we need so that we can pick real numbers uniformly at random. If you want a continuous uniform distribution that's a different issue.

[/ QUOTE ]
Okay, so that's what *you* mean. But that's not what anyone else means. Everyone else understands it to mean Lebesgue measure. And when it means Lebesgue measure, the problem makes perfect sense.
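The "too many measures" objection quoted above can be made concrete. A minimal sketch (names are mine, not from the thread): the Beta(2,2) law on [0,1] has density f(x) = 6x(1-x), so it has total mass 1 and gives every singleton measure 0, satisfying the proposed definition — yet it is plainly not uniform, since an interval's mass is not its length.

```python
def beta22_cdf(x):
    # CDF of Beta(2,2): the integral of 6t(1-t) from 0 to x is 3x^2 - 2x^3.
    return 3 * x**2 - 2 * x**3

# Total mass is 1, like any probability measure on [0,1].
assert beta22_cdf(1.0) == 1.0

# Every singleton {x} has measure 0, because the CDF is continuous:
x = 0.25
print(beta22_cdf(x) - beta22_cdf(x))  # m({x}) = 0

# ...yet the measure is not Lebesgue measure: the interval [0, 0.25]
# gets mass 0.15625 rather than its length 0.25.
print(beta22_cdf(0.25))
```

So "m([0,1]) = 1 and all singletons equal" admits every absolutely continuous probability measure, exactly as the objection says.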

[ QUOTE ]
[ QUOTE ]
A uniform distribution is one which is invariant under translations, rotations, and reflections.

[/ QUOTE ]

The notions of translations, rotations and reflections don't even make sense on a general measure space.

[/ QUOTE ]
No one's talking about a general measure space.

[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
If m({x}) > 0 for some x in [0,1], choose n so large that n * m({x}) > 1, then choose n distinct points x_1, ..., x_n in [0,1]. Since all singletons are assumed to have equal measure, m({x_i}) = m({x}) for each i, and

1 = m([0,1]) >= m({x_1, ..., x_n}) = m({x_1}) + ... + m({x_n}) = n * m({x}) > 1,

so that 1 > 1. This is a contradiction. Hence m({x}) = 0 for every x in [0,1]. Oops.

[/ QUOTE ]
What does "oops" mean here? Do you think you have arrived at a contradiction to the existence of m?

[/ QUOTE ]

We want the event {x} to have nonzero probability. That's the oops. We concluded that there is no measure assigning nonzero equal weights to all the points.

[/ QUOTE ]
No. *You* want the event {x} to have nonzero probability.

[ QUOTE ]
[ QUOTE ]
"Uniform distribution on [0,1]" is standard terminology for Lebesgue measure. And if m is Lebesgue measure, then m([0,1])=1 and m({x})=0 for all x. The statement of the problem makes perfect sense and the existence of the uniform distribution on [a,b] is proven in any first-year graduate real analysis course, and some undergraduate ones.

[/ QUOTE ]

You're joking, right? I just proved for you that you can't assign nonzero equal mass to each of the points. You can't pick real numbers uniformly at random.

[/ QUOTE ]
I'm not joking. The uniform distribution on [a,b] is a constant times Lebesgue measure. Its existence is proved in first-year courses.
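A quick sketch of what "pick a real number uniformly from [0,1]" means under the standard (Lebesgue) reading: interval probabilities match interval lengths, even though every individual point has probability zero. Python's `random.random()` draws from exactly this continuous uniform law.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible
n = 100_000
samples = [random.random() for _ in range(n)]

# Under Lebesgue measure, P([a, b]) = b - a; the empirical fraction
# of samples landing in [0.2, 0.7] should be close to 0.5.
a, b = 0.2, 0.7
frac = sum(a <= s <= b for s in samples) / n
print(frac)  # approximately 0.5
```

No contradiction arises: each singleton has probability 0, yet intervals get exactly the "uniform" mass one expects.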

[ QUOTE ]
[ QUOTE ]
Are you, by chance, an algebraist?

[/ QUOTE ]

No. Don't insult me. I'm a complex analyst. I study very serious problems which, to a certain extent, involve trying to find very special measures on certain compact subsets of the complex plane. The notions of Hausdorff measure, doubling measures, the Calderon-Zygmund theory of singular integrals, capacity, etc. all play major roles in what I study. Capacity, as you might know, arises by assigning a very special probability measure to the boundary of a compact set; cf. the Painleve problem, the Denjoy conjecture, the Vitushkin conjecture and Tolsa's theorem.

[/ QUOTE ]
I thought I was insulting the algebraists. ;-)

I'm a probabilist. And in probability, when someone uses the term "uniform distribution on [0,1]", they mean Lebesgue measure. You obviously think the term means something else, but you're wrong. Honestly, I'm stunned. Clearly you know enough that this is a semantic issue and not a mathematical one, but how can your use of terminology be so off base? If you had called a Markov process a martingale, it might be understandable, since you're not a probabilist. But the term you're misusing is so basic that it's taught in every undergraduate probability and statistics course there is. Did someone hack your account? Is this jason_t's younger brother?