Ask a probabilist

08-07-2008 , 02:58 PM
Quote:
Originally Posted by Pokerlogist
Thanks for doing this. (May be should be in the Probability forum.)
Here's a problem I've seen:

There are two distinct secret numbers: call them X and Y, and assume that X < Y, without loss of generality. You have no clue how I came up with them. They could be anything (positive, negative, rational, irrational, etc). They could come from any probability distribution (discrete or continuous). You have no idea. I flip a fair coin. If the coin shows Heads, I reveal to you the larger number, Y; if it shows Tails, I reveal to you the smaller number, X. You do not get to see the result of the coin flip. Your goal is to guess whether the coin was Heads or Tails, based only on your seeing the one number that I revealed to you. Obviously, if you just decide "Heads" is your guess, without taking into account the revealed number at all, then you are correct with probability 0.5. But the goal is to be able to be correct with probability greater than 0.5. Can someone devise a method to do this? Explain.
You may be interested in this thread.

http://forumserver.twoplustwo.com/sh...=184643&page=3
Ask a probabilist Quote
08-07-2008 , 03:39 PM
Just to keep things brief, here is the solution.

Let T be the random variable whose value is X when a tail is flipped and Y when a head is flipped. Let Z be a random variable, independent of T, such that P(Z ∈ (a,b)) > 0 for all a < b.

We initially guess heads, look at the revealed number, generate Z, and switch our guess to tails if Z > the revealed number.

It's easy to see that the probability that we guess correctly is given by

P(T = X and Z > X) + P(T = Y and Z ≤ Y)
= 1/2 [P(Z > X) + P(Z ≤ Y)] (independence)
= 1/2 [P(Z > X) + P(Z ≤ X) + P(X < Z ≤ Y)]
= 1/2 [1 + P(X < Z ≤ Y)]
> 1/2
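To see the argument play out numerically, here is a quick simulation sketch. The choice of Z ~ N(0, 10²) and the secret numbers 3 and 7 are illustrative assumptions only; any Z putting positive probability on every interval works, and the edge over 0.5 depends on how much of Z's mass lands between the two secret numbers.

```python
import random

def correct_guess(x, y):
    """One round of the game: True if the switching strategy guesses the coin correctly."""
    heads = random.random() < 0.5      # fair coin; heads reveals the larger number
    revealed = y if heads else x
    z = random.gauss(0.0, 10.0)        # Z ~ N(0, 10^2): P(Z in (a,b)) > 0 for all a < b
    guess_heads = z <= revealed        # start at "heads"; switch to "tails" if Z > revealed
    return guess_heads == heads

trials = 200_000
wins = sum(correct_guess(3.0, 7.0) for _ in range(trials))
print(wins / trials)  # comfortably above 0.5, since P(3 < Z <= 7) > 0
```

With these particular (hypothetical) choices, the win rate should hover around 1/2 + P(3 < Z ≤ 7)/2 ≈ 0.57.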
08-07-2008 , 04:52 PM
Quote:
Originally Posted by blah_blah
Just to keep things brief, here is the solution.

Let T be the random variable whose value is X when a tail is flipped and Y when a head is flipped. Let Z be a random variable, independent of T, such that P(Z ∈ (a,b)) > 0 for all a < b.

We initially guess heads, look at the revealed number, generate Z, and switch our guess to tails if Z > the revealed number.

It's easy to see that the probability that we guess correctly is given by

P(T = X and Z > X) + P(T = Y and Z ≤ Y)
= 1/2 [P(Z > X) + P(Z ≤ Y)] (independence)
= 1/2 [P(Z > X) + P(Z ≤ X) + P(X < Z ≤ Y)]
= 1/2 [1 + P(X < Z ≤ Y)]
> 1/2
...and since there are uncountably many choices of an appropriate Z, there are uncountably many "methods" giving a probability strictly greater than 1/2.
08-07-2008 , 07:36 PM
Thanks Jay, bigpooch, blah_blah. I hadn't seen that thread before. Enrique posted the first solution with some typical 2+2 controversy. I wonder whether a computer simulation would ever be able to show this solution to be true (or false).
08-07-2008 , 07:46 PM
I think my solution is cleaner (although it's the one alluded to by bigpooch in the earlier thread; as siegmund and I noted, the problem is very well known). In particular, there's no need for a computer simulation to check whether it is true or false, nor would one shed any light on things: P(X < Z ≤ Y) (in the notation of my solution) can be arbitrarily small depending on X and Y, and since X and Y are unknown, we cannot pick Z in advance to guarantee that P(X < Z ≤ Y) exceeds any fixed positive bound.
08-07-2008 , 07:52 PM
Quote:
Originally Posted by thylacine
So I take it that you're not too enamored with the idea that every time I flip a coin reality splits into two equally real realities?
No sane person is imo.
08-07-2008 , 08:02 PM
Quote:
Originally Posted by Max Raker
No sane person is imo.
But this is not a rare view in physics, right?
08-07-2008 , 08:20 PM
Quote:
Originally Posted by thylacine
But this is not a rare view in physics, right?
It is not unheard of, for sure; I just think it's silly. Obviously you can formulate the theory mathematically in that language, but to leap from that arbitrary representation to such a large (and untestable) physical difference strikes me as naive. But I've also said the same thing about KKLT, so what do I know?
08-14-2008 , 12:49 AM
Quote:
Originally Posted by thylacine
When some process is modeled probabilistically, what exactly is happening in physical reality (in your opinion at least).
Quote:
Originally Posted by jason1990
In my opinion, the most common scenario is that we either do not know what is happening, or we know, but it is so complicated, or there are so many unknown variables, that a deterministic model is not practical and/or not useful.

It might be worth pointing out that any probabilistic model built in the Kolmogorov system (that is, all of mainstream probability theory) is, in some sense, a model with unknown and/or hidden variables. Our processes may depend on many things, such as time, space, state, etc. But they all necessarily have one variable in common: ω. A reasonable interpretation in many circumstances is that all of our unknown/hidden variables have been lumped under the umbrella of ω. We do not know the exact value of ω, so we impose a probability measure to model our partial knowledge about these unknowns. However, if we were to discover the value of ω, then, by definition, we would know the exact outcome of our experiment.
Quote:
Originally Posted by thylacine
So I take it that you're not too enamored with the idea that every time I flip a coin reality splits into two equally real realities?
Probabilistic models occur in many more places than just quantum physics, such as finance, biology, queueing theory, the study of turbulence, etc. I do not think that, in any of these applications, the modelers are seriously thinking about the phenomenon in question in terms of "splitting realities." I think they typically regard the process they are studying as being fundamentally deterministic, and consider the probabilistic model as simply capturing statistical properties of various unknown factors that are influencing the dynamics. Maybe they are wrong about that. But I think this is how most probabilistic models are interpreted.

I also think that, to a large degree, this interpretation is built right into the mathematics of probability spaces. The so-called "chance variable," ω, represents all the unknown factors.

My opinion on the many worlds interpretation is this. If you adopt it, then you are faced with the challenge of reinterpreting the classical probabilistic model in this new, radical context. As stated in the Wikipedia entry,

Quote:
In the Copenhagen interpretation, the mathematics of quantum mechanics allows one to predict probabilities for the occurrence of various events. In the many-worlds interpretation, all these events occur simultaneously. What meaning should be given to these probability calculations?
I believe this question is highly non-trivial and presents enormous philosophical difficulties. I have never seen a satisfactory answer to this question. And without an answer, I cannot take this interpretation seriously.
08-14-2008 , 01:15 AM
Quote:
Originally Posted by Mason Malmuth
As a side note, upon leaving I was quite knowledgeable about blowing up the Soviet Union. Of course that has no value now.
Are you sure?
08-14-2008 , 01:40 AM
Quote:
Originally Posted by Pokerlogist
Thanks for doing this. (May be should be in the Probability forum.)
It looks like others have very nicely addressed your probability question, so I will just address this part. I chose this forum, in part, because I wanted to attract questions more interesting than "What is the probability of flopping a set?" or "Am I due for a downswing since I've been running so hot lately?" But more than that, this forum really is the most appropriate place for this thread. Probability theory is used in a wide array of applied sciences, it involves very sophisticated mathematics, and it is connected to deep philosophical issues.
08-14-2008 , 02:28 AM
Quote:
Originally Posted by jason1990
Probabilistic models occur in many more places than just quantum physics, such as finance, biology, queueing theory, the study of turbulence, etc. I do not think that, in any of these applications, the modelers are seriously thinking about the phenomenon in question in terms of "splitting realities." I think they typically regard the process they are studying as being fundamentally deterministic, and consider the probabilistic model as simply capturing statistical properties of various unknown factors that are influencing the dynamics. Maybe they are wrong about that. But I think this is how most probabilistic models are interpreted.

I also think that, to a large degree, this interpretation is built right into the mathematics of probability spaces. The so-called "chance variable," ω, represents all the unknown factors.

My opinion on the many worlds interpretation is this. If you adopt it, then you are faced with the challenge of reinterpreting the classical probabilistic model in this new, radical context. As stated in the Wikipedia entry,

...

I believe this question is highly non-trivial and presents enormous philosophical difficulties. I have never seen a satisfactory answer to this question. And without an answer, I cannot take this interpretation seriously.
FWIW, I don't believe in MWI, just as I don't believe in analogous "splitting realities" interpretations in other probabilistic models.

Nevertheless there are non-trivial questions about why it is that before a coin-flip there is 50% chance of heads, but afterwards it is 0% or 100%.

Anyway, here is another general question. Why is it that you can "consider the probabilistic model as simply capturing statistical properties of various unknown factors that are influencing the dynamics"? Why is this so effective? Is there any deep reason, or even a deep theorem, why this should be so?
08-14-2008 , 03:42 AM
I'm a civil engineering major. Is having a minor in statistics going to help at all?
08-14-2008 , 02:47 PM
Quote:
Originally Posted by thylacine
Why is it that you can "consider the probabilistic model as simply capturing statistical properties of various unknown factors that are influencing the dynamics"? Why is this so effective. Are there any deep reason, or even deep theorems, why this should be so.
The theorems of probability theory speak only to the mathematics, which is essentially independent of whatever interpretation you wish to attach to your mathematical model. But let me illustrate what I said before with the simple example of a (potentially biased) coin flip.

The standard model would be the probability space (Ω,F,P), where Ω = {0,1} (the sample space), F = {∅,{0},{1},Ω} (the σ-algebra of events), and P (the probability measure) is given by P(∅) = 0, P({0}) = p, P({1}) = 1 - p, and P(Ω) = 1, where p ∈ [0,1] is some fixed real number. The value 0 corresponds to heads, the value 1 corresponds to tails, and we understand p to denote the probability of heads.

Now suppose we take the position that the actual coin flip is deterministic, governed by the laws of physics. If we knew the exact initial conditions, we could compute and infallibly predict the result of the coin flip. The randomness comes only from the fact that we do not know the exact initial conditions and/or we do not know the relevant laws of physics in their entirety. How might we connect this position with the model given above?

One possibility is this. There are n relevant, unknown initial conditions, x_1,...,x_n. Each x_j is an element of some set E_j. The (deterministic) dynamics of the coin flip are represented by a function f, whose domain is the product space

E_1 x ... x E_n,

and whose range is simply {H,T}. If we knew the initial conditions, (x_1,...,x_n), and we knew the function f, then the result of the coin flip would simply be f(x_1,...,x_n).

To connect this model with the original probability space, we simply identify the element 0 ∈ Ω with the set f^{-1}(H), and the element 1 ∈ Ω with the set f^{-1}(T). In this way, our ignorance about the initial conditions, (x_1,...,x_n), and the function f, has been transferred entirely to the unknown outcome in the probability space, ω ∈ Ω. The case ω = 0 corresponds to the possibility that the unknown initial conditions satisfy f(x_1,...,x_n) = H; the case ω = 1 corresponds to the alternative.

There is no deep philosophy here and nothing so far is at all controversial. The controversy begins when we ask questions about p. (What value, if any, should/can we assign to p? What, if anything, does the value of p correspond to in the real world?) But up until now, everything is completely straightforward. So the very mathematical structure of probability theory seems to fit quite naturally with the concept of hidden variables and unknown factors.
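The construction above can be made concrete in a toy sketch. Everything here is a made-up illustration: three binary initial conditions stand in for (x_1,...,x_n), a parity rule stands in for the deterministic dynamics f, and a uniform distribution on the initial conditions is assumed purely for the sake of computing a value of p.

```python
import itertools

# Toy hidden-variable model: three binary initial conditions x_1, x_2, x_3,
# and a deterministic "physics" f mapping them to heads or tails.
E = (0, 1)  # each E_j = {0, 1} for simplicity

def f(x1, x2, x3):
    # made-up deterministic dynamics: parity of the initial conditions
    return 'H' if (x1 + x2 + x3) % 2 == 0 else 'T'

# Identify omega = 0 with f^{-1}(H) and omega = 1 with f^{-1}(T).
preimage_H = [x for x in itertools.product(E, repeat=3) if f(*x) == 'H']
preimage_T = [x for x in itertools.product(E, repeat=3) if f(*x) == 'T']

# If (for illustration only) the unknown initial conditions were uniform,
# p = P({0}) would be the fraction of initial conditions landing in f^{-1}(H).
p = len(preimage_H) / 2**3
print(p)  # 0.5 for this particular parity dynamics
```

The point of the sketch is only the bookkeeping: the two preimages partition the space of initial conditions, so ω ∈ {0, 1} genuinely encodes everything we don't know about (x_1, x_2, x_3) and f.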

Quote:
Originally Posted by thylacine
FWIW, I don't believe in MWI, just as I don't believe in analagous "splitting realities" interpretations in other probabilistic models.

Nevertheless there are non-trivial questions about why it is that before a coin-flip there is 50% chance of heads, but afterwards it is 0% or 100%.
I agree that this issue raises non-trivial philosophical difficulties. The difference between this and MWI, in my opinion, is that several (different) answers have been seriously developed. People still argue over which are the "right" answers. But the individual interpretations have stood the test of time, they consistently fit within our mathematical framework of probability theory, and they have not so far exhibited any internal inconsistencies. I do not believe the same can be said for attempts to connect probabilistic interpretations with MWI.

Incidentally, in my opinion it is not necessarily true that "afterwards it is 0% or 100%." For example, suppose I shake a pair of dice in a closed, opaque cup. I then set the cup on the table without opening it. I am perfectly happy with the claim that the probability the dice show a sum of 5 is 1/9, even though the dice have already been rolled.
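The 1/9 for the dice in the cup is just a count over the 36 equally likely ordered rolls:

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # the 36 equally likely ordered rolls
favorable = [r for r in rolls if sum(r) == 5]  # (1,4), (2,3), (3,2), (4,1)
print(Fraction(len(favorable), len(rolls)))    # 4/36 = 1/9
```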
08-14-2008 , 02:54 PM
Quote:
Originally Posted by imjoshsizemore
I'm a civil engineering major. Is having a minor in statistics going to help at all?
I would like to recommend that you seriously consider a minor in math, instead of statistics. It might be more helpful to you, both in terms of what you learn and how it is perceived by potential employers. You might enjoy it more. And you might be able to take some probability and statistics as a math minor, which would essentially give you the best of both worlds.
08-14-2008 , 11:17 PM
Quote:
Originally Posted by jason1990
The theorems of probability theory speak only to the mathematics, which is essentially independent of whatever interpretation you wish to attach to your mathematical model. But let me illustrate what I said before with the simple example of a (potentially biased) coin flip.

The standard model would be the probability space (Ω,F,P), where Ω = {0,1} (the sample space), F = {∅,{0},{1},Ω} (the σ-algebra of events), and P (the probability measure) is given by P(∅) = 0, P({0}) = p, P({1}) = 1 - p, and P(Ω) = 1, where p ∈ [0,1] is some fixed real number. The value 0 corresponds to heads, the value 1 corresponds to tails, and we understand p to denote the probability of heads.

Now suppose we take the position that the actual coin flip is deterministic, governed by the laws of physics. If we knew the exact initial conditions, we could compute and infallibly predict the result of the coin flip. The randomness comes only from the fact that we do not know the exact initial conditions and/or we do not know the relevant laws of physics in their entirety. How might we connect this position with the model given above?

One possibility is this. There are n relevant, unknown initial conditions, x_1,...,x_n. Each x_j is an element of some set E_j. The (deterministic) dynamics of the coin flip are represented by a function f, whose domain is the product space

E_1 x ... x E_n,

and whose range is simply {H,T}. If we knew the initial conditions, (x_1,...,x_n), and we knew the function f, then the result of the coin flip would simply be f(x_1,...,x_n).

To connect this model with the original probability space, we simply identify the element 0 ∈ Ω with the set f^{-1}(H), and the element 1 ∈ Ω with the set f^{-1}(T). In this way, our ignorance about the initial conditions, (x_1,...,x_n), and the function f, has been transferred entirely to the unknown outcome in the probability space, ω ∈ Ω. The case ω = 0 corresponds to the possibility that the unknown initial conditions satisfy f(x_1,...,x_n) = H; the case ω = 1 corresponds to the alternative.

There is no deep philosophy here and nothing so far is at all controversial. The controversy begins when we ask questions about p. (What value, if any, should/can we assign to p? What, if anything, does the value of p correspond to in the real world?) But up until now, everything is completely straightforward. So the very mathematical structure of probability theory seems to fit quite naturally with the concept of hidden variables and unknown factors.

...

I agree that this issue raises non-trivial philosophical difficulties. The difference between this and MWI, in my opinion, is that several (different) answers have been seriously developed. People still argue over which are the "right" answers. But the individual interpretations have stood the test of time, they consistently fit within our mathematical framework of probability theory, and they have not so far exhibited any internal inconsistencies. I do not believe the same can be said for attempts to connect probabilistic interpretations with MWI.

Incidentally, in my opinion it is not necessarily true that "afterwards it is 0% or 100%." For example, suppose I shake a pair of dice in a closed, opaque cup. I then set the cup on the table without opening it. I am perfectly happy with the claim that the probability the dice show a sum of 5 is 1/9, even though the dice have already been rolled.
(BTW, how are you getting these symbols. I'm cutting and pasting them from yours.)

What you are saying is what I expected. It seems that, above, you need a probability distribution on E_1 x ... x E_n, and you then sum or integrate over it to get the probability distribution for the coin flip. In other words, a probability distribution has to be fed into the model at some point. So the question is: how do you pick that initial probability distribution? (I suppose you can use symmetry, but then, why such a symmetry? Or you can just appeal to the fact that the model works effectively.)

Another point of view is that you can experimentally measure p by repeatedly flipping the coin. But then the question is: why do we live in a universe in which the concept of repeating an experiment even makes sense at all? (Again, symmetry is relevant, but again, why such symmetry?)

You can see where I'm coming from here. The maths is fine, but what is the actual nature of reality?
08-15-2008 , 08:52 AM
Here's another question. Can a truly fundamental model of physics (as opposed to an "effective" model, i.e. a mathematical model that exactly describes the universe) involve probability?
08-15-2008 , 09:17 AM
Quote:
Originally Posted by thylacine
(BTW, how are you getting these symbols. I'm cutting and pasting them from yours.)
Alt codes, when possible, and Unicode for some mathematical symbols and Greek letters.

Quote:
Originally Posted by thylacine
In other words, a probability distribution has to be fed into the model at some point. So the question is, how do you pick that initial probability distribution.
As I said, the controversy begins when we ask questions about p. Now you are asking questions about p. How you pick the distribution is going to depend on what you think probabilities are, philosophically.

In my view, your question is analogous to someone who says: "I understand logic just fine. I can see how, given the premises and the truth values, we can manipulate them and generate various conclusions. But at some point, we have to feed truth values into the model. How do we pick those initial truth values? How do we decide which premises are true and which are false?" Of course, logic itself says nothing about this -- just as probability theory says nothing about how to assign probabilities, only how to manipulate them. It is also interesting to note that "truth" has more interpretations than "probability," so this question is not any easier.

Quote:
Originally Posted by thylacine
what is the actual nature of reality?
Okay, I know I said I would try to answer all questions, but this one might be a little beyond my capabilities. You might try asking these guys.
08-15-2008 , 09:51 AM
Quote:
Originally Posted by thylacine
Here's another question. Can a truly fundamental model of physics (as opposed to an "effective" model, i.e. a mathematical model that exactly describes the universe) involve probability?
It is difficult for me to even understand this question. But it seems like you are essentially asking if probabilities represent something objective and physical in the real world. In my last post, I linked to the Wikipedia article on probability interpretations. The frequency and propensity interpretations are the main physical interpretations. Personally, I lean toward the position that physical probabilities do not exist.
08-15-2008 , 10:12 AM
Okay, thanks for links. I see this is well-trodden ground (and a can of worms).
08-16-2008 , 11:14 AM
I understand that if the sample space Ω is countably infinite, then it is possible to have a probability measure P such that P(ω) > 0 for every outcome ω. A simple example is the Poisson distribution. If Ω is countable, then we typically take the σ-field to be the power set of Ω, correct?

If Ω is uncountable, can you provide some intuition or a simple proof as to why we cannot have P(ω) > 0 for uncountably many ω? Thanks.
08-17-2008 , 06:40 AM
Thanks for the help on the maths problem Jason, here's a stats one I'm also struggling with:

consider the estimator Tn = theta(n-2)

where theta = 1/sigma^2

from a N(0,sigma^2) distribution

I need to find the sampling variance of Tn...does that make any sense?

Is it found using E(X^2) - [E(X)]^2?
08-17-2008 , 10:19 AM
Quote:
Originally Posted by ncray
If Ω is uncountable, can you provide some intuition or a simple proof as to why we cannot have P(ω) > 0 for uncountably many ω?
Let S = {ω: P(ω) > 0}. We want to prove that S is countable. For this, we begin by observing that

S = S_1 ∪ S_2 ∪ S_3 ∪ ...

where S_n = {ω: P(ω) ≥ 1/n}. Next, we observe that S_n has at most n elements, since the probabilities cannot add up to more than 1. Hence, S is a countable union of finite sets, which means S itself is countable.
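The finiteness of each S_n can be seen concretely with the Poisson distribution mentioned earlier in the thread. This sketch uses an arbitrary rate λ = 4; every outcome k gets positive probability, yet for any threshold 1/n only finitely many outcomes clear it.

```python
from math import exp, factorial

lam = 4.0

def pmf(k):
    # Poisson(4) mass function: every k = 0, 1, 2, ... gets positive probability
    return exp(-lam) * lam**k / factorial(k)

# For each n, S_n = {k : P(k) >= 1/n} has at most n elements, since the
# masses sum to 1 -- even though S = {k : P(k) > 0} is countably infinite.
for n in (2, 20, 100):
    S_n = [k for k in range(100) if pmf(k) >= 1 / n]
    assert len(S_n) <= n
    print(n, len(S_n))
```

The union of the S_n over all n recovers S, mirroring the proof above: countably many finite sets, hence a countable S.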
08-17-2008 , 10:32 AM
Quote:
Originally Posted by philnewall
...does that make any sense?
Not completely. I could try to guess what is being asked, but I think that would be pretty inefficient. As with your previous question, I think you would be best helped with in-person assistance.
08-17-2008 , 11:31 AM
Quote:
Originally Posted by jason1990
Let S = {ω: P(ω) > 0}. We want to prove that S is countable. For this, we begin by observing that

S = S_1 ∪ S_2 ∪ S_3 ∪ ...

where S_n = {ω: P(ω) ≥ 1/n}. Next, we observe that S_n has at most n elements, since the probabilities cannot add up to more than 1. Hence, S is a countable union of finite sets, which means S itself is countable.
Awesome. Thanks a lot.

      