Sleeping Beauty Problem
This variant strives for the opposite goal. It creates a dependence on what has happened in the past.
Note that the payout structure for "pressing the button" is
-1 if heads
0 or 1 with some probabilities each on tails.
So if heads and tails were equally likely, pressing this button would clearly be -EV; yet the optimal play is to press half the time.
Consider scenario O1: after the experiment, somebody walks up, is told the rules of what just happened, asks "did she roll a 5?", and is answered truthfully. This scenario clearly works like your calculations show: P ≈ 1/3 when the answer is yes.
Now scenario O2: after the experiment, somebody walks up, is told the rules of what just happened, and has the exchange "Want to know a number she rolled?" "Sure." "She rolled a 5." Now we have a problem. I explained in post 664 why I think the answer is 1/2 here, but the reason that simply doing the same calculation from O1 doesn't apply (IMO) is important. The 20 statements "She rolled an x" you can hear form a partition of reality, but the events E''(x) don't form a partition of the probability space that's being used to model reality. So my conclusion is that the probability space doesn't model reality here. In this case, it's fixable by adding another dimension of (M/T) on top of t x 1..20 x 1..20 to cover which number you're being told when it's tails. Now those 20 statements actually partition the new probability space, so it's reasonable to consider it a model of reality, and you get .5 again. And obviously the yes/no answers to "did she roll a 5?" in O1 partition both reality and the space.
Now back to SB: She has the die in hand, and she knows the 20 possibilities "I am looking at a x right now" partition reality, but they again don't partition the probability space, so I conclude that the probability space doesn't model reality here either. In this case though, I don't believe (for #661 reasons) that it's possible to fix it by just changing the space. So representing belief here, if it's possible, apparently requires something other than pure probability theory.
The result of this roll is not a random variable. There are, however, two natural random variables that we could define. X1 is what is rolled on Monday. X2 is what is rolled on Tuesday if the coin is tails, and 0 otherwise. In terms of these random variables,
E'' = {X1 = 5} ∪ {X2 = 5}.
1. Say the coin is tails, it is Monday, and we roll a 2, and on Tuesday we roll a 5. Then E'' is true even though we are looking at a 2. I don't like this.
2. I think it's legit to have random variables here for SB in the sense that PTB and I were thinking about them. For example, she rolls a d20 right now and doesn't look at it; what is the chance that the top face is a 20? I think it's perfectly legit for her to say that chance is 1/20.
I think it was Spock who said, "Probability is the beginning of wisdom, not the end." Or something like that.
The sentence q = "The top face is a 20 right now" does not correspond to an object in T. Therefore, P(q) is undefined. Again, see Post #661 for details.
Originally Posted by PairTheBoard
"An objective outside observer could be provided this same E''(X) information. Suppose the experiment is run to conclusion and the outside observer is told nothing about what happened except for the information E''(X), "Beauty rolled an X during the experiment". As it happens, in this case he is informed E''(5), "Beauty rolled a 5 during the experiment". Should the outside observer now also change his credence for heads to 20/59?"
We don't really need Sleeping Beauty to do this. We can simply construct the following proposition bet. Flip a fair green coin. If heads roll a 20 sided die once. If Tails, roll the die two times.
Suppose after this is done it's reported to you, by way of a well defined procedure of which you have no knowledge, that for a value b, E''(b) is true, i.e. "The value b was rolled by a die in the experiment".
For example, the reporting procedure might be as follows: If only one die value is rolled that value is reported. If two die values are rolled, a fair red coin is flipped and if heads the higher die value is reported, if tails the lower die value is reported. So this procedure might be used to report E''(b) to you but you do not know that's the procedure. All you know is that E''(b) is true.
So, upon hearing that E''(b) is true would you be willing to bet that the green coin landed tails and give better than 1-1 odds, up to 39-20 odds? Would you be willing to make that bet for numerous runs of this experiment?
PairTheBoard
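A quick Monte Carlo sketch of this proposition bet (the red-coin reporting procedure described above, with uniform choice of which value the red coin favors) suggests the answer is no: conditioned on hearing any particular value b, tails is still about a 1-1 shot, so laying up to 39-20 odds on tails loses money.

```python
import random

rng = random.Random(0)

def run_experiment():
    # flip the fair green coin: True means tails
    tails = rng.random() < 0.5
    if not tails:
        report = rng.randint(1, 20)  # heads: one roll, reported directly
    else:
        i, j = rng.randint(1, 20), rng.randint(1, 20)
        # tails: the red coin decides whether the higher or lower value is reported
        report = max(i, j) if rng.random() < 0.5 else min(i, j)
    return tails, report

results = [run_experiment() for _ in range(400_000)]
told_5 = [tails for tails, report in results if report == 5]
frac_tails_given_5 = sum(told_5) / len(told_5)
print(round(frac_tails_given_5, 2))  # close to 1/2
```

By a symmetry argument, P(report = b | tails) = 1/20 for every b under this particular procedure, matching P(report = b | heads), which is why the estimate hovers near 1/2 rather than 39/59.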
But they do form a partition - she just doesn't know which partition it is. It is either
{"Beauty looks at a x on Monday": x in {1,2,...,20}}, or
{"Beauty looks at a x on Tuesday": x in {1,2,...,20}}.

These are different partitions. The first partitions the probability space. The second partitions the event t.
Let J* be the new background information that includes everything relevant that was just added. If E''(b) is as before, and
D(b) = "The value b was selected from among the numbers rolled during the experiment, and reported.",

then I would assign P(h | E''(b) & J*) = 20/59 and P(h | D(b) & J*) = 1/2. I have already demonstrated the first assignment. For the second, I would begin by considering

O(b) = "The value b was rolled only once during the experiment."

I would then appeal to the indifference principle to assign

P(D(b) | t & O(b) & J*) = 1/2.

(If J* included information about an exchangeable sequence of such experiments, then I would include a parameter to allow this probability to be updated between experiments. But I think this takes us too far astray from the discussion.) I would then compute

P(t & D(b) | J*)
= P(t & D(b) & ~O(b) | J*) + P(t & O(b) | J*)P(D(b) | t & O(b) & J*)
= P(t & ~O(b) | J*) + P(t & O(b) | J*)(1/2)
= (1/2)(1/400) + (1/2)(19/200)(1/2)
= 20/800 = 1/40.

Since P(h & D(b) | J*) = P(h & O(b) | J*) = (1/2)(1/20) = 1/40, we have

P(t | D(b) & J*) = (1/40)/(1/40 + 1/40) = 1/2,

and this implies P(h | D(b) & J*) = 1/2.
Notice how in the earlier information, J'', there is no number being selected and reported. When Beauty sees a 5 on the die, she is not seeing a result from a randomly selected awakening. There is no randomly selected awakening in J''.
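Both assignments can be checked with a quick simulation (a sketch; the uniform selection below is one concrete procedure consistent with the indifference assignment P(D(b) | t & O(b) & J*) = 1/2):

```python
import random

rng = random.Random(1)
b = 5
rolled_runs = rolled_heads = 0      # runs where b was rolled at all: E''(b)
reported_runs = reported_heads = 0  # runs where b was the selected, reported number: D(b)

for _ in range(1_000_000):
    heads = rng.random() < 0.5
    rolls = [rng.randint(1, 20)] if heads else [rng.randint(1, 20), rng.randint(1, 20)]
    if b in rolls:
        rolled_runs += 1
        rolled_heads += heads
    if rng.choice(rolls) == b:  # select uniformly from the rolled numbers and report
        reported_runs += 1
        reported_heads += heads

p_h_given_E = rolled_heads / rolled_runs      # near 20/59 ~ 0.339
p_h_given_D = reported_heads / reported_runs  # near 1/2
print(round(p_h_given_E, 3), round(p_h_given_D, 3))
```

Conditioning on "b was rolled during the experiment" and conditioning on "b was the number a reporter happened to select" really are different pieces of information, and the simulation reflects that.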
FMP
(Monday is Monday, Monday is Tuesday) = (Ω, Ø),

which gives 1(1/2) + 0(0) = 1/2, and if it's Tuesday, then it's the partition

(Tuesday is Monday, Tuesday is Tuesday) = (Ø, Ω),

which gives 0(1/2) + 1(0) = 0.
And you're right back where you started.
Yeah that was a brain to keyboard fail.
Do you think the PM idea is absurd? It seems to work with the dice variant as well.
Say before the experiment we flip a monday coin and a tuesday coin in secret and tell her this is part of the experiment.
On each day we reveal the result of that coin. Here is a small picture of the space:
(*You can also imagine we had done the same thing with a d20; it would result in 20 red squares on the left and 39 on the right. I used a d2 because it made a smaller picture.)
Now basically when we are in E, we were in one of the red squares in this picture, and presumably they all have the same probability; that's how we were getting this 20/59 answer. But I feel like the yellow shaded squares have "half the chance", and I can't quite explain why.
I'm not sure it always works as an auction because if there are more people than goods, when I have twice as many points as everyone else, I can buy 1 thing nobody can, but they can make me use up more than half my points to do it and keep me from buying anything else.
You can say that the expected values of the amounts in the two envelopes are equal from the problem statement alone, without regard to the distribution.
I don't think you meant to direct this at me.
Now it's time to read the rest of the answers because they look interesting; this thread never dies.
P(heads) = P(heads|Mon)*P(Mon) + P(heads|Tues)*P(Tues)
= (1/2)*(2/3) + 0*(1/3)
= 1/3.
Even if you don't like the frequency argument
She wakes up and realizes that if a tail were thrown, this can only be Monday, but if a head were thrown, this could be either of 2 awakenings, only one of which is Monday. So it's a 2-to-1 favorite to be Monday, or P(Mon) = 2/3.
As it's 1/2 + 1/2*1/2 to 1/2*1/2 = 3 to 1.
Imagine we flip a coin and put a blue ball in the bag if heads, but a blue ball and a green ball if tails. Now we flip the coin, put the ball(s) in, and you pull a ball out of the bag at random. What are the chances it's the blue ball?
P(heads) = P(heads|Mon)*P(Mon)
I like the frequency argument but it points to the 1/2 answer. Number of heads awakenings / number of all experiments. It's as simple as that. 1/3'ers were challenged again and again to describe an experiment which leads to the 1/3 answer on a frequency basis and nobody could come up with anything.
100 experiments. SB is awakened 150 times on average. 50 times on heads, and 100 times on tails. Thus, 1/3.
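That per-awakening count is easy to sketch in code (an illustrative simulation, not part of the original posts):

```python
import random

rng = random.Random(2)
heads_awakenings = total_awakenings = 0
experiments = 100_000

for _ in range(experiments):
    heads = rng.random() < 0.5
    # heads: one awakening (Monday); tails: two awakenings (Monday and Tuesday)
    n = 1 if heads else 2
    total_awakenings += n
    if heads:
        heads_awakenings += 1

frac = heads_awakenings / total_awakenings
print(round(frac, 3))  # about 1/3 of awakenings follow heads
```

Note the denominator is awakenings, not experiments; dividing by experiments instead recovers the 1/2 count from the previous post, which is exactly the disagreement.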
I don't think it's correct. If anything this reasoning leads to 3/4 for Monday.
As it's 1/2 + 1/2*1/2 to 1/2*1/2 = 3 to 1.
Imagine we flip a coin and put a blue ball in the bag if heads, but a blue ball and a green ball if tails. Now we flip the coin, put the ball(s) in, and you pull a ball out of the bag at random. What are the chances it's the blue ball?
0.75 * 2/3 = 1/2
If you only draw one ball out of the bag, then it is not the SB example because on tails you leave a ball in the bag. That is the case where SB is only asked the question on either Monday or Tuesday, not both. Then the probability is 1/2. That is the point that keeps confusing you and many others with this problem. When you understand that, you will understand 1/3.
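The distinction can be sketched numerically: pulling a single ball is one question per flip, while inspecting every ball in the bag mirrors Beauty being questioned at every awakening (an illustrative Python sketch):

```python
import random

rng = random.Random(3)
trials = 100_000
one_draw_blue = 0
all_balls = []  # every ball in the bag gets inspected (the SB analogue)

for _ in range(trials):
    heads = rng.random() < 0.5
    bag = ["blue"] if heads else ["blue", "green"]
    if rng.choice(bag) == "blue":  # pull one ball out at random
        one_draw_blue += 1
    all_balls.extend(bag)

p_one_draw = one_draw_blue / trials                    # near 3/4
p_per_ball = all_balls.count("blue") / len(all_balls)  # near 2/3
print(round(p_one_draw, 2), round(p_per_ball, 2))
```

The single-draw estimate answers a different question (one ball, possibly leaving one behind) than the per-ball estimate, which is the point the quoted reply is making.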
I think this latest analysis by jason1990 ought to be published. Even though he concludes that in the original problem, J, Beauty gains no new information from an awakening so should maintain a credence of 1/2, imo his altered version, J'', in which Beauty rolls a twenty-sided die (d20), provides huge motivation for the 1/3 position. It does not show how a "random awakening" is produced by the experiment with probabilities about 1/3 each for (h,M), (t,M), (t,T), as I've suggested itt ought to be done by 1/3ers. But J'' does provide a rigorous nonindexical probability model which shows that Beauty's situation within the experiment produces a unique state of known information for her which forces a Bayesian change of credence to about 1/3 from her perspective.
Furthermore, the law of large numbers applies to this Bayesian credence of about 1/3 in the standard way, per experiment. No appeal to frequency per awakening is required. For example, if Beauty awakens and rolls a 5 she knows a 5 is rolled in this experiment and she computes the conditional probability P(heads | "a 5 is rolled in this experiment") = 20/59. The law of large numbers applies to this conditional probability in that if we look at experiments in which a 5 is rolled by Beauty, out of every 59 such experiments on average 20 will be with heads and 39 will be with tails.
What I find very interesting is the way the d20 die in J'' provides a way of identifying and separating the two awakenings in a nonindexical way - without referring to "this awakening right now". In the nonindexical models J and J'', the statement "Beauty awakens" is encoded in the models as "On Monday OR on Tuesday Beauty experiences an awakening." That is an event in the model with probability 1. So conditioning on the event "Beauty awakens" changes none of the probabilities in the model. P(h) = 1/2. Furthermore, the "OR on Tuesday" part of the statement doesn't really add anything to it because the event "Beauty awakens on Tuesday" is contained in the event "Beauty awakens on Monday". P(Beauty awakens on Monday) = 1 in the nonindexical models J and J''.
But look what happens when "Beauty awakens and rolls a 5". This now is encoded in the nonindexical model J'' as,
"On Monday Beauty awakens and rolls a 5" OR "On Tuesday Beauty awakens and rolls a 5"
Now the "OR" does add information. The event "On Tuesday Beauty awakens and rolls a 5" is NOT contained in the event "On Monday Beauty awakens and rolls a 5". The d20 die roll in J'' distinguishes a Tuesday die roll from a Monday die roll in the nonindexical model whereas that kind of distinction cannot be made in the model for just the facts of an awakening happening on either day.
Furthermore, if "Beauty awakens and rolls a 5" then she knows the truth of the event E''(5) = "On Monday or Tuesday a 5 is rolled" in a way unique to her perspective. She knows it because she just rolled it. And had she rolled any number i=1,2,...,20 she would have known the truth of the event E''(i) = "On Monday or Tuesday an i is rolled" and she would compute the conditional probability of heads as 20/59. What's unusual to her perspective is that when rolling a 5, while she knows she didn't roll a 10 she does not know ~E''(10). Because of her amnesia she can never know ~E''(i) for any i.
Compare Beauty's unique perspective for learning the truth of an event E''(i) = "On Monday or Tuesday an i is rolled" to how that event might be reported to an outside observer. When the coin lands tails and two different values i and j are rolled, you would have to somehow decide whether to report E''(i) or E''(j) to the outside observer. The outside observer cannot learn E''(i) the same way Beauty learns E''(i). Beauty does not choose between E''(i) and E''(j). She learns E''(i) when she rolls the i and she learns E''(j) when she rolls the j. She has a unique perspective on the information, a kind of unique partial knowledge.
Beauty's situation inside the experiment reminds me of the way they talk about quantum states. Before she rolls the d20 die she seems to be in a kind of indeterminate state wrt heads-tails. So maintaining 1/2 credence for each. Then rolling and observing the d20 die acts to partially determine the heads-tails state for her, allowing the credence update to 20/59. If there's anything to this view then maybe Sleeping Beauty has more to teach us than we realize.
PairTheBoard
From a practical perspective, Sleeping Beauty doesn't even need a die. Perhaps she wakes up and observes that the interviewer is standing exactly 32.1 inches from her when he asks her the question. Or perhaps she observes that she has an itch on her right cheek exactly 1 minute and 18 seconds after waking. If her prior credence for these events happening on both days is much smaller than her prior credence for them happening on a given day, then her posterior credence for heads will be close to 1/3.
First, to go back a bit, consider the following two English sentences:
a = "Today is Monday."
b = "Today is jason1990's 50th birthday."

What can we infer from these two sentences? Clearly, we can infer

E = "jason1990's 50th birthday is Monday.",

and this sentence is formally represented in our logical system from Post #661 by the symbolic proposition "B = 1". Can we infer anything else from a and b? If we stay within our formal system, then no. There is nothing else that they imply.
And yet, we may have an informal, intuitive sense that they say more than this. It feels like they say not only that jason1990's 50th birthday is Monday, but also that we are somehow in the birthday, whatever that might mean. It is as if we are immaterial objects floating through time, sequentially occupying the various stages of the bodies we live in as they change. Or maybe it means something else entirely, which I am not creative enough to imagine or describe. But whatever it might mean, even if it is true, we cannot say it in the formal system with which we do rigorous logic and probability. And so we are left with nothing more than the objective fact that jason1990's 50th birthday is Monday.
In the Sleeping Beauty problem, if one says that Beauty's credence upon waking is 1/3, then one must state what Beauty's new evidence is for the change. Sometimes people answer by saying that she knows:
a = "Beauty is awake today."

But as we see in Post #661, this sentence cannot correspond to any proposition in our formal system. However, Beauty might combine a with another subjective observation:
b = "The events X happened today.",

where X is a complete, detailed description of everything she observes during this awakening. It includes the fact that the interviewer stood 32.1 inches from her. It includes the itch on her cheek 1 minute and 18 seconds after waking. It includes the fact that she blinked 374 times while awake. It includes everything objective and observable. If Beauty combines a and b, then she obtains the objective proposition:
E = "The events X happened while Beauty was awake."

If X is sufficiently detailed, then we will have

P(E | h & J) = ε and P(E | t & J) ≈ 2ε,

where ε is very small. In this case, we get P(h | E & J) ≈ 1/3. This is only approximate. The actual probability would be negligibly larger than 1/3 because of the astronomically small chance that the two awakenings would play out exactly the same, down to every observable detail.
In practice, it would be perfectly natural to consider the limiting situation as ε -> 0. When we do this, we are effectively operating under the assumption that every day is unique. This is how we self-locate, to use the terminology of some of the philosophers. If we are doing rigorously correct logic and probability, then we do not self-locate using some nebulous concept of observer-moments or any other abstract, ghost-in-the-machine type reasoning. We self-locate by making objective observations of our environment, and employing the assumption that these observations uniquely locate us in space and time. This assumption can be justified when the observations we make have extremely small prior probabilities.
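The limiting behavior is easy to verify numerically, assuming P(E | t & J) = 1 - (1 - ε)^2 exactly (the chance X happens on at least one of the two awakenings):

```python
# Posterior for heads given E, with P(E | h & J) = eps and
# P(E | t & J) = 1 - (1 - eps)**2 = 2*eps - eps**2.
# Algebraically this simplifies to 1/(3 - eps).
def posterior_heads(eps):
    return (0.5 * eps) / (0.5 * eps + 0.5 * (2 * eps - eps ** 2))

for eps in (0.5, 0.01, 1e-6):
    print(eps, posterior_heads(eps))  # tends to 1/3 from above as eps -> 0
```

The posterior is always slightly above 1/3, matching the remark that the exact answer is "negligibly larger than 1/3".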
As one final application, consider what happens to Beauty if there is no coin. She will just be woken on both Monday and Tuesday, for sure. When she wakes up, she would like to know the probability of
m = "Today is Monday."

Her evidence is

e = "Beauty is awake today."

Neither of these are formal, objective propositions. But she can combine them with

b = "The events X happened today.",

where X is a complete, detailed description of everything she observes during this awakening. Doing this, she is then led to compute the probability that

M = "The events X happened on Monday.",

given

E = "The events X happened while Beauty was awake."

In this case,
P(M | E & J) = P(M & E | J)/P(E | J)
= P(M | J)/P(E | J)
≈ ε/(2ε)
= 1/2,

just as we would expect.
As an application of this objective approach to self-location, I would like to briefly talk about the so-called Doomsday Argument (DA). Let N denote the total number of humans that will ever be born. (This includes all those that have been born in the past, as well as all those that will be born in the future.) Let pk = P(N = k). These are our prior probabilities for the possible values of N. These unconditional probabilities are all conditioned on some minimal set of background information, J, but I am suppressing this notation for simplicity.
Now suppose we learn
E = "There is an n-th human."

Since E = {N ≥ n}, we have P(N = k | E) = P(N = k | N ≥ n). There is nothing unusual here.
The DA suggests that we should instead condition on
e = "I am the n-th human."

From Post #661, however, e is not a formal proposition, so we cannot do this. However, the DA proceeds as though we can, and asserts that P(e | N = k) = 1/k. The reasoning is that if there are k total possible humans throughout all of time, then "I" am a priori equally likely to be any one of them. If e were indeed a valid proposition, then one could use this to show that P(N ≥ k | e) is strictly less than P(N ≥ k | E). For some prior distributions on N, the probability is reduced significantly. Note that {N ≥ k} is the event that the human race survives long enough to give rise to at least k humans. The conclusion is that this event is less likely than what we previously thought. Hence, the word "doomsday" is used to describe this argument.
But all of this is not rigorously valid, since e is not a formal proposition. We may, however, combine e with another subjective observation:
b = "I have characteristics X.",

where X is a complete, objective description of me, including my height, weight, eye color, a thorough description of the exact shape of my nose, and so on. When we combine e and b, we get the objective proposition:
E' = "The n-th human has characteristics X."

We now have, for k ≥ n,

P(N = k | E') = P(N = k)P(E' | N = k)/P(E')
= pkP(E' | N = k)/(pnP(E' | N = n) + pn+1P(E' | N = n+1) + pn+2P(E' | N = n+2) + ...)
= pkε/(pnε + pn+1ε + pn+2ε + ...)
= pk/(pn + pn+1 + pn+2 + ...)
= P(N = k | N ≥ n)
= P(N = k | E),

as it should be.
One interesting thing to note is that if we take the "Self-Indication Assumption Doomsday argument rebuttal", convert it into an objective argument using the methods described in this and the previous post, and simplify, then we end up with exactly what is written above. Note, however, that the linked argument has been criticized on the grounds that "it seems wrong to treat ourselves as if we were once immaterial souls harbouring hopes of becoming embodied, hopes that would have been greater, the greater the number of bodies to be created." In contrast, the argument presented in this post uses no subjective notions whatsoever. Everything can be framed in a formal, objective logical system, and the underlying notion is that self-location occurs via objective observations of a priori rare events and characteristics.
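The cancellation of ε in that computation can be checked numerically with a made-up prior (all numbers below are purely illustrative):

```python
n = 4            # we learn about the n-th human (illustrative choice)
eps = 1e-3       # assumed P(E' | N = k), the same for every k >= n
prior = {3: 0.1, 4: 0.2, 5: 0.3, 6: 0.25, 7: 0.15}  # hypothetical p_k

# Bayes with evidence E': P(N = k | E') = p_k * eps / sum_{j >= n} p_j * eps
evidence = sum(p * eps for k, p in prior.items() if k >= n)
post_Eprime = {k: (p * eps / evidence if k >= n else 0.0) for k, p in prior.items()}

# Plain conditioning on E = {N >= n}: P(N = k | N >= n)
tail = sum(p for k, p in prior.items() if k >= n)
post_E = {k: (p / tail if k >= n else 0.0) for k, p in prior.items()}

# eps cancels: conditioning on E' gives the same posterior as conditioning on E
print(post_Eprime == post_E or all(abs(post_Eprime[k] - post_E[k]) < 1e-12 for k in prior))
```

Because P(E' | N = k) is the same ε for every k ≥ n, it drops out of the ratio, which is the whole rebuttal in one line of algebra.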
Originally Posted by http://www.anthropic-principle.com/preprints/self-location.html
The Presumptuous Philosopher
It is the year 2100 and physicists have narrowed down the search for a theory of everything to only two remaining plausible candidate theories, T1 and T2 (using considerations from super-duper symmetry). According to T1 the world is very, very big but finite and there are a total of a 200 billion observers in the cosmos. According to T2, the world is very, very, very big but finite and there are a 200 trillion observers. The super-duper symmetry considerations are indifferent between these two theories. Physicists are preparing a simple experiment that will falsify one of the theories. Enter the presumptuous philosopher: “Hey guys, it is completely unnecessary for you to do the experiment, because I can already show to you that T2 is about a billion times more likely to be true than T1 (whereupon the philosopher explains the Self-Indication Assumption)!”
I believe the ε analysis in post 693 when applied to this situation would agree with the conclusion. Using an E like "A person observed events X", with probability ε for any one person observing events X, the relative probabilities for each population size N are ~N*ε, and the conclusion follows. That doesn't make me a happy panda.
It's basically SB with uncertainty over who you are vs. when you are. Each comes with a prior of .5: SB H/T .5, PP T1/T2 .5. The probability space for PP would be 200b entries of 200b-element vectors of saw/didn't-see and 200t entries of 200t-element vectors of saw/didn't-see, and the probability of the (all-didn't-see vector | T1) would be .5*(1-ε)^200b ≈ .5*(1-200b*ε), so P(somebody saw X | T1) ≈ 200b*ε, while for T2 it would be ≈ 200t*ε.
Unless you can find a way to translate "I saw X" into more than "somebody saw X" in this probability space, you run into that conclusion. Or in SB terms, translating "I am watching X happen right now" into more than "X happened one of these days".
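The approximation in that argument can be sketched numerically; the value of ε below is a placeholder, and the observer counts are the ones from the quoted thought experiment:

```python
import math

eps = 1e-18            # assumed tiny chance any one observer sees X (placeholder)
N1, N2 = 2e11, 2e14    # observers under T1 (200 billion) and T2 (200 trillion)

def p_somebody_saw(n):
    # 1 - (1 - eps)**n, computed stably for tiny eps via log1p/expm1
    return -math.expm1(n * math.log1p(-eps))

p1, p2 = p_somebody_saw(N1), p_somebody_saw(N2)
posterior_T2 = 0.5 * p2 / (0.5 * p1 + 0.5 * p2)
print(round(p2 / p1), round(posterior_T2, 3))  # likelihood ratio ~ N2/N1
```

For ε this small, 1 - (1-ε)^N ≈ Nε, so the likelihood ratio is roughly N2/N1 and translating "I saw X" as merely "somebody saw X" hands T2 an enormous posterior, exactly the presumptuous conclusion.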
Originally Posted by http://www.anthropic-principle.com/preprints/self-location.html
The Presumptuous Philosopher
It is the year 2100 and physicists have narrowed down the search for a theory of everything to only two remaining plausible candidate theories, T1 and T2 (using considerations from super-duper symmetry). According to T1 the world is very, very big but finite and there are a total of a 200 billion observers in the cosmos. According to T2, the world is very, very, very big but finite and there are a 200 trillion observers. The super-duper symmetry considerations are indifferent between these two theories. Physicists are preparing a simple experiment that will falsify one of the theories. Enter the presumptuous philosopher: “Hey guys, it is completely unnecessary for you to do the experiment, because I can already show to you that T2 is about a billion times more likely to be true than T1 (whereupon the philosopher explains the Self-Indication Assumption)!”
"A person (i.e. an observer somewhere in the universe) observed events X,"since we know more than that. We should instead, at the very least, condition on
"A human being on Earth observed events X."But the latter proposition would have the same small prior probability ε, regardless of whether the population of the universe is 200 billion or 200 trillion.
Sure, but you can edit the example to something in the spirit of the problem where you have no difference in objective knowledge about the two possible groups except for their size, no ability to objectively locate yourself in a subgroup of size >1, and some reason to believe that P(N people existing) = P(1000N people existing) = .5.
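To make the contrast between the two conditioning choices concrete (my own toy numbers, nothing more): the weak evidence "somebody saw X" scales with population size, while the specific evidence carries the same ε under either theory.

```python
eps = 1e-18                      # prior probability of the specific observation
N1, N2 = 200 * 10**9, 200 * 10**12

# Weak evidence "somebody in the universe saw X": its probability grows
# with N, so equal priors turn into ~1000:1 posterior odds for T2.
weak_odds = (N2 * eps) / (N1 * eps)

# Specific evidence "this particular observer saw X": probability eps
# under both theories, so the posterior odds stay 1:1.
strong_odds = eps / eps

print(weak_odds)    # ~ 1000
print(strong_odds)  # 1.0
```

The whole dispute is over which of these two likelihoods the evidence actually licenses you to condition on.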
An awful lot of posts to what seems like an easy question, overcomplicated unnecessarily. Way too much to read.
It all boils down to one question--that is, what question are you asking?
If you want to know the chances of heads or tails of the coin flip, it's 50/50.
If you want to know what side to bet given the parameters of the original question, it's tails, because tails will occur 2/3 of the times when Sleeping Beauty is questioned. But that doesn't change the odds of the coin flip. It effectively changes the odds being offered (even money for heads, 2:1 for tails), because the question will be asked twice as often for a tails result than for a heads result.
I apologize if this has already been stated (possibly ad nauseam).
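That split between "odds of the flip" and "odds per questioning" shows up directly in a quick simulation. This is my own sketch of the standard protocol: tails means she is questioned twice (Monday and Tuesday), heads means once.

```python
import random

random.seed(0)
flips = 100_000
tails_flips = 0
awakenings = 0
tails_awakenings = 0

for _ in range(flips):
    tails = random.random() < 0.5
    if tails:
        tails_flips += 1
        awakenings += 2          # questioned Monday and Tuesday
        tails_awakenings += 2
    else:
        awakenings += 1          # questioned Monday only

print(tails_flips / flips)            # ~ 1/2: the coin itself is fair
print(tails_awakenings / awakenings)  # ~ 2/3: tails frequency per questioning
```

Both numbers are correct answers; they just answer different questions, which is the point of the post above.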