Ask a probabilist

05-25-2011 , 10:40 PM
An added wrinkle: if two (or three, etc.) civilizations do contact each other, how much actual communication would be possible? There is no universal language (mathematics may qualify, or simple images?) and translation/decoding may be difficult, or only rudimentary communication possible. You may be at a party with only limited ability to peek under the skirt of that sexy alien you have an eye on. Very frustrating, I think.

-Zeno
06-07-2011 , 04:09 AM
This is one awesome thread. Wish Jason would post some more as I respect his knowledge and wit. Kind of rare is the person who possesses both.
06-07-2011 , 02:01 PM
Quote:
Originally Posted by All-In Flynn
I'm told that by conservative estimates, there are at least 200bn stars in our galaxy, and at least 100bn galaxies in the universe. Assuming our galaxy is broadly typical, this means there are at least 2 × 10^22 stars in the universe. Let's say that only one star in a hundred thousand has a planet capable of supporting life. Let's say only one planet in a thousand capable of supporting life goes on to develop it. Let's say that only one planet in a thousand that develops life goes on to develop life-forms capable of broadcasting radio signals into outer space.

That leaves 2 × 10^11 species in the universe broadcasting radio signals into outer space.

...

What I'm wondering is whether there's any way to calculate a rough probability that somewhere, at some time, two species have made contact with one another.

...

a separate issue would be the probability that somewhere in the universe at least one species has received radio transmissions from another - whether the transmitting species still exists or not.
There are at least a few important parameters that were not mentioned. For instance, when a broadcast-capable species emerges, for how long does it continue to transmit signals? When a transmitted signal enters the vicinity of a broadcast-capable species, what is the probability that they receive the signal and interpret it as having come from another broadcast-capable species, and how does this probability depend upon their distance from the source? In other words, how close in space and time must two species be for them to make contact?

To continue your analogy with the birthday problem, you have considered only the analogue of 24 people being in the room. You must also consider the analogue of there being 365 days in the year.
06-07-2011 , 02:21 PM
Quote:
Originally Posted by jason1990
There are at least a few important parameters that were not mentioned. For instance, when a broadcast-capable species emerges, for how long does it continue to transmit signals? When a transmitted signal enters the vicinity of a broadcast-capable species, what is the probability that they receive the signal and interpret it as having come from another broadcast-capable species, and how does this probability depend upon their distance from the source? In other words, how close in space and time must two species be for them to make contact?
You're right of course - I was hoping some kindly soul might fill in the blanks as even if I know all the parameters, I'm not physics-savvy enough to assign them meaningful values.

I'll have a think about it.

Quote:
To continue your analogy with the birthday problem, you have considered only the analogue of 24 people being in the room. You must also consider the analogue of there being 365 days in the year.
24/365 ≈ 0.0658

(2 × 10^11) / (2 × 10^24) = 1.0 × 10^-13

So yeah; I guess it's not really looking all that likely at first glance.

Hmmm.
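For what it's worth, dividing the two counts only captures a single pair, like asking whether two *specific* people share a birthday. A sketch of the full birthday-style calculation, taking the 2 × 10^11 broadcasters and the 2 × 10^24 "days" figure from the posts above as given:

```python
import math

m = 2e11   # broadcasting species, from the estimate quoted above
N = 2e24   # "days" in the birthday analogy, the figure used above

# Birthday approximation: P(no two share a bin) ~ exp(-m*(m-1)/(2*N)),
# valid when m is much smaller than N.
p_collision = 1 - math.exp(-m * (m - 1) / (2 * N))
print(p_collision)   # ~0.01, i.e. about 1%
```

Under these assumed inputs, the chance of at least one coincidence comes out near 1%, far larger than the 10^-13 pairwise ratio, which is exactly the point of the birthday analogy.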
06-07-2011 , 04:14 PM
I will try to get to this later with a much better approach than the one I will roughly describe here. I will assume your number of 2*10^11, even though I think intelligent life is even harder than you calculate. Additionally, it is my firm conviction that whoever is first typically goes on to live forever, expanding to the entire galaxy and finding all its systems within 100-200k years after the era of electromagnetic communication starts. After that it only takes 1-2 million years to reach the next galaxy using current physics, and probably less with more advanced knowledge not available now. But even if we forget about better physics and take light speed as the only barrier, you can expect the first to occupy the whole galaxy within 200k years and then, every 1-2 million years, to reach another galaxy and do the same. Basically, after 1-2 billion years the first has traveled to billions of galaxies and explored all of them (in a branching manner, of course, not coordinated from a common center).

But I will set that aside and hopefully discuss it better in another post.

Regarding what you said, one has to treat this as an occupancy problem, but you distribute the 2*10^11 civilizations not over 10^11 galaxies but over spacetime (4d, not 3d). Imagine each galaxy's history divided into segments of, say, 10k years during which communication happens, taking the pessimistic scenario that civilizations die 10k years after their first high-tech era. What good is it to have 3 civilizations in a galaxy over 10 billion years if they are separated in development by billions of years? They never exist in the same "time range" (at least in the relativistic sense of it all). So imagine we break down each galaxy (take only the last 10 billion years) into 10^10/10^4 = 10^6 time bins; we therefore distribute 2*10^11 civilizations into 10^11*10^6 = 10^17 spacetime bins. The number of civilizations per bin then follows a Poisson distribution with a very small mean, and, for example, the chance of having 2 in one bin is:

(1/2)*(2*10^11/10^17)^2*e^(-2*10^11/10^17) ≈ 2*10^-12.

Since we have 10^17 bins, the chance that none of them contains 2 or more is basically (1 - 2*10^-12)^(10^17) = (1 - 1/(0.5*10^12))^(0.5*10^12 * 2*10^5) = e^(-2*10^5), which is a ridiculously small number. So the chance that at least one spacetime bin contained 2 or more is essentially 1.
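A quick numerical sketch of the two steps above (the Poisson chance per bin, then the complement across all bins), using the same assumed figures:

```python
import math

n_civs = 2e11   # civilizations, the figure assumed above
n_bins = 1e17   # spacetime bins: 1e11 galaxies x 1e6 time bins
lam = n_civs / n_bins            # Poisson mean per bin, 2e-6

# P(a given bin holds 2 or more civilizations)
p2plus = 1 - math.exp(-lam) * (1 + lam)
print(p2plus)        # ~2e-12, matching the lambda^2/2 estimate

# log of P(no bin has 2+); log1p avoids underflow for tiny p2plus
log_p_none = n_bins * math.log1p(-p2plus)
print(log_p_none)    # ~ -2e5, i.e. P(none) ~ e^(-200000)
```

The `log1p` trick matters here: computing `(1 - 2e-12) ** 1e17` directly would lose all precision in floating point.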

So with your numbers, and even a modest 10k-year survival of the post-electromagnetism era, communications within one galaxy are possible (feel free to make it 50k years, which is the typical light-travel radius of a galaxy, or more).

The typical bin has basically 0 members, rarely 1 or 2, but because there are so many bins, the chance that none has a coincidence (2 or more) is astronomically small. Of course, in all this I assume that if they survive 10k or 50k years they become advanced enough that missing each other in the same galaxy is impossible: they typically survive as long as the light-travel radius of the galaxy, so they will get a signal at some point, or they would already have started expanding in their own galaxy in all directions (and if they did, they probably live a lot longer than 10k, 50k, or 100k years; they become immortal).

I believe I didn't make an error here in these large-number approximations to the Poisson distribution, or in answering the question of the chance that no bin has 2 or more.

So using your numbers, it seems that a communication is basically inevitable somewhere in the universe. (Of course, make your numbers, say, 10^6 times smaller and the argument breaks down, although if we increased the survival of civilizations to 50-100k or 1 million years we may arrive at similar inevitability again. Remember, I took a very conservative approach in which they all go extinct 10k years after developing electromagnetic communications; if they tend to live a lot longer, or even only some of them do, the calculation is much tougher but more promising too.)


However, to be honest, I am not so optimistic that your numbers are that high. I think the probability of the transition from life to advanced civilization is terribly small. Too many things typically go wrong.

That said, I will claim that whoever is first (in a proper relativistic sense) goes on to explore a huge part of the universe, and for this reason establishes contact anyway if they so wish. Locally, in a region that is big but still small compared to the whole universe, there will be little chance that another civilization is close in evolutionary development to the first one, so the first one has millions of years to explore thousands of galaxies. Basically, the moment the first one leaves their planet and starts expanding, it has effectively survived forever, spreading at roughly the speed of light. Over billions of years they go on to explore billions of galaxies around the original one, and so they force a communication on the others inevitably.

The problem, of course, is that the number 2*10^11 could still be millions of times smaller, making it possible that they are so far from each other that they almost never meet in the time the universe takes to expand. However, even then, I am positive that as long as 2 of them survive, even in very distant parts of the universe, they will eventually meet in the very distant future; of course, that meeting would be between the billions-of-years-old offspring of the originals.

Last edited by masque de Z; 06-07-2011 at 04:29 PM.
07-01-2011 , 11:02 AM
I would be extremely grateful if a probabilist would take a look at
this and show me a better way to condense the thought process. I am
open to all criticism, including the calculations.

Please tell me if this post would be more appropriate in the Probability
forum.

I'm trying to learn to think like a probabilist thinks, i.e., "this implies that
when or if some event happens", etc.

I use Excel extensively and use lookup and probability tables
(using the =PROB built-in function) to assess the probabilities of good
and bad scenarios in different situations and adjust my pot equity
accordingly. I'm not sure I'm using the correct analysis.

I'm open to all criticism.

Note: I did not want to include much about "opponent's tendencies...",
etc., but included a little to show how I arrived at certain probabilities.

Thanking you in advance!

NL Hold em

Opponent raises 3x BB UTG

Your hand: [Ah Jh] You call on button. All others fold.

Flop: [Kh 2h 4s] You flop a flush draw.

Your opponent has a tight UTG raising range when raising 3x BB:

AA = C(4,2) = 6 COMBOS
KK AK = C(8,2)-6 = 22 COMBOS
KQs = C(4,3) = 4 COMBOS
AQs = C(4,3) = 4 COMBOS
AJs = C(3,2) = 3 COMBOS (I have A, J holecards)
TT-QQ = C(4,2)*3 = 18 COMBOS
--------------
57 TOTAL COMBOS
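One way to tally combos like these programmatically is direct enumeration, which also makes card removal automatic (the hero's Ah Jh and the flop cards are dead). A sketch, not the tally above; note that enumeration with dead cards removed will give different counts for some hands than the hand tally:

```python
from itertools import combinations

deck = [r + s for r in "23456789TJQKA" for s in "shdc"]
dead = {"Ah", "Jh", "Kh", "2h", "4s"}     # hero's hand plus the flop
live = [c for c in deck if c not in dead]  # 47 cards remain

def count(pred):
    """Count two-card combos among live cards satisfying pred."""
    return sum(1 for a, b in combinations(live, 2) if pred(a, b))

aa = count(lambda a, b: a[0] == b[0] == "A")          # pocket aces
with_king = count(lambda a, b: "K" in (a[0], b[0]))   # any King in hand
print(aa, with_king)   # 3 135
```

For instance, with the hero's Ah removed, only 3 AA combos remain rather than the 6 listed above.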

Post-flop action: tight opponent makes a decent-sized bet.

Thus you are fairly certain he has AA or a King in his holecards.

Percentage of combos that contain a king: 26/57 = ~46%

This opponent doesn't make big C-Bets without a strong hand.

So you adjust the probability that opponent has AA or a King to 95%.

Estimate the probability opp. HAS a King and doesn't have AA:

There are six combos of AA and twenty-six combos of a King.

Probability that his holecards contain a King and NOT AA is:

Prob of A: 1 - C(4,2) / 26 = ~77%

And you're 95% sure that Prob of A is true (see adj. above)

So you estimate the probability that opponent has a King in his hand as:

95% * 77% = ~73%

Is this a reasonable analysis? I'd be especially interested in seeing how a
probabilist would go about solving this, using your language, which will
reveal your thought process.

The math shows his probability of having a King in his range is 46%, and
I more than doubled that figure, throwing that calculation out the
window. But an increase here is mandatory based on opponent's action
and on many years of playing experience.
Still, do you think this is out of line? TBH I wanted to make it 99%.

*******************************************************

I believe it's somewhat important to accurately estimate the probability
that your opponent has a King, as there are some ways you can make
the nut flush and must fold on the river. Every .1% advantage adds up
over time.

Here's one scenario:

A King comes on the turn and you make the flush on the river by
catching the 4h.

So the final board is:

[ Kh 2h 4s Kx 4h ], putting Kings-up on the board.

Event A = you miss the flush AND a King comes AND
you make flush AND it’s the 4h

Order matters so I solved using permutations and combinations.

Probability of Event A occurring: P(4,2) / C(47,2) = ~0.11%

or another way to solve:

Probability of Event A occurring: 38/47*3/47*1/46 = ~0.11%

Conclusion: there is at least a 0.11% probability the flush will hit but a
fold is in order.

This is just one bad scenario. Other bad scenarios can happen and thus
the inclusion of the words "at least" in the conclusion. And good scenarios
can also happen such as you catching J, J.

Is the math correct? Is the thinking correct?

The reason I'm so interested is that I do hundreds of these kinds of
calculations using Excel and VBA and write lots of functions. Your input
would give me a better perspective on how to go about setting up a
blueprint before writing a function. I've read through the whole thread,
but in most cases the replies are too deep for my understanding.

Thanks again.

Last edited by SMA775; 07-01-2011 at 11:14 AM.
07-20-2011 , 06:42 AM
A question about the word "random".

Suppose a computer "flips a coin at random" repeatedly: will the output always be periodic?
07-23-2011 , 10:22 PM
Quote:
Originally Posted by SMA775
I would be extremely grateful if a probabilist would take a look at
this and show me a better way to condense the thought process. I am
open to all criticism, including the calculations.

Please tell me if this post would be more appropriate in the Probability
forum.
Yes, I think it would be. If you have not yet found the answers you seek, then I would definitely suggest posting this in the probability forum.
07-23-2011 , 10:30 PM
Quote:
Originally Posted by lastcardcharlie
A question about the word "random".

Suppose a computer "flips a coin at random" repeatedly: will the output always be periodic?
I am not sure I understand your question. Are you asking about this:

Wiki - Pseudorandom number generator - Periodicity
07-23-2011 , 10:45 PM
Can you explain a little about the work of Srinivasa Varadhan? I think of all the Abel Prize winners, I know the least about what he achieved
07-24-2011 , 08:41 AM
Quote:
Originally Posted by jason1990
I am not sure I understand your question. Are you asking about this:

Wiki - Pseudorandom number generator - Periodicity
Yes, except I'm really trying to get at what an "infinite sequence" is. I don't agree with the standard definition as a function on the naturals because I don't believe there are any non-computable such functions. The existence of any random sequence would appear to falsify this belief.

The part on hardware random number generators in the article is interesting. I think the two methods of generating an infinite sequence:

(1) use a non-terminating pseudorandom number generator;
(2) keep on physically flipping a coin and never stop;

are fundamentally different, and that the output of the second does not qualify as an infinite sequence. I cannot articulate why very clearly, but I think it has something to do with the fact that all the terms of the first sequence are encapsulated by a rule, whereas with the second there is no way of determining what the (n+1)th term is before the nth term has been determined.

No specific question but I would be interested in any comments you might have.
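For what it's worth, the special status of method (1) can be made concrete: a pseudorandom generator has only finitely many internal states, so its state must eventually repeat, and from that point the output cycles. A sketch with a deliberately tiny linear congruential generator (parameters chosen small so the cycle is visible, not for quality):

```python
def lcg(seed, a=5, c=1, m=16):
    """A toy linear congruential generator: x <- (a*x + c) mod m.
    With only m = 16 possible states, the output must cycle with
    period at most 16."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=0)
out = [next(gen) for _ in range(20)]
print(out)   # the first 16 values then repeat from the start
```

These particular parameters happen to achieve the full period of 16; real generators like the Mersenne Twister work the same way in principle, just with an astronomically larger state space.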
07-25-2011 , 08:23 AM
FWIW there are uncountably many sequences, but there are only countably many periodic sequences. It's very much like asking "if you pick a random number between 0 and 1 will it always be rational?"
Quote:
Originally Posted by lastcardcharlie
A question about the word "random".

Suppose a computer "flips a coin at random" repeatedly: will the output always be periodic?
07-25-2011 , 09:41 AM
Quote:
Originally Posted by thylacine
FWIW there are uncountably many sequences, but there are only countably many periodic sequences.
Standard, but I am questioning the standard. I am questioning whether, if you get a hardware random number generator and leave it running, the output should be considered as a sequence at all.

Quote:
It's very much like asking "if you pick a random number between 0 and 1 will it always be rational?"
Yes, it is, so let's pick one. Let x_1, x_2, ... be the output of the above, where each x_i is a zero or one. Let any finite string x_1, x_2, ..., x_n of zeros and ones be interpreted as the closed interval of numbers:

[(x_1)/2 + (x_2)/4 + ... + (x_n)/2^n, (x_1)/2 + (x_2)/4 + ... + (x_n)/2^n + 1/2^n]

So the output is interpreted as a nested "sequence" of arbitrarily small closed intervals of numbers between 0 and 1. The standard view is to prove that the intersection of all these intervals is a point, and consider this point as the randomly picked number. Now I agree there is plenty that can be said about this number, but in reality it never gets picked and we can never say exactly what it is.

Contrast that with the case that x_1, x_2, ... is the output of a pseudorandom number generator. If we worked out the code we would be able to say exactly what number it picks.
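The nested-interval construction above is easy to make concrete; a sketch using exact rational arithmetic:

```python
from fractions import Fraction

def interval(bits):
    """Map a finite 0/1 string x_1..x_n to the closed interval
    [sum x_i/2^i, sum x_i/2^i + 1/2^n] described above."""
    lo = sum(Fraction(b, 2 ** (i + 1)) for i, b in enumerate(bits))
    return lo, lo + Fraction(1, 2 ** len(bits))

print(interval([1]))         # (1/2, 1)
print(interval([1, 0]))      # (1/2, 3/4), nested inside the previous one
print(interval([1, 0, 1]))   # (5/8, 3/4), nested again
```

Each new bit halves the current interval, keeping the left half for a 0 and the right half for a 1, which is exactly the nesting the argument relies on.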
07-26-2011 , 03:37 AM
Pretty awesome thread. I never thought about the differences between probability and statistics; is the fundamental difference that the former deals with complete information and the latter with incomplete?

Quote:
Originally Posted by jason1990
Not really. Although I do think it gave me a slight advantage in things like bankroll management and statistical analysis of my winrate (which are of course related). For example, what is the subtle problem with the maximum likelihood approximation formula for the standard deviation, as presented by Mason in his well-known essay, and how might this affect the results? How does Poker Tracker compute your standard deviation, why might its result be inaccurate, and how can you fix it? How do you do proper bankroll management when you regularly play at different levels and even in different games? How can you incorporate qualitative information about your skill level into a statistical estimate of your winrate? What size downswings can I expect, and after how many hands can I expect them to appear? These questions seem to elude a lot of people. In fact, I have seen very smart people who confidently give the wrong answers to them. I have written a lot about these topics here on the forums, and also in my private notes. When I find the time, I hope to compile it all and make a long article/short book about it.
This is basically everything I'm stuck on in poker atm, would be super awesome if you ever compile all that!!!!



Let's say somebody was going to give me $1.10 if a coin flip landed on heads, but I'd lose $1 if it landed on tails, and I had $5; how would I figure out the risk of ruin? It would be super awesome if there's a way to work this stuff out quickly in real life. (If people somehow figure out I play poker and they're all anti-gambling, I sometimes give them the above situation; they sometimes ask what happens if I lose my $5, and it would be awesome if I could explain this to them.)
07-26-2011 , 05:36 AM
Oh and another question that's been on my mind for a bit now:

Let's say we are omniscient about a number of events that will occur very infrequently, say <1%; what would be the threshold for an event to be given no weight and labeled as nonexistent? Or is there no standard threshold, and is it determined by the "value" of the event? For example, an event that happens with probability 10^-999999 is pretty out of scope.
07-26-2011 , 10:31 PM
Quote:
Originally Posted by Max Raker
Can you explain a little about the work of Srinivasa Varadhan? I think of all the Abel Prize winners, I know the least about what he achieved
Varadhan received the Abel Prize for his work in large deviations. I have heard it said that large deviations is the next level in the hierarchy of approximation tools consisting of: law of large numbers, central limit theorem, and large deviations.

Let {X_n} be a sequence of mean zero, iid random variables, and let M_n = (X_1 + ... + X_n)/n be their running average. Given a > 0, we wish to understand the asymptotic behavior of P(M_n > a). A naive application of the central limit theorem would lead us to replace this with P(N > n^(1/2)·a), where N has a normal distribution. In the case where each X_n has unit variance, so that N is a standard normal, we have

P(N > n^(1/2)·a) ~ e^(-na^2/2) / (n^(1/2)·a·(2π)^(1/2)).

Or, in terms of a formal limit theorem,

(1/n) log P(N > n^(1/2)·a) → -a^2/2.

We might speculate, then, that

(1/n) log P(M_n > a) → -a^2/2.

But we would be wrong. As it turns out, the central limit theorem is not sharp enough to handle this situation, which involves such extremely rare events.

One of the characterizing features of the central limit theorem is that it is an invariance principle. The limiting distribution (i.e. the normal) does not depend on the distribution of the X_n's. The same is not true for large deviations. In our setting, we have

(1/n) log P(M_n > a) → -γ(a),

where the function γ depends on the distribution of the X_n's.

The basics of large deviations for iid sequences are discussed in Section 1.9 of Probability: Theory and Examples. The theory of large deviations, however, extends well beyond iid sequences. There are applications, for example, to the modeling and study of metastable behavior, as discussed in Random perturbations of dynamical systems and Large deviations and metastability. Another good reference is Large Deviations for Stochastic Processes.
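To see the distribution dependence concretely, the rate γ can be computed as the Legendre transform γ(a) = sup_θ [θ·a - log E e^(θ·X_1)] of the log moment generating function (this is Cramér's theorem, covered in the reference above). A sketch for iid fair ±1 coin flips, where log E e^(θ·X_1) = log cosh θ, compared against the a^2/2 rate the naive normal approximation would suggest:

```python
import math

def gamma_pm1(a, theta_max=10.0, grid=20000):
    """Numerically evaluate the Legendre transform
    sup_theta (theta*a - log cosh theta) on a grid of theta values,
    for iid fair +-1 coin flips."""
    return max(theta_max * k / grid * a
               - math.log(math.cosh(theta_max * k / grid))
               for k in range(grid + 1))

a = 0.5
print(gamma_pm1(a))   # ~0.1308, the true exponential decay rate
print(a * a / 2)      # 0.125, the rate the normal heuristic would give
```

The two rates disagree, so for ±1 flips the probability P(M_n > 0.5) decays strictly faster than the CLT heuristic predicts; a different distribution for X_1 would give a different γ.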
07-26-2011 , 11:57 PM
http://www.abelprisen.no/nedlastning...aradhan_en.pdf

He won the award at the age of 67. Do you know how old he was when he actually did most of the work? It looks like he has important papers in the 1970s when he would have been in his 30s.
07-29-2011 , 03:13 AM
What are the odds that I am in the Matrix or something equivalent?
07-29-2011 , 05:10 AM
Quote:
Originally Posted by jason1990
Varadhan received the Abel Prize for his work in large deviations.
Obviously his work on large deviations is very significant, but surely his work with Stroock on diffusions and the martingale problem merits some mention?
07-29-2011 , 07:46 PM
Quote:
Originally Posted by lastcardcharlie
Yes, except I'm really trying to get at what an "infinite sequence" is. I don't agree with the standard definition as a function on the naturals because I don't believe there are any non-computable such functions. The existence of any random sequence would appear to falsify this belief.

The part on hardware random number generators in the article is interesting. I think the two methods of generating an infinite sequence:

(1) use a non-terminating pseudorandom number generator;
(2) keep on physically flipping a coin and never stop;

are fundamentally different, and that the output of the second does not qualify as an infinite sequence. I cannot articulate why very clearly, but I think it has something to do with the fact that all the terms of the first sequence are encapsulated by a rule, whereas with the second there is no way of determining what the (n+1)th term is before the nth term has been determined.

No specific question but I would be interested in any comments you might have.
Well, as a probabilist, I would not typically speak of any particular sequence of numbers (such as the digits of pi) as being random. For me, no particular sequence of 0's and 1's can represent a sequence of coin flips. A sequence of coin flips is represented by a probability measure on the space of sequences of 0's and 1's. Or, if you prefer, it is represented by the sequence of probability measures {P_n}, where P_n is the uniform measure on {0,1}^n.
07-29-2011 , 07:56 PM
Quote:
Originally Posted by jason1990
Varadhan received the Abel Prize for his work in large deviations....
Thanks. Will definitely try to learn more about these concepts when i have the time/if i need them.
07-29-2011 , 08:34 PM
Quote:
Originally Posted by pat3392
Pretty awesome thread. I never thought about the differences between probability and statistics; would the fundamental difference between probability and statistics be the former is dealing with complete information and the latter with incomplete?
No, I would not say that. The laws of probability can be interpreted as a quantification of inductive reasoning, so in this sense, the idea of incomplete information is built into the very foundations of the subject.

Quote:
Originally Posted by pat3392
Let's say somebody was going to give me $1.10 if a coin flip landed on heads, but I'd lose $1 if it landed on tails, and I had $5; how would I figure out the risk of ruin?
This question is simpler if we put the asymmetry in the probabilities, rather than the payouts. For example, suppose you win $1 with probability 55%, and lose $1 with probability 45%. Again, you start with $5. In this case, your risk of ruin is (.45/.55)^5 ≈ 0.367, which can be derived from this formula by taking the limit as n_2 goes to infinity.

The situation you described is more complicated. If you want more details about situations like this, I highly recommend the probability forum. There are a lot of smart folks there.
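The (.45/.55)^5 figure can be recovered as a limit of the classical gambler's ruin formula with two absorbing barriers; a sketch:

```python
def ruin_prob(bankroll, target, p):
    """P(bankroll hits 0 before reaching target), with +-$1 steps and
    win probability p per step (classical gambler's ruin)."""
    r = (1 - p) / p
    if r == 1:
        return 1 - bankroll / target
    return (r ** bankroll - r ** target) / (1 - r ** target)

# As the target grows, the ruin probability rises toward the
# infinite-horizon limit (q/p)^bankroll.
for target in (10, 100, 1000):
    print(target, ruin_prob(5, target, 0.55))
print((0.45 / 0.55) ** 5)   # ~0.3667
```

With a finite target you sometimes quit while ahead, so each finite-target ruin probability is a bit below the limit, and letting the target go to infinity recovers (q/p)^5.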

Quote:
Originally Posted by pat3392
Let's say we are omniscient about a number of events that will occur very infrequently, say <1%; what would be the threshold for an event to be given no weight and labeled as nonexistent? Or is there no standard threshold, and is it determined by the "value" of the event? For example, an event that happens with probability 10^-999999 is pretty out of scope.
It seems you are asking this: If an event E has probability p, how small does p have to be for us to act as though E is impossible? There is no mathematical answer to this question, of course. Probably the best we can say is that the answer varies from person to person and also depends on what the event E actually is. But I can tell you this: Right now, I am acting as though it is impossible for the air in my living room to randomly wander upstairs, leaving me here to suffocate.
07-29-2011 , 08:59 PM
Quote:
Originally Posted by BruceZ
http://www.abelprisen.no/nedlastning...aradhan_en.pdf

He won the award at the age of 67. Do you know how old he was when he actually did most of the work? It looks like he has important papers in the 1970s when he would have been in his 30s.
I am not familiar enough with large deviations to give you a very informed answer. But just a quick look at MathSciNet shows that he and Donsker have joint publications on large deviations with dates ranging from 1975 to 1989. Generally speaking, I think Varadhan has been active and prolific for his entire career, and continues to be even today. In fact, I just heard him speak four months ago.
07-29-2011 , 09:28 PM
Quote:
Originally Posted by Rhaegar
What are the odds that I am in the Matrix or something equivalent?
For this, you must ask a philosopher. Try this guy.
07-29-2011 , 09:49 PM
Quote:
Originally Posted by blah_blah
Obviously his work on large deviations is very significant, but surely his work with Stroock on diffusions and the martingale problem merits some mention?
Absolutely. I focused on large deviations because Max mentioned the Abel Prize, but you are right. I think I first heard the name Varadhan when studying Chapter 5 of Karatzas and Shreve many years ago. I do not think I have the time or energy to post a digestible discussion of the martingale problem (if that is even possible), so I will take the lazy route and just post a link to the classic text: Stroock & Varadhan 1979.