Ask a probabilist

12-07-2011 , 09:36 AM
Quote:
Originally Posted by BrianTheMick
Emphasis on teaching?
According to the university, my official time allocation is 50% teaching, 47% research, and 3% service.
12-12-2011 , 09:28 AM
Quote:
Originally Posted by punter11235
I have a question:
If you were to choose what kind of work you do in your career as probabilist what would it be? ... How does it relate to what you actually do?
As a researcher in academia, I do in fact get to choose what I work on.

Quote:
Originally Posted by jason1990
I usually work on limit theorems for stochastic processes. This typically involves finding analogues of the law of large numbers and the central limit theorem, which apply to continuous-time processes. A simple example of this type of theorem is Donsker's theorem.
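A tiny illustration of the kind of statement Donsker's theorem makes (my own sketch, not from the post): rescale a simple +/-1 random walk by sqrt(n). The theorem says the whole rescaled path converges to Brownian motion; the check below looks only at the time-1 marginal, which should be approximately N(0,1).

import numpy as np

rng = np.random.default_rng(0)
n, reps = 10_000, 100_000
# Endpoint of an n-step +/-1 random walk: S_n = 2*Binomial(n, 1/2) - n.
endpoints = (2.0 * rng.binomial(n, 0.5, size=reps) - n) / np.sqrt(n)  # Donsker scaling at t = 1
print(endpoints.mean(), endpoints.var())  # close to 0 and 1, the law of standard Brownian motion at time 1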
02-03-2012 , 01:58 AM
probably worth a bump
02-03-2012 , 06:53 PM
Since you work on LLNs and CLTs...I have a couple of questions.

Could you explain (intuitively if possible) the difference between MSE (mean squared error) convergence and convergence in probability? I guess all I need is a simple counterexample that shows the latter doesn't imply the former.


Another issue I'm having trouble with is how there can be something like an asymptotic variance if a r.v. converges in MSE to some value.

Finally, when it comes to estimation, wouldn't it be more worthwhile to identify consistent estimators that converge as rapidly as possible instead of those that are asymptotically efficient? Is there a connection between the two, e.g., do estimators that achieve the Cramér-Rao lower bound also converge more rapidly?

Last edited by Vael; 02-03-2012 at 07:01 PM.
02-05-2012 , 11:06 PM
Suppose you make a monkey type on a typewriter for 10^18 seconds, and the monkey can type 10 keys per second. There are 44 keys on the typewriter. Suppose Shakespeare's Hamlet is a sequence of 10^5 characters. What is the probability that the monkey will type at least one Hamlet?
02-13-2012 , 02:22 PM
You're offered the chance to buy a Mystery Lottery Ticket. Mystery Lottery means that some things aren't clear to you. You don't know how many other people will enter the lottery, you don't know how many tickets are sold, and you don't know how many numbers you're drawing at - i.e., the ticket you receive is encrypted, so you don't know what number or numbers it has and you don't know how many numbers are in the pool to be drawn from. You might be drawing at one number in three or at one in a hundred billion.

If the price is $1 and the pool is $1000, should you enter? At what size prize pool would a perfectly rational bettor consider buying a $1 ticket to be +EV?

Apologies if this is trivial or just nonsense, it just crossed my mind recently.
02-13-2012 , 03:32 PM
Unless you have some reason to think the person booking it is a moron or actively intends to lose money, you should expect the decision to play to be -EV (like rolling a die to bet in a parimutuel horse pool, which has similarities). It's not really answerable from first principles (AFAIK) because you have to assign (at the bare minimum) a couple of pdfs over the natural numbers to cover the possible numbers and the possible entries, and I don't see any properties of the problem that would allow you to justify one as correct (there's no such thing as a uniform distribution over the natural numbers).
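To see concretely why this hinges on priors you cannot justify, here is a small sketch (my own illustration, not from the post; the priors are arbitrary). Model the ticket as winning the $1000 prize with chance 1/K for an unknown K, put two different priors on K, and the expected value of a $1 ticket swings from clearly positive to clearly negative:

import numpy as np

prize, cost = 1000.0, 1.0
ks = np.arange(1, 10**6 + 1)                  # candidate values of K ("one chance in K")

def ev(weights):
    prior = weights / weights.sum()           # normalize the prior over K
    return prize * (prior / ks).sum() - cost  # EV = prize * E[1/K] - cost

print(ev(0.5 ** ks))                          # prior favoring tiny pools: EV roughly +692
print(ev(np.ones_like(ks, dtype=float)))      # "uniform" over 1..10^6:    EV roughly -0.99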
02-13-2012 , 03:36 PM
I didn't think about it in terms of the motive of the person booking it.

I haven't been able to figure out anything that looks like an answer or even whether it's answerable, but I was hoping that was just because I don't know much about this kind of thing. Cheers.
02-15-2012 , 10:48 AM
Well, probabilists, evaluate the scenario in this thread then.

10000 exact copies of someone are created, and I mean *exact* copies, down to detailed configuration of neurons and mental state. Within a week, 5000 of them commit murder. What is the probability for each of the remaining 5001 copies to also commit murder over their lifetime?

(I assumed it was >95%... Close?)
02-15-2012 , 02:23 PM
Are there any truly random functions (on the interval [0,1])?

That is, given a member x of [0,1], I want f(x) to be a member of R such that I cannot determine f(x) in any way given the initial value on [0,1].

Maybe this will be more likely to work on (0,1) than on [0,1]... not sure...

I don't think you can do this using elementary or special functions, but maybe something in measure theory has some tricks up its sleeve. I'm just curious...
02-20-2012 , 11:48 PM
Quote:
Originally Posted by Vantek
Well, probabilists, evaluate the scenario in this thread then.

10000 exact copies of someone are created, and I mean *exact* copies, down to detailed configuration of neurons and mental state. Within a week, 5000 of them commit murder. What is the probability for each of the remaining 5001 copies to also commit murder over their lifetime?

(I assumed it was >95%... Close?)
There isn't enough information given.

Age until death? Variance on this?

Were 50% put into a kill-or-be-killed situation and the others into solitary confinement?

Etc. Etc. Etc.

Research shows that situation matters. I am a pacifist, but have within the last two weeks been in a bar brawl.
02-21-2012 , 12:11 AM
Quote:
Originally Posted by BrianTheMick

Research shows that situation matters. I am a pacifist, but have within the last two weeks been in a bar brawl.


We need pics. And not just of you! Though it would be understandable if none were available, or if for privacy reasons you refrain from posting the mug shot.

Did you kick them when down? I would have.


-Zeno
02-21-2012 , 07:23 AM
Quote:
Originally Posted by BrianTheMick
I am a pacifist, but have within the last two weeks been in a bar brawl.
TR needed imo

Last edited by Vael; 02-21-2012 at 07:24 AM. Reason: also, pumping thread in hope jason might return
02-21-2012 , 07:30 PM
Quote:
Originally Posted by jewbinson
Are there any truly random functions (on the interval [0,1])?

That is, given a member x of [0,1], I want f(x) to be a member of R such that I cannot determine f(x) in any way given the initial value on [0,1].

Maybe this will be more likely to work on (0,1) than on [0,1]... not sure...

I don't think you can do this using elementary or special functions, but maybe something in measure theory has some tricks up its sleeve. I'm just curious...
Ok, so I had a medium-length chat with a maths buddy of mine... anyway, the conclusion depends on your definition of "function", and in answering the questions posed, you automatically narrow down precisely what you mean when you say "function".

If you are forbidden to know the output for a given input, then it is unclear what the function even is. You have to be told how to get to the output, and this has to be possible (in the usual sense); otherwise the "function" is not well specified and so is not a function at all. I am not 100% convinced of this point, however.

On the other hand, if you can construct a non-repeating iteration of (perhaps even elementary) functions, i.e. f(g(h(...))), then it might be impossible to have the computing power even to estimate the output of the function, even if it does exist. The Weierstrass function doesn't quite do this, because for any given input there is a single output, and this output can be found deterministically. However, I imagine there are functions whose output for some (or any) given input is not attainable even though the output does exist (and is just one number). I am not sure whether these "functions" really would be functions; of course it all depends on your definition of a function, and I am not sure what the "standard" definition of a function has to say about this (if there is one).
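For what it's worth, the standard probabilistic way to make "a truly random function" precise is a stochastic process: a family of random variables f(x), one for each x in [0,1], defined on a common probability space. Here is a minimal sketch of that idea (my own illustration, not from the post): each new input gets an independent N(0,1) value, and repeated queries of the same input return the same value, so any single realization is a bona fide function even though its values cannot be predicted in advance.

import random

class RandomFunction:
    """One realization of a 'white noise'-style process on [0, 1]:
    f(x) ~ N(0, 1), independently for each distinct x."""

    def __init__(self, seed=None):
        self._rng = random.Random(seed)
        self._values = {}                    # values already revealed

    def __call__(self, x):
        if not 0.0 <= x <= 1.0:
            raise ValueError("x must lie in [0, 1]")
        if x not in self._values:            # sample lazily on first query
            self._values[x] = self._rng.gauss(0.0, 1.0)
        return self._values[x]

f = RandomFunction(seed=42)
print(f(0.3), f(0.3), f(0.7))                # same value twice, then an independent one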
02-21-2012 , 11:02 PM
Quote:
Originally Posted by Zeno
We need pics. And not just of you! Though it would be understandable if none were available, or if for privacy reasons you refrain from posting the mug shot.

Did you kick them when down? I would have.
I am much kinder than you. There was no kicking.

Pics are not available. I could put something together based on my recollection using MS Paint.

Quote:
Originally Posted by Vael
TR needed imo
In the interest of silliness:

Characters:

Crazy drunk guy - Angry due to his poor station in life.
Crazy drunk guy's friends - A menacing lot until push comes to shove.
Bartender - 19 year-old catholic school cheerleader type. Possibly* bisexual.
Waitress - 19 year-old adorkable hot dumb girl. Possibly* bisexual.
Bar Owner - Mid-40-ish skinny 5'2" Italian American. A pocket person, if you will.
Bar Owner's Mom - Old and little. Likes to randomly feel up the customers and bar help.
The Hero - Me. Has his eye on the bartender and waitress and has been working for weeks to pick their locks.
Extras - Various dregs of society. Young wanna-be tough guys. Random career drunks. Nary a spine to be found amongst them.

Setting - Bar in the middle of an industrial park.

Story - Crazy drunk guy called the waitress a bitch. The rest followed inevitably. Very linear story that can be directly derived from the characters.

Things I learned:

"Stop choking me" sounds funny when it is uttered repeatedly. Varying the quality and strength of the choking during the utterance is quite amusing.

Free beer follows decisive action.

Some people actually do poop themselves to stop a beating.

If you look like me, the police take away the other guy with no questions asked.

People's recollections of events have very little to do with the actual events. The next week's trip to the bar was met with a variety of stories about what I did.

*Hopefully
02-21-2012 , 11:13 PM
Quote:
Originally Posted by Vael
Could you explain (intuitively if possible) the difference between MSE (mean squared error) convergence and convergence in probability? I guess all I need is a simple counterexample that shows the latter doesn't imply the former.
As a probabilist, I do not know the jargon of the statisticians, but I guess you want an example of a sequence of random variables that converges in probability, but not in L^2. The standard example is this. Let {U_n} be iid uniform(0,1), and let
X_n = n * 1{0 < U_n < 1/n}.
Then P(|X_n| > ε) = 1/n → 0, so X_n → 0 in probability. However, E|X_n|^2 = n^2 * P(U_n < 1/n) = n, so it does not converge in L^2.
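A quick numerical check of this example (my own sketch, not part of the original answer): the empirical P(|X_n| > 1/2) shrinks like 1/n, while the empirical second moment stays near n.

import numpy as np

rng = np.random.default_rng(0)
reps = 200_000

for n in (10, 100, 1000):
    u = rng.uniform(0.0, 1.0, size=reps)     # U_n ~ uniform(0, 1)
    x = n * (u < 1.0 / n)                    # X_n = n * 1{0 < U_n < 1/n}
    print(n, (np.abs(x) > 0.5).mean(), (x ** 2).mean())
    # second column ~ 1/n  -> convergence in probability to 0
    # third column  ~ n    -> no convergence in L^2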

Quote:
Originally Posted by Vael
Another issue I'm having trouble with is how there can be something like an asymptotic variance if a r.v. converges in MSE to some value.
I guess "asymptotic variance" is a statistician's term. After browsing through Wikipedia, here is my best guess as to the meaning of "asymptotic variance". Let V be a real number. A sequence of random variables {Xn} has asymptotic variance V if there exists a real number m such that
n1/2(Xn - m) → Z
in distribution, where Z is a random variable with mean 0 and variance V. If this is right, then I am not sure what the trouble is. If {Yn} are iid with mean m and variance V, and if
Xn = (Y1 + ... + Yn)/n,
then Xn → m in L2 by the law of large numbers, and {Xn} has asymptotic variance V by the central limit theorem.
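A small simulation of both statements at once (my own sketch, not part of the original answer), using exponential(1) variables so that m = 1 and V = 1:

import numpy as np

rng = np.random.default_rng(0)
reps = 100_000
m, V = 1.0, 1.0                              # mean and variance of exponential(1)

for n in (10, 100, 1000):
    # The sum of n iid exponential(1) variables is Gamma(n, 1), so sample the mean directly.
    xbar = rng.gamma(n, 1.0, size=reps) / n  # X_n = (Y_1 + ... + Y_n)/n
    mse = ((xbar - m) ** 2).mean()           # ~ V/n -> 0, i.e. L^2 convergence
    asy_var = (np.sqrt(n) * (xbar - m)).var()  # ~ V, the asymptotic variance
    print(n, mse, asy_var)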

Quote:
Originally Posted by Vael
Finally, when it comes to estimation, wouldn't it be more worthwhile to identify consistent estimators that converge as rapidly as possible instead of those that are asymptotically efficient? Is there a connection between the two, e.g., do estimators that achieve the Cramér-Rao lower bound also converge more rapidly?
Sorry, but there is too much statistician's jargon here for me to understand your question.
05-16-2012 , 10:54 PM
Quote:
Originally Posted by Yodaleheehoo
Suppose you make a monkey type on a typewriter for 10^18 seconds, and the monkey can type 10 keys per second. There are 44 keys on the typewriter. Suppose Shakespeare's Hamlet is a sequence of 10^5 characters. What is the probability that the monkey will type at least one Hamlet?
Let
A_n = "The monkey types Hamlet starting with the n-th keystroke."
If n < m < n + 10^5, then (A_n & A_m) implies that the first n + 10^5 - m characters of Hamlet are repeated exactly at the end of Hamlet. I suspect that this is not possible for any number of characters. Therefore, let us assume that (A_n & A_m) is impossible whenever n < m < n + 10^5.

The monkey makes 10^19 keystrokes in total. If M = 10^19 - 10^5 + 1, then
A := (A_1 or A_2 or ... or A_M) = "The monkey types Hamlet at least once."
By inclusion-exclusion (the Bonferroni inequalities),
sum_n P(A_n) - sum_{n < m} P(A_n & A_m) ≤ P(A) ≤ sum_n P(A_n).
Let N = M - 10^5. Since P(A_n & A_m) = 0 whenever n < m < n + 10^5, and A_n, A_m are independent whenever m ≥ n + 10^5, we have
sum_{n < m} P(A_n & A_m) ≤ M*N*P(A_1)^2.
Let p = P(A_1) = (1/44)^(10^5), so that
5.399 x 10^-164346 < p < 5.4 x 10^-164346.
Then, since each P(A_n) = p,
Mp - M*N*p^2 ≤ P(A) ≤ Mp,
and
5.398 x 10^-164327 < Mp < 5.4 x 10^-164327.
Also,
M*N*p^2 < 10^38 * p^2 < 10^-328652,
which is negligible next to Mp. This gives
5.397 x 10^-164327 < P(A) < 5.4 x 10^-164327.
In other words, P(A) ≈ Mp ≈ 5.4 x 10^-164327.
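The order of magnitude is easy to check with base-10 logarithms (my own sketch, not part of the post), since p itself is far too small for floating point:

import math

keys, hamlet_len = 44, 10**5
keystrokes = 10**19                          # 10^18 seconds at 10 keys per second
M = keystrokes - hamlet_len + 1              # possible starting positions

log10_Mp = math.log10(M) - hamlet_len * math.log10(keys)
e = math.floor(log10_Mp)
print(f"Mp is about {10 ** (log10_Mp - e):.3f} x 10^{e}")   # about 5.4 x 10^-164327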
05-16-2012 , 11:36 PM
Quote:
Originally Posted by TomCowley
It's not really answerable from first principles
I agree. More hypotheses are needed to come to any conclusions.

Quote:
Originally Posted by All-In Flynn
If the price is $1 and the pool is $1000, should you enter?
I assume you mean, "If the price is $1 and the pool is $1000, then is the expected value positive?" Given enough hypotheses, probability can tell you about the expected value. But probability cannot tell you whether you should enter.

Quote:
Originally Posted by All-In Flynn
At what size prize pool would a perfectly rational bettor consider buying a $1 ticket to be +EV?
A perfectly rational bettor would conclude that there is not enough information to determine an expected value.

Although I have not given much of an answer here, I think the question itself highlights an interesting phenomenon.

Probability is just a generalization of classical logic that allows us to conduct inductive reasoning in a quantitative manner. In logic, people are not usually surprised by the answer, "Not enough information." But many people expect probabilists and/or statisticians to be able to offer up an answer, no matter what information is put forward. The misconception is that there will always be some probability that answers the question, and that adding or eliminating information simply changes that probability.
05-16-2012 , 11:38 PM
Quote:
Originally Posted by jason1990
Let
An = "The monkey types Hamlet starting with the n-th keystroke."
.........................



In other words, P(A) ≈ Mp ≈ 5.4 x 10-164327.


Exponential Expression Calculator:

http://asknumbers.com/exponents.aspx


That's what I thought.


Thank you for the effort. We now all have a definitive number we can use when we get asked/told about this famous question at parties.
05-20-2012 , 08:18 PM
So, we have the binomial distribution:

Prob(K=k) = p^k * (1-p)^(n-k) where k = 0,1,2,3,...n

Let's change this to:

Prob(K=k/sqrt(n)) = p^k * (1-p)^(n-k) where k = 0,1,2,3,...n

Could we now say that as n approaches infinity, this distribution approaches a continuous distribution?

I asked my lecturer and he said no, but I didn't really understand his explanation.
05-20-2012 , 11:05 PM
What do you mean by "approaches a continuous distribution"? For a simpler example, P(x)=1/a on (0,a) doesn't approach any probability distribution as a->inf (and it's already continuous for any positive a).
05-20-2012 , 11:49 PM
For large n and fixed p in (0,1), the binomial distribution is approximately Gaussian with mean m = n*p and sd = (n*p*(1-p))^(1/2).

Additionally, if n -> inf and p -> 0 with n*p -> k > 0, it converges to the Poisson distribution with mean k.

Is this what you wanted to say?
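Both limits are easy to check numerically; here is a sketch (my own, not from the posts), assuming scipy is available: fix p and compare a binomial tail with the Gaussian one, then shrink p with n*p held fixed and compare a binomial pmf with the Poisson one.

from scipy.stats import binom, norm, poisson

# Fixed p: Binomial(n, p) is close to Normal(n*p, n*p*(1-p)) for large n.
n, p = 10_000, 0.3
mu, sd = n * p, (n * p * (1 - p)) ** 0.5
print(binom.cdf(mu + 2 * sd, n, p), norm.cdf(2.0))        # both about 0.977

# n -> inf, p -> 0 with n*p = 3 fixed: Binomial(n, 3/n) is close to Poisson(3).
n, lam = 10_000, 3.0
print(binom.pmf(2, n, lam / n), poisson.pmf(2, lam))      # both about 0.224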
05-20-2012 , 11:50 PM
yes I think what masque de Z said

edit:

so this is part of the lecture notes:



This part of the notes led me to wonder about the Prob(K = k/sqrt(n)) type distribution.

And also I forgot an (n Choose k) up there obv

Last edited by Clue; 05-21-2012 at 12:01 AM.
06-12-2012 , 10:34 PM
[X-posted from SMP HW Help Thread, realized this is a better place for it]

Hey Jason & co,

Roommate needs some stats help for work. So here's the question:

He has 300 documents and wants to sample X of them to get a 95% confidence interval with a 3% margin of error. How do we figure out what X is without knowing the probability of an error in a given doc?

Let me know if you have any questions; trying to figure this out.

Thanks Jason,
Mariogs
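One common back-of-the-envelope approach (a sketch under assumptions the post does not state: worst-case p = 0.5 for the unknown per-document error rate, normal approximation, simple random sampling, and a finite-population correction for N = 300):

import math

N = 300            # documents in the population
z = 1.96           # 95% confidence
e = 0.03           # 3% margin of error
p = 0.5            # worst-case guess for the unknown error rate

n0 = z**2 * p * (1 - p) / e**2               # sample size for an infinite population
n = n0 / (1 + (n0 - 1) / N)                  # finite-population correction
print(math.ceil(n0), math.ceil(n))           # roughly 1068 and 235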
06-13-2012 , 11:30 AM
How would you describe the difference between uncertainty and risk in layman's terms? Are there any nice analogies that you came up with that you use when teaching?

Thoughts on Gödel/Turing/Church, and especially Chaitin, from a probabilist's POV?