Two Plus Two Poker Forums Ask a probabilist
 Science, Math, and Philosophy Discussions regarding science, math, and/or philosophy.

 08-16-2008, 11:14 AM #121 journeyman   Join Date: Apr 2005 Location: Palo Alto, CA Posts: 391 Re: Ask a probabilist I understand that if Omega = the sample space is countably infinite, then it is possible to have a probability measure P such that P(omega) > 0, where omega = an outcome, for all omega. A simple example is the Poisson distribution. If Omega is countable, then we typically take the sigma field to be the powerset of Omega, correct? If Omega is uncountable, can you provide some intuition or a simple proof as to why P(omega) cannot be > 0 for uncountably many omega? Thanks.
 08-17-2008, 06:40 AM #122 veteran     Join Date: Apr 2003 Location: @pnewall Posts: 2,188 Re: Ask a probabilist Thanks for the help on the maths problem Jason, here's a stats one I'm also struggling with: consider the estimator Tn = theta(n-2), where theta = 1/sigma^2 from a N(0, sigma^2) distribution. I need to find the sampling variance of Tn... does that make any sense? Is it found using E(X^2) - [E(X)]^2?
08-17-2008, 10:19 AM   #123
old hand

Join Date: Sep 2004
Posts: 1,870

Quote:
 Originally Posted by ncray If Omega is uncountable, can you provide some intuition or a simple proof as to why P(omega) cannot be > 0 for uncountably many omega?
Let S = {ω: P(ω) > 0}. We want to prove that S is countable. For this, we begin by observing that

S = S_1 ∪ S_2 ∪ S_3 ∪ ...

where S_n = {ω: P(ω) ≥ 1/n}. Next, we observe that S_n has at most n elements, since the probabilities cannot add up to more than 1. Hence, S is a countable union of finite sets, which means S itself is countable.

08-17-2008, 10:32 AM   #124
old hand

Join Date: Sep 2004
Posts: 1,870

Quote:
 Originally Posted by philnewall ...does that make any sense?
Not completely. I could try to guess what is being asked, but I think that would be pretty inefficient. As with your previous question, I think you would be best helped by in-person assistance.

08-17-2008, 11:31 AM   #125
journeyman

Join Date: Apr 2005
Location: Palo Alto, CA
Posts: 391

Quote:
 Originally Posted by jason1990 Let S = {ω: P(ω) > 0}. We want to prove that S is countable. For this, we begin by observing that S = S_1 ∪ S_2 ∪ S_3 ∪ ... where S_n = {ω: P(ω) ≥ 1/n}. Next, we observe that S_n has at most n elements, since the probabilities cannot add up to more than 1. Hence, S is a countable union of finite sets, which means S itself is countable.
Awesome. Thanks a lot.

 08-17-2008, 09:43 PM #126 newbie     Join Date: Jul 2008 Posts: 47 Re: Ask a probabilist This is a far more pedestrian question than the rest. As a person with no expertise or higher education in math, I still find myself very interested in it - and even more so in logic and some physics. This led me to get into the show Numb3rs. From someone who actually understands all the jargon they throw around on the show, how much of it has some fidelity to actual math concepts and solutions? How much is just rhetoric and mumbo-jumbo? And do a lot of math guys like this show, or is it like when we see shows try to do poker-based episodes and it's so dumbed-down that it's laughable? (Interestingly, Numb3rs has done both poker and blackjack based episodes.)
08-18-2008, 12:16 PM   #127
old hand

Join Date: Sep 2004
Posts: 1,870

Quote:
 Originally Posted by shronkj This led me to get into the show Numb3rs.
My initial impression when it came out was that Numb3rs is to mathematics what Indiana Jones movies are to archaeology.

Charlie Eppes has an unrealistically broad knowledge of mathematics, but I don't think the show gets too outrageous about that. To me, it feels like the show has less math now than earlier. It seems to have started focusing more on the crime drama aspect. My wife (also a mathematician) and I like the show, but we also like a lot of other crime shows. I have a colleague who loves the show, but he says it is because of the characters and the relationships between the Eppes brothers and their father. I have another colleague who hates it. I don't know any mathematicians who love it for the math.

Some of the stuff they say makes me laugh. But I tend to try and defend the show against my wife who very frequently criticizes the math parts. The stuff they say is usually vague enough to allow me to put up a reasonable defense. And they do use professional consultants to try to keep it realistic.

One of the biggest laughs I got was once when Charlie had some formula on a chalkboard. A rival of his entered, saw it, immediately recognized it, and went on to give a detailed criticism. What was funny is that the formula was some totally mundane thing that was completely meaningless out of context, like \sum a_i b_i, or something like that. Also, I don't know anyone who does so much actual work on multiple, large chalkboards. I think most of us use paper, but that would not be as compelling to watch on TV.

08-18-2008, 11:04 PM   #128
newbie

Join Date: Jul 2008
Posts: 47

Quote:
 Originally Posted by jason1990 My initial impression when it came out was that Numb3rs is to mathematics what Indiana Jones movies are to archaeology. Charlie Eppes has an unrealistically broad knowledge of mathematics, but I don't think the show gets too outrageous about that. To me, it feels like the show has less math now than earlier. It seems to have started focusing more on the crime drama aspect. My wife (also a mathematician) and I like the show, but we also like a lot of other crime shows. I have a colleague that loves the show, but he says it is because of the characters and the relations between the Eppes brothers and their father. I have another colleague that hates it. I don't know any mathematicians who love it for the math. Some of the stuff they say makes me laugh. But I tend to try and defend the show against my wife who very frequently criticizes the math parts. The stuff they say is usually vague enough to allow me to put up a reasonable defense. And they do use professional consultants to try to keep it realistic. One of the biggest laughs I got was once when Charlie had some formula on a chalkboard. A rival of his entered, saw it, immediately recognized it, and went on to give a detailed criticism. What was funny is that the formula was some totally mundane thing that was completely meaningless out of context, like \sum a_i b_i, or something like that. Also, I don't know anyone who does so much actual work on multiple, large chalkboards. I think most of us use paper, but that would not be as compelling to watch on TV.
All very interesting to a casual fan.

From my limited math knowledge, it seems like all Charlie's "solutions" lately (past two seasons, really) have just been different ways of saying "I'll write an equation that will sift through the data."

I think it's becoming apparent, even to regular viewers like me, that they're running out of ways to have the math solve the crime. Like your colleague, I also really like the father/sons relationships and I still really like their more abstract discussions that focus on logic and reasoning, but the actual a+b=thebutlerdidit parts seem a little strained lately.

Any other specifics that were completely laughable?

Last edited by shronkj; 08-18-2008 at 11:10 PM.

08-19-2008, 12:15 PM   #129
old hand

Join Date: Sep 2004
Posts: 1,870

Quote:
 Originally Posted by shronkj From my limited math knowledge, it seems like all Charlie's "solutions" lately (past two seasons, really) have just been different ways of saying "I'll write an equation that will sift through the data."
I agree, and this is a great way of putting it.

Quote:
 Originally Posted by shronkj Any other specifics that were completely laughable?
Nothing jumps out at me off the top of my head. I do remember being a little bothered by the way he talked about Brownian motion in the episode "Rampage". But I would have to watch the episode again to give you any details.

 09-06-2008, 03:12 PM #130 Carpal \'Tunnel     Join Date: Jul 2006 Location: Illinois Posts: 6,094 Re: Ask a probabilist Let A and B be two events. Prove the following relations by the elementwise method. (a) (A-[A intersect B]) union B = A union B (b) (A union B) - [A intersect B] = [A intersect not B] union [not A intersect B] I'm trying for a stats/math minor and am having difficulty figuring out the elementwise method. Can you help lead me towards how to solve this problem, and how it should be done?
09-06-2008, 03:38 PM   #131
Carpal \'Tunnel

Join Date: Dec 2003
Posts: 6,173

Quote:
 Originally Posted by imjoshsizemore Let A and B be two events. Prove the following relations by the elementwise method. (a) (A-[A intersect B]) union B = A union B (b) (A union B) - [A intersect B] = [A intersect not B] union [not A intersect B] I'm trying for a stats/math minor and am having difficulty figuring out the elementwise method. Can you help lead me towards how to solve this problem, and how it should be done?
(a) Let x be in LHS. x is in (A-[A intersect B]) OR x is in B.
Must show x is in RHS. If x is in B, done. If x is not in B, x must be in
(A-[A intersect B]) since x is in LHS. That means x is in A (just not that part of A which intersects with B). Therefore x is either in B and done, OR in A and also done. x in LHS ==> x in RHS.

Must also show x in RHS ==> x in LHS. Let x be in RHS. Then x is in A OR x is in B. If x in B then x in LHS and done. If x not in B then x is in A (and x is not in B), since x is in RHS. x not in B implies x not in [A intersect B]. So x is in A and x is not in [A intersect B]. Thus, x is in (A - [A intersect B]) and therefore x is in LHS. So done and done.

I'll leave you to unravel things in (b).

PairTheBoard

 09-06-2008, 07:17 PM #132 Carpal \'Tunnel     Join Date: Mar 2005 Posts: 6,364 Re: Ask a probabilist Annie Duke played a blackjack dealer on an episode of Numb3rs.
 09-06-2008, 08:36 PM #133 Carpal \'Tunnel     Join Date: Jul 2006 Location: Illinois Posts: 6,094 Re: Ask a probabilist An attempt at b? :\ x is either an element of A-AB or B-AB. If x is an element of A-AB, then x is an element of A(Bnot), so x is an element of the union of ABnot and AnotB. If x is an element of B-AB, then x is an element of AnotB, so x is an element of the union of ABnot and AnotB.
09-07-2008, 09:59 AM   #134
old hand

Join Date: Sep 2004
Posts: 1,870

Quote:
 Originally Posted by imjoshsizemore Let A and B be two events. Prove the following relations by the elementwise method. (a) (A-[A intersect B]) union B = A union B (b) (A union B) - [A intersect B] = [A intersect not B] union [not A intersect B] I'm trying for a stats/math minor and am having difficulty figuring out elementwise method. Can you help lead me to wards how to solve this problem, and how it should be done?
I will also illustrate (a), in my own words. First, let me try to use some more readable notation:

(a) (A \ (A ∩ B)) ∪ B = A ∪ B.

To prove this, we want to separately prove two different statements.

(a1) (A \ (A ∩ B)) ∪ B ⊂ A ∪ B,
(a2) A ∪ B ⊂ (A \ (A ∩ B)) ∪ B.

Step 1: proof of (a1). The so-called "elementwise method" means we first assume the premise

(P) x ∈ (A \ (A ∩ B)) ∪ B,

and then use (P) to prove the conclusion

(C) x ∈ A ∪ B.

Okay, so let us assume (P). By the definition of the symbol "∪", this means that x ∈ A \ (A ∩ B) or x ∈ B. So let us treat these two cases separately.

Case 1. x ∈ A \ (A ∩ B)
In this case, we assume that x ∈ A \ (A ∩ B). By the definition of the symbol "\", this means that x ∈ A and x ∉ (A ∩ B). So we know that x ∈ A. But this implies that x ∈ A ∪ B, which is (C). So we have proved (C).

Case 2. x ∈ B
In this case, we assume x ∈ B. But this implies x ∈ A ∪ B, which is (C). So again we have proved (C).

In summary, (P) implies either Case 1 or Case 2 is true. In both cases, (C) is true. We have therefore proved that (P) implies (C). This is logically equivalent to (a1). So we have proved (a1).

Step 2: proof of (a2). As before, we first assume

(P') x ∈ A ∪ B,

and then use (P') to prove

(C') x ∈ (A \ (A ∩ B)) ∪ B.

So let us assume (P'). This means x ∈ A or x ∈ B.

Case 1. x ∈ A
This case is a little tricky, so we will break it down into two subcases.

Subcase 1.1. x ∈ A and x ∈ B
In this subcase, we assume x ∈ A and x ∈ B. But x ∈ B implies x ∈ (A \ (A ∩ B)) ∪ B, which is (C').

Subcase 1.2. x ∈ A and x ∉ B
In this subcase, we assume x ∈ A and x ∉ B. But this implies that x ∉ A ∩ B. So we can say that x ∈ A and x ∉ A ∩ B. But by definition, this means x ∈ A \ (A ∩ B), which implies x ∈ (A \ (A ∩ B)) ∪ B, and this is (C').

Case 2. x ∈ B
In this case, we assume x ∈ B. But this implies x ∈ (A \ (A ∩ B)) ∪ B, which is (C').

In all cases (C') is true, so we have proved that (P') implies (C'), which means we have proved (a2). Since (a) is logically equivalent to "(a1) and (a2)," Steps 1 and 2, combined, give a complete proof of (a), and we are finished.

It may be useful to take a big-picture survey of what we did to prove (a). We proved it by separately proving (a1) and (a2) in two distinct steps. In each step, we proved something of the form

"left" ⊂ "right".

We did this by assuming that x ∈ "left", and then using this assumption to prove that x ∈ "right". This is the skeleton that all problems like this will have.

The details contained within the two separate steps will depend very much on what "left" and "right" actually look like. You will not always have to break the individual steps into multiple cases/subcases, but this is a common and useful technique.
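As an added sanity check (my own illustration, not part of the original post), identities like (a) and (b) can also be verified mechanically by testing every pair of subsets of a small universe, since the elementwise argument does not depend on which elements are involved:

```python
from itertools import product

# Brute-force verification of the two set identities from the exercise
# over all pairs of subsets of a small universe U.
U = list(range(5))
subsets = [frozenset(x for x, keep in zip(U, bits) if keep)
           for bits in product([0, 1], repeat=len(U))]

for A in subsets:
    for B in subsets:
        notA, notB = set(U) - A, set(U) - B
        # (a): (A \ (A ∩ B)) ∪ B = A ∪ B
        assert (A - (A & B)) | B == A | B
        # (b): (A ∪ B) \ (A ∩ B) = (A ∩ Bᶜ) ∪ (Aᶜ ∩ B)
        assert (A | B) - (A & B) == (A & notB) | (notA & B)

print("identities (a) and (b) hold for all subset pairs")
```

Of course, a finite check is no substitute for the elementwise proof, but it is a quick way to catch a misremembered identity before trying to prove it.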

Last edited by jason1990; 09-07-2008 at 10:06 AM.

12-17-2008, 04:17 PM   #135
old hand

Join Date: Sep 2004
Posts: 1,870

In an attempt to elevate the non-religious content of this forum, I thought I would resurrect this old thread. I have said it before, but it is worth repeating that probability, depending on the context, can be considered a science, a branch of mathematics, and a subject of deep philosophical thought. It also is (or should be) of extreme importance and interest to any community of gamblers.

Since I am not a big fan of the no-content bump, I thought I would take this opportunity to answer the following question, which I received by PM:

Quote:
 I recently encountered an interesting solution to the derangements problem using probability theory but the solution has a dangerous step that I would appreciate your input on. Q. What is the probability that a permutation of [n] chosen at random has no fixed points? Let A_i be the event that point i is not fixed. P( A_i ) = (n-1)/n. Clearly, A_i and A_j are not independent but as the size of the set increases, the dependence between them weakens. If we let n go to inf, we can treat the events as if they were independent. P(no fixed) = lim (n-> inf) [ ( (n-1)/n ) ^ n ] = lim (n-> inf) [ ( 1 - 1/n ) ^ n ] = 1/e Is it a valid technique to treat the events as independent in the limit, if so when can this be applied ?
This "proof" looks clever and leads to the right answer. But of course, that does not confer upon it any degree of credibility. Let us first look at this piece:

Quote:
 Let A_i be the event that point i is not fixed. P( A_i ) = (n-1)/n. Clearly, A_i and A_j are not independent but as the size of the set increases, the dependence between them weakens. If we let n go to inf, we can treat the events as if they were independent.
This is not very precise, but it is correct. It is easy to calculate both P(A_i ∩ A_j) and P(A_i)P(A_j). When we do this, we see that they are very close. Their difference is approximately 1/n³.

Now, the next step.

Quote:
 P(no fixed) = lim (n-> inf) [ ( (n-1)/n ) ^ n ]
This notation is a little unclear, but I think the intent is understandable. The goal is to prove that

\lim_{n → ∞} P(no fixed points) = 1/e.

The author is trying to use here the "fact" that

P(A_1 ∩ A_2 ∩ ... ∩ A_n) ≈ P(A_1)P(A_2)...P(A_n) = ((n - 1)/n)^n.

But this supposed "fact" relies on a fallacy. The first part of the proof argued that

P(A_i ∩ A_j) ≈ P(A_i)P(A_j) for all i and j.

This is true, as we mentioned above. In probability, we call this pairwise independence. But pairwise independence does not imply independence. So we cannot automatically conclude (even approximately) that the same thing is true for more than two events.
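The classic counterexample (an illustration added here, not from the original post) is two fair coin flips with A = {first flip is heads}, B = {second flip is heads}, and C = {the two flips agree}. Any two of these events are independent, but the three together are not, as this short Python sketch verifies:

```python
from itertools import product

# Four equally likely outcomes for two fair coin flips (1 = heads).
omega = list(product([0, 1], repeat=2))

def P(event):
    # Probability of an event under the uniform measure on omega.
    return sum(1 for w in omega if event(w)) / len(omega)

A = lambda w: w[0] == 1        # first flip is heads
B = lambda w: w[1] == 1        # second flip is heads
C = lambda w: w[0] == w[1]     # the two flips agree

def both(e, f):
    return lambda w: e(w) and f(w)

# Each pair multiplies: pairwise independence.
assert P(both(A, B)) == P(A) * P(B)
assert P(both(A, C)) == P(A) * P(C)
assert P(both(B, C)) == P(B) * P(C)

# But the triple does not: P(A ∩ B ∩ C) = 1/4, while P(A)P(B)P(C) = 1/8.
assert P(both(both(A, B), C)) != P(A) * P(B) * P(C)
```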

So this "proof" is not valid. The problem with it is not simply a lack of rigor, or faulty notation. The problem is that it relies on the false premise that pairwise independence implies independence. The standard, correct, probabilistic proof uses inclusion-exclusion, and the Taylor series expansion for e^{-1}.
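For comparison (my own addition, not part of the original post), the inclusion-exclusion answer, P(no fixed points) = Σ_{k=0}^{n} (-1)^k / k!, can be checked by brute force for small n, and its convergence to 1/e is already visible by n = 8:

```python
import math
from itertools import permutations

def p_no_fixed_points(n):
    # Exact probability that a uniform random permutation of {0,...,n-1}
    # has no fixed points, counted by brute force.
    count = sum(1 for p in permutations(range(n))
                if all(p[i] != i for i in range(n)))
    return count / math.factorial(n)

for n in range(1, 9):
    # Inclusion-exclusion gives sum_{k=0}^{n} (-1)^k / k!.
    series = sum((-1) ** k / math.factorial(k) for k in range(n + 1))
    assert abs(p_no_fixed_points(n) - series) < 1e-12

# The partial sums are the Taylor expansion of e^{-1}.
assert abs(p_no_fixed_points(8) - 1 / math.e) < 1e-4
```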
