Friday, March 18, 2011
Conditional Probability (Part 1 of 4)
Say that in a particular city, 48% of homes have broadband internet installed, and 6% of homes have both cable television and broadband internet (we are obviously talking about a second-world country).
The question is: what is the probability that a particular home has cable TV, given that it has broadband?
If X and Y are events, we write the conditional probability of X given Y as P(X|Y).
Mathematically, this is defined as follows:
P(X|Y) = P(X & Y) / P(Y)
(This only makes sense when P(Y) does not equal zero).
In the above example, we take X to be the event that the house has cable TV, and Y to be the event that it has broadband. Notice that we do not have to know P(X) to calculate the answer:
P(X|Y) = 0.06/0.48 = 0.125, or 12.5%.
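Here is a minimal Python sketch of that calculation; the function and variable names are just for illustration and do not come from the book:

def conditional_probability(p_x_and_y, p_y):
    """P(X | Y) = P(X & Y) / P(Y), defined only when P(Y) is nonzero."""
    if p_y == 0:
        raise ValueError("P(Y) must be nonzero")
    return p_x_and_y / p_y

# Cable TV given broadband, using the figures above
p_broadband = 0.48            # P(Y): home has broadband
p_cable_and_broadband = 0.06  # P(X & Y): home has both
print(conditional_probability(p_cable_and_broadband, p_broadband))  # 0.125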
In many contexts, conditional probability is extremely useful, as it allows probabilities to be updated as new information becomes available.
This is known as Bayesian inference.
In 1763, an important paper by the Reverend Thomas Bayes was published posthumously. In it he gives a compelling account of conditional probability.
The basis is Bayes' theorem, which states that for any events X and Y:
P(X|Y) = P(Y|X) × P(X)/P(Y)
In a sense, this formula is not deep. It follows directly from the definition of conditional probability:
P(Y|X) = P(X & Y)/P(X)
so
P(X & Y) = P(Y|X) P(X)
Substituting this into the definition of P(X|Y) produces the result.
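As a quick sanity check, the snippet below verifies the identity numerically using P(X & Y) and P(Y) from the broadband example; the value P(X) = 0.30 is an invented figure purely for illustration, since the original problem never states it:

# P(X & Y) and P(Y) are taken from the broadband example above;
# P(X) = 0.30 is an assumed value, not from the book.
p_x_and_y = 0.06
p_y = 0.48
p_x = 0.30

p_x_given_y = p_x_and_y / p_y        # definition of P(X | Y)
p_y_given_x = p_x_and_y / p_x        # definition of P(Y | X)
bayes_rhs = p_y_given_x * p_x / p_y  # right-hand side of Bayes' theorem

print(p_x_given_y, bayes_rhs)  # both print 0.125
assert abs(p_x_given_y - bayes_rhs) < 1e-12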
However, this theorem has been of great use, for example in the analysis of the problem of false positives.
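To make the false-positive point concrete, here is a sketch with invented numbers (a 1% prevalence, a 99% detection rate, and a 5% false-positive rate; none of these figures come from the book):

# Hypothetical screening test; all three figures are assumptions for illustration.
p_disease = 0.01            # P(D): prevalence in the population
p_pos_given_disease = 0.99  # P(+ | D): test detects the disease
p_pos_given_healthy = 0.05  # P(+ | not D): false-positive rate

# Total probability of a positive result
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Bayes' theorem: P(D | +) = P(+ | D) P(D) / P(+)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # about 0.167

With these assumed figures, fewer than one positive result in five actually corresponds to the disease, even though the test itself looks quite accurate.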
From: Mathematics 1001, by Dr. Richard Elwes