I was briefly conflicted thinking I should do something Thanksgiving-related, but nah, I've been there and done that. It can mean whatever it means to you today; the unpleasant truth about that Pilgrim feast can be considered at another time.
Anyway, Thomas Bayes was an English minister, mathematician and philosopher who died in 1761. A lot of educated Englishmen in those days trained for the clergy, so it wasn't at all unusual for clergymen to make secular intellectual contributions. Here's the theorem:

P(A | B) = P(B | A) × P(A) / P(B)
This means that the probability of some proposition or event called "A", given that we know that "B" is true (the probability of A given B, which is what the leftmost term means), equals the probability of B given A, times the probability of A before we knew about B (the prior probability of A), all divided by the prior probability of B.
It isn't important that you memorize this or go around applying it formally in real life, but it is important to at least have an intuitive understanding of it and reason this way qualitatively, even if you can't usually plug in real numbers. The easiest way to explain it is with an example. Suppose you go to the doctor and learn that a test result for a rare disease came back positive. The doctor tells you that if somebody has the disease, the test will always be positive; but if they don't have the disease, it will be negative 99% of the time. If your doctor is like most doctors, he or she will tell you the chance you have the disease is 99%. But that's only because doctors don't understand this. True fact! It's been studied. If any physicians are reading this, read carefully!
Remember that this is a rare disease; in fact only 1 out of 10,000 people actually has it. That's the prior probability of A. That means if you test 10,000 people, 9,999 won't have the disease; but 1 out of 100 of those people will test positive, so on average 99.99, or almost 100 people who don't have the disease, will test positive, while only one person who tests positive will in fact have the disease. Plugging this into the formula, to make it more exact and mathy and stuff, the probability of B given A is 1 and the prior probability of A is 0.0001, so the numerator is 0.0001. The denominator, the prior probability of B, is the sum of the two ways to test positive: 0.0001 (the one true positive) plus 0.009999 (the almost 100 false positives out of 10,000 people), which comes to just slightly more than 0.01. So your chance of actually having the disease is 0.0001 divided by roughly 0.0101: still only about 1%.
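If you want to see the arithmetic laid out, here's a quick Python sketch of the calculation. The posterior function and its parameter names are just mine for illustration, not from any library:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(A|B): probability of disease given a positive test, via Bayes' theorem."""
    # P(B): total probability of a positive test, true positives plus false positives
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    # P(A|B) = P(B|A) * P(A) / P(B)
    return sensitivity * prior / p_positive

# Screening case: 1-in-10,000 prior, a test that always catches the disease,
# and a 1% false positive rate
print(posterior(prior=0.0001, sensitivity=1.0, false_positive_rate=0.01))
# prints roughly 0.0099: about a 1% chance of disease despite the positive test
```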
The lesson for life is that if something seems highly improbable to begin with, you should be skeptical even of what seems like pretty convincing evidence. Now, it would be a whole different story if you actually had symptoms of the disease; in other words, if this were a diagnostic test, not a screening test. If half the people who have the symptoms actually have the disease, the prior probability is 0.5. Running that through the Bayes mill, if you test positive, your chance of actually having the disease is now about 99%. So everything depends on that prior probability.
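Running the same sketch with the diagnostic prior shows how dramatically the answer changes:

```python
# Diagnostic case: symptomatic patient, 50% prior, same test as before
print(posterior(prior=0.5, sensitivity=1.0, false_positive_rate=0.01))
# prints roughly 0.9901: about a 99% chance of disease
```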
To build up a picture of the world that is likely to be true, we need to put together evidence from various sources. We need to adjust our beliefs based on new evidence, but we need to look skeptically at evidence that doesn't fit at all with the picture that already seems highly likely; in other words, we need to seriously consider the hypothesis that there is something wrong with that new evidence, that we have misinterpreted it, or that it's maybe just a coincidence. If I tested positive for a rare disease, I'd probably at least want to have another confirmatory test or diagnostic procedure, but I also wouldn't assume the worst. Be skeptical, look for confirmation, but be willing to be convinced.