Visualizing the
Base Rate Fallacy
Robert is a man of small stature from the United States who wears glasses and likes solving puzzles. Is he more likely to be (a) a truck driver, or (b) a computer science student at Stanford?
If your mind was drawn to answer (b), that is the base rate fallacy at work. You probably figured that the description of Robert doesn't fit your image or stereotype of a truck driver. But if you take a step back and think about the bigger picture, there are clearly far more truck drivers in the United States than students at one particular university in California. It follows that Robert is more likely to be a truck driver!
The vivid and oddly specific description led you to neglect statistical or general information - what is known as base rate information (hence the name of this cognitive error). Your mind never considered that prior knowledge relevant, so it skipped it entirely and simply embraced the causal explanation at hand.
Judging someone's profession doesn't seem like a big deal. But what happens when reliance on intuitive judgment carries real-life consequences? Let's take a closer look at an example you might relate to - health screenings. Suppose you are a woman approaching her forties. You are well aware that breast cancer is the most common cancer among women, and that an early diagnosis improves the chance of recovery. So why not make that trip down to the clinic? Better safe than sorry, right? Plus, a mammogram will incorrectly detect breast cancer in a healthy person only 9% of the time, which does seem pretty reliable.
So let's say you got your mammogram done, but unfortunately, you are surprised by a positive result. Here’s the important question: given the accuracy of the mammogram test, what is the probability that you actually have breast cancer?
We're about to do some simple calculations, so let me visualize the numbers for you:
Suppose 100,000 women in their forties decide to undergo a mammogram examination.
Let’s visualize 1 dot for every 1,000 women, for a total of 100 dots.
For the sake of example, let’s assume breast cancer affects 1% of the population.
The test is rather accurate; it always correctly identifies those who have the disease. However, roughly 9% of mammograms will generate a false positive, meaning that the person tests positive but does not actually have the disease.
Hence, the results of the screening are as follows: 1,000 women are correctly found to have breast cancer. But of the remaining 99,000 healthy ones, around 8,910 are incorrectly identified as having breast cancer.
Putting it all together, we can calculate the probability that a positive result is accurate: a total of 9,910 women will test positive for breast cancer, but only 1,000 of them actually carry the disease. In other words, given your positive result, the probability that you actually have breast cancer is a mere 10% (1,000 / 9,910 ≈ 10.1%)!
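If you'd like to check the arithmetic yourself, here is a minimal Python sketch using the numbers from the example above (100,000 women, 1% prevalence, a 9% false positive rate, and the assumption that the test catches every true case):

```python
# Assumptions taken from the example above.
population = 100_000         # women screened
prevalence = 0.01            # 1% actually have breast cancer
false_positive_rate = 0.09   # 9% of healthy women test positive
sensitivity = 1.0            # the test catches every true case

true_positives = population * prevalence * sensitivity                  # 1,000
false_positives = population * (1 - prevalence) * false_positive_rate   # 8,910

# Probability of actually having the disease given a positive test
# (also called the positive predictive value).
ppv = true_positives / (true_positives + false_positives)
print(f"{true_positives:.0f} true positives, {false_positives:.0f} false positives")
print(f"P(disease | positive test) = {ppv:.1%}")  # about 10.1%
```

This is just Bayes' rule written out as counts: the dots make the same computation visual.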
Despite the accuracy of the test, the bulk of the people who test positive in the mammogram examination do not actually have the disease! If you have tried experimenting with the sliders in the visualization, you might have noticed a trend - increasing the prevalence of the disease reduces the proportion of false positives.
Hence, what is responsible for this seemingly counter-intuitive phenomenon is simply our failure to account for the base rate - which in this case is the low prevalence of breast cancer in the population. No matter how reliable the test, the rarity of the disease in the population guarantees that a sizeable proportion of positive results will be false positives.
This is important, as mass screenings could result in overdiagnosis and unnecessary invasive follow-up treatments like surgery and chemotherapy. That is why doctors only recommend health screenings for patients with high risk factors, such as age or a family history of the disease (i.e. raising the base rate, and hence the predictive value of a positive test).
In the medical profession, a saying beaten into the heads of budding doctors is: "When you hear hoofbeats, think of horses, not zebras." In the context of their practice, it cautions medical students not to jump to rare diagnoses before considering the more common diseases. It's easy to fall prey to the base rate fallacy (even two-thirds of doctors fail to accurately judge the efficacy of health screenings), but hopefully even a rudimentary understanding of it will help you reach sounder judgments and make smarter decisions.
So whether you are estimating the odds of success of your first business venture, the likelihood of your child getting into Harvard, or the chance of landing a spot at Google with your computer science degree, don't let vivid specifics distract you from the base rate - remember to hold your zebras, and use your horse sense instead.