Today I want to talk about a slightly different kind of topic. Instead of talking about math, per se, I want to talk about a mathematical reasoning concept that people often get wrong.
Specifically, I want to talk about the difference between a population percentage and an individual likelihood.
I got my DNA sequenced by 23andMe, because I'm fascinated by the incredible technical advances in biotech over the past few years. (For a similar reason, I'm a small investor in Celmatix. Very cool tech and math...)
Anyway, my 23andMe results show me to have far lower risk (than the population average) for Celiac Disease. And yet, I have Celiac!
Clearly, 23andMe is fatally flawed, right?
Nope. Roughly 1% of Americans have Celiac. Let's pretend that my genes suggest I am 50% less likely than average to have Celiac: a 0.5% risk, or 1 in 200. That means if I were reincarnated 199 times, I'd expect to get Celiac in just one of my 200 lives. But once any given life plays out, I don't have a 0.5% (1-in-200) chance of having Celiac.
I have either a 0% chance (i.e., I don't have the disease) or a 100% chance (i.e., I come down with Celiac).
As an individual, something either happens, or it doesn't. As a population, stuff happens to some percentage of folks.
It's not the same.
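The reincarnation thought experiment above can be sketched in a few lines of code. This is a minimal simulation under the article's hypothetical numbers (1% base rate, 50% lower personal risk); the variable names are mine, not from any real 23andMe data:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

BASE_RATE = 0.01          # ~1% of Americans have Celiac
MY_RISK = BASE_RATE / 2   # hypothetical 50% lower risk -> 0.5%, i.e. 1 in 200

# Simulate 200 "reincarnated lives". In each life the disease either
# happens or it doesn't -- every individual outcome is 0 or 1.
lives = [1 if random.random() < MY_RISK else 0 for _ in range(200)]

print(sum(lives))            # across the population of lives: about 1 in 200
print(set(lives) <= {0, 1})  # but each single life is all-or-nothing
```

The population-level number (roughly one case per 200 lives) emerges only in aggregate; no single entry in `lives` is ever 0.005.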
I don't care whether -- or why -- my population risk is low for Celiac. I have it.
The next time you hear that "X% of brown-haired, left-handed men are likely to be great motorcycle racers," keep in mind that I, a brown-haired, left-handed man, am a terrible motorcycle racer.
And the population averages don't directly apply to you, either.