The odds are stacked against us

Thursday, May 22nd, 2008

Cory Doctorow reiterates that the odds are stacked against us because of our dangerous statistical ignorance:

The rare – and the lurid – loom large in our imagination, and it’s to our great detriment when it comes to our safety and security. As a new father, I’m understandably worried about the idea of my child falling victim to some nefarious predator Out There, waiting to break in and take my child away. There’s a part of me that understands the panicked parent who rings 999 when he sees some street photographer aiming a lens at a kids’ playground.

But the fact is that attacks by strangers are so rare as to be practically nonexistent. If your child is assaulted, the perpetrator is almost certainly a relative (most likely a parent). If not a relative, then a close family friend. If not a close family friend, then a trusted authority figure.

And yet we continue to focus our attention on the meteor-strike-rare paedophile attack instead of protecting our children from the real, everyday dangers they face from the familiar. This has the twin effects of making our children less safe, and of making adults less free, because we are all subjected to scrutiny on the grounds that we may be hunting children.

This is the same calculus that allows the fear of terrorism to take away our liberty: the statistically super-rare terrorist attacks present, on average, a much lower risk to our health, safety and person than, say, depriving us of our liquid medications, or of requiring us to leave our bags unlocked in flight so that sticky-fingered handlers can make off with our laptops and financial data and valuables.

The everyday threat of having our goods stolen, our ability to travel and earn our livings curtailed, and our personal information harvested by every junior terrorist-fighter who wants to see our ID before letting us do anything is overshadowed by the one-in-a-billion confluence of someone with terrorist goals, the means to accomplish them, and the intelligence to bring them off (hint: you can’t really blow up an airplane with hair-gel and iPods).

The Paradox of the False Positive is old news around here, of course:

Here’s how that works: imagine that you’ve got a disease that strikes one in a million people, and a test for the disease that’s 99% accurate. You administer the test to a million people, and it will be positive for around 10,000 of them – because for every hundred people, it will be wrong once (that’s what 99% accurate means). Yet, statistically, we know that there’s only one infected person in the entire sample. That means that your “99% accurate” test is wrong 9,999 times out of every 10,000 positive results!

Terrorism is a lot less common than one in a million and automated “tests” for terrorism – data-mined conclusions drawn from transactions, Oyster cards, bank transfers, travel schedules, etc – are a lot less accurate than 99%. That means practically every person who is branded a terrorist by our data-mining efforts is innocent.

In other words, in the effort to find the terrorist needles in our haystacks, we’re just making much bigger haystacks.
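To make the arithmetic behind the paradox concrete, here’s a minimal Python sketch of the base-rate calculation. The one-in-a-million disease and 99%-accurate test come straight from the quote above; the screening numbers in the second call are illustrative assumptions of mine, not figures from the post.

```python
def positive_predictive_value(base_rate, accuracy):
    """Fraction of positive results that are true positives,
    assuming (as in the example above) that the test's
    true-positive and true-negative rates both equal `accuracy`."""
    true_positives = base_rate * accuracy
    false_positives = (1 - base_rate) * (1 - accuracy)
    return true_positives / (true_positives + false_positives)

# Doctorow's example: a one-in-a-million disease, a 99%-accurate test.
ppv = positive_predictive_value(base_rate=1e-6, accuracy=0.99)
print(f"Chance a positive result is real: {ppv:.4%}")  # ~0.0099%
# i.e. roughly 9,999 of every 10,000 positives are false alarms.

# Hypothetical screening numbers (assumptions, not from the post):
# a far rarer "condition" and a far less accurate test.
ppv = positive_predictive_value(base_rate=1e-8, accuracy=0.90)
print(f"Chance a flagged person is a real threat: {ppv:.6%}")
```

Under these assumptions, every drop in the base rate or in the test’s accuracy piles on false positives far faster than true ones: exactly what “making much bigger haystacks” means.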
