Thursday, May 22, 2008

Our innumeracy means that our fight against these super-rarities is likewise ineffective. Statisticians speak of something called the Paradox of the False Positive. Here's how it works: imagine a disease that strikes one in a million people, and a test for the disease that's 99% accurate. Administer the test to a million people and it will come back positive for around 10,000 of them, because for every hundred people it tests, it gets one wrong (that's what 99% accurate means). Yet, statistically, we know there's only one infected person in the entire sample. That means your "99% accurate" test is wrong roughly 9,999 times out of every 10,000 positive results! Terrorists are far rarer than one in a million, and automated "tests" for terrorism (conclusions data-mined from transactions, Oyster cards, bank transfers, travel schedules and so on) are far less accurate than 99%. That means practically every person branded a terrorist by our data-mining efforts is innocent.
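To make the arithmetic concrete, here is a minimal sketch of the calculation (in Python; the function name is mine, and it assumes the test is wrong 1% of the time in both directions, which is one reasonable reading of "99% accurate"):

    def positive_predictive_value(base_rate, accuracy):
        """Fraction of positive results that are true positives,
        assuming equal false positive and false negative rates."""
        true_positives = base_rate * accuracy
        false_positives = (1 - base_rate) * (1 - accuracy)
        return true_positives / (true_positives + false_positives)

    # The disease example: a 1-in-a-million condition, a 99%-accurate test.
    # Prints roughly 0.0001, i.e. about one true hit per 10,000 positives.
    print(positive_predictive_value(1e-6, 0.99))

Plug in a rarer condition or a sloppier test (as with data-mined terrorism "tests") and the fraction of correct positives only gets smaller.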

1 comment:

  1. Anonymous, 4:45 PM

    Great one - thanks for this.
