If you have nothing to hide, you have nothing to fear. You just need to trade a little liberty for security against the bad guys. After all, what good is freedom when you’re dead?
These are the platitudes with which the defenders of the surveillance state try to soothe our anxieties over the dangers of government surveillance. And unfortunately, they often work. Far too many people still cling to the attitude that as long as you behave yourself, you will be left alone, and that those other troublemakers only got what was coming to them.
These attitudes may start to change when people hear about the newest innovation in law enforcement: the Threat Score. The Washington Post is reporting that police are using an amalgam of “billions of data points” collected from social media, public records, and God knows where else, to develop a statistical profile of people based on how likely they are to commit violence.
Naturally, law enforcement is being cagey about exactly how this profile is calculated and what it is being used for. At present, it seems to be a guide for officers to know what to expect when making an arrest. But the potential for abuse of such a system is downright terrifying.
To begin with, people are not statistics, but individuals. All the data in the world can’t tell you what someone will actually do in a given situation. Just because someone fits a particular profile doesn’t mean that they will engage in violence or criminal activity. If you want an idea of how good we are at predicting complex phenomena based on statistical analysis, just ask a weatherman or an economist.
Nevertheless, the worship of scientism has led many to believe that human behavior is nothing more than an equation to be solved, and this belief leads to some very troubling implications. Since we don’t know how these scores are being calculated, we could be singled out as “dangerous” based on perfectly legal and harmless behavior, like owning a gun or belonging to a group with “patriots” in its name. The IRS targeting scandal of 2013 reminded us all of how willing the government is to pursue these avenues of persecution.
Again, we don’t know exactly what these scores are being used for. Giving guidance to police officers putting their lives in danger is one thing, but what happens when this data is used as sufficient evidence of criminal intent to grant search warrants, or even to use against defendants in court?
If you think it’s far-fetched, you should take a look at what is happening in China, where the government plans to maintain a “Social Credit System” as a way of keeping score for all residents based on their behavior, including speech the government doesn’t like. The scores are intended to be used to grant or deny access to certain social privileges.
Using a statistical profile to target citizens, not based on what they’ve done, but on what they might do in the future, is an idea so chillingly totalitarian that it has been the subject of science fiction dystopias for decades, most notably Philip K. Dick’s 1956 short story “The Minority Report,” later adapted into a film starring Tom Cruise. If pre-crime, as it was called in the story, was an Orwellian nightmare in the 1950s, it has come dangerously close to becoming reality in the 21st century. This is all the more reason why we have to be vigilant in protecting our privacy from warrantless government surveillance.