So far, many companies have tried to dispel such fears by noting that the data they gather, store and analyze remains "anonymous." But that, as it turns out, is not entirely accurate, in that it sells the power of data analysis radically short. Take the analysis of anonymous movement profiles, for example. According to a recent study published in the online journal Scientific Reports, our mobility patterns are so distinctive that they can be used to "uniquely identify 95 percent of the individuals." The more data is in circulation and available for analysis, the more likely it is that anonymity becomes "algorithmically impossible," says Princeton computer scientist Arvind Narayanan.
In his blog, Narayanan writes that only 33 bits of information are sufficient to identify a person.
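The figure of 33 bits follows from simple arithmetic: 33 bits can distinguish 2^33, or roughly 8.6 billion, individuals, which exceeds the world's population. A minimal sketch in Python illustrates the calculation (the population figure of 7 billion is an approximation for the period of the article, not a number from the text):

```python
import math

# Approximate world population at the time of the article.
world_population = 7_000_000_000

# Number of bits needed so every person could, in principle,
# receive a unique label: the ceiling of log2 of the population.
bits_needed = math.ceil(math.log2(world_population))
print(bits_needed)  # 33

# Conversely, 33 bits distinguish 2**33 individuals -- more than
# enough for everyone on Earth.
print(2**33)  # 8589934592
```

Each scrap of seemingly harmless information -- a ZIP code, a birth date, a gender -- contributes a few of those bits, which is why combining a handful of such attributes is often enough to single out one person.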
From the standpoint of businesses, the slightly schizophrenic attitude of consumers is the real crux of the issue. On the one hand, we have become shockingly forthcoming -- and apparently accessible -- online. Yet we ascribe the most sinister of motives to those who would analyze that data and collect even more of it.
A study by New York advertising agency Ogilvy One concludes that 75 percent of respondents don't want companies to store their personal data, while almost 90 percent were opposed to companies tracking their surfing behavior on the Internet.
This conflict explains the heated nature of the current controversy over the proposed new European data protection directive. If the European Commission's plans, which also include a "right to be forgotten" on the web, become a reality, many providers could see their Big Data growth fantasies in jeopardy. This is one of the reasons Brussels currently faces a barrage of lobbying from the likes of Amazon, Google and Facebook.
But for a modern society, an even more pressing question is whether it wishes to accept everything that becomes possible in a data-driven economy. Do we want to live in a world in which algorithms predict how well a child will do in school, how suitable he or she is for a specific job -- or whether that person is at risk of becoming a criminal or developing cancer?

Data Tyranny
Is it truly desirable for cultural assets like TV series or music albums to be tailored to our predicted tastes by means of data-driven analyses? What happens to creativity, intuition and the element of surprise in this totally calculated world?
Internet philosopher Evgeny Morozov warns of an impending "tyranny of algorithms" and is fundamentally critical of the ideology behind many current Big Data applications. Morozov argues that because formulas are increasingly being used in finance and, as in the case of Predictive Policing, in police work, they should be regularly reviewed by independent, qualified auditors -- if only to prevent discrimination and abuses of power.
A dominant Big Data giant once inadvertently revealed how overdue a broad social and political debate on the subject is. Google Executive Chairman Eric Schmidt said that in 2010, the company toyed with the idea of predicting stock prices by means of incoming search requests. But, he said, the idea was discarded when Google executives concluded that it was probably illegal.
He didn't, however, say that it was impossible.
Translated from the German by Christopher Sultan