A Post-Privacy World? Big Data and the Privacy Paradox

Article by Anna Groh | Illustration by Francesco Moretti

Forecasting the future: a gift that humankind could only dream of, until today. In the age of digitalization, the rise of the internet and AI, predicting the future is no longer just a dream. Interestingly, we are precisely the ones who make that prediction possible in the first place. How? Imagine you have a scratchy feeling in your throat. The stores are already closed, so you decide to look for a good home remedy. To do so as quickly as possible, you simply google: “Home remedies for a sore throat” or “What helps against a scratchy throat?”. A few days later comes the doctor’s diagnosis: tonsillitis. Of course, you start googling again: “Have tonsillitis, is it dangerous?” or “How long will it take me to recover from my tonsillitis?”.

This sequence of Google search queries may be meaningless on its own, but the moment many people google similar things, it starts to get interesting. If all search queries regarding the diagnosis of tonsillitis are collected and linked to previous internet searches done by those very same anonymous users, you will find a correlation between diagnoses and earlier googled symptoms. By combining the data, a pattern emerges, leading to an algorithm that may predict tonsillitis before the doctor even diagnoses it.
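To see how such a linkage might work in practice, here is a minimal, purely illustrative Python sketch. The query log, field names and search terms are all made up for the example; a real system would operate on vastly larger, messier data and use far more sophisticated statistics than this simple follow-up rate.

```python
from collections import defaultdict

# Hypothetical, anonymized query log: (user_id, day, query) tuples.
# Purely illustrative data; real logs are far larger and messier.
QUERY_LOG = [
    ("u1", 1, "home remedies for a sore throat"),
    ("u1", 4, "have tonsillitis is it dangerous"),
    ("u2", 2, "what helps against scratchy throats"),
    ("u2", 6, "how long to recover from tonsillitis"),
    ("u3", 1, "best pizza near me"),
]

SYMPTOM_TERMS = ("sore throat", "scratchy throat")
DIAGNOSIS_TERM = "tonsillitis"

def symptom_to_diagnosis_rate(log, max_gap_days=14):
    """Fraction of users whose symptom search was followed by a
    diagnosis search within `max_gap_days` days."""
    symptom_days = defaultdict(list)
    diagnosis_days = defaultdict(list)
    for user, day, query in log:
        q = query.lower()
        if any(term in q for term in SYMPTOM_TERMS):
            symptom_days[user].append(day)
        if DIAGNOSIS_TERM in q:
            diagnosis_days[user].append(day)

    with_symptom = set(symptom_days)
    if not with_symptom:
        return 0.0
    # Users whose symptom search was later followed by a diagnosis search.
    followed_up = {
        user for user in with_symptom
        if any(0 < d_day - s_day <= max_gap_days
               for s_day in symptom_days[user]
               for d_day in diagnosis_days.get(user, []))
    }
    return len(followed_up) / len(with_symptom)

if __name__ == "__main__":
    rate = symptom_to_diagnosis_rate(QUERY_LOG)
    print(f"Share of symptom-searchers who later searched the diagnosis: {rate:.0%}")
```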

Prediction projects like this are no longer unusual: take “Google Flu Trends”, or the work of Microsoft and Columbia University, whose researchers collected and analysed search-engine queries in order to identify early symptoms of rare pancreatic cancer (Stephens-Davidowitz; 2017).

Why your data matters

This clearly demonstrates how Big Data (that is, very large amounts of information) has the power to do incredible things, and how our everyday habit of looking up information through search engines provides the basis for identifying recurring connections and relations within our society. But our data reveals a lot of sensitive information, too: information we would surely not disclose in a traditional survey, information we would perhaps not even share with a friend. Sadly, it can be misused just as easily, especially in the context of commercial targeting or political campaigning.

And yet, we do not hesitate for long to give away highly personal and valuable information about ourselves, constantly, voluntarily and free of charge, the moment we allow apps to track our location, google our latest fears, or create a customer account on Amazon. We do it simply because it makes our lives much more convenient and social.

You may ask yourself: if an insane amount of data is being collected every second, will mine even matter? Will it possibly stand out among millions of other datasets? Isn’t there more interesting information about other, more interesting users anyway? These seem like very reasonable questions, and still the answer is yes: YOUR data matters. When you give away information online, your action also affects others, because in the age of Big Data it is not about individual information, but about aggregated, anonymous and large data packages (Rouvroy; 2020). Whatever you reveal about yourself also refines the predictions made about everyone who resembles you. Popular reassurances like “but we only use your data anonymized” therefore carry significantly less weight.

So, what is the threat, exactly?

The moment data is aggregated, combined and put into context, it becomes very useful. Patterns, linkages and relations are generated; algorithms emerge. These algorithms can eventually be used to make predictions about the future or to divide society into different groups (Mühlhoff; 2020).

Zuboff, a business professor at Harvard, is one of the most critical voices in this regard. In her book The Age of Surveillance Capitalism, she claims that our data might be used to control future markets. She argues that the so-called “behavioural surplus” we produce by leaving information behind will be fabricated into something she calls “prediction products”, namely “Calculations that anticipate what we will do now, soon or later” (Zuboff; 2019).

Big Data scientist Michal Kosinski reveals just how accurate algorithms based on your social media activity can be. According to his research, with 70 likes, Big Data knows you better than a friend does; with 150 likes, better than a family member; and with 300 likes, the computer algorithm is even better at predicting your behaviour than your own spouse (Kosinski, 2015).

In this sense, believing that humans will always outperform machines is an outdated idea. Instead, we should take Big Data seriously and discuss the regulation it requires.

The Privacy Paradox

Research suggests that most of us are very worried about our privacy and about how our data is being used. And yet, judging from our online activities, it does not seem like we care that much. If a stranger came to our door and asked for our best-kept secrets, desires or dreams, we would naturally not tell them. Yet when we google our way through life, we do not treat Google or any other search engine as that very same stranger at our door. We simply do not perceive it as a threat to our privacy, probably because it has no specific face for us.

There is a name for this discrepancy between our privacy concerns and attitudes on the one hand, and our privacy settings and behaviour on the other: the “Privacy Paradox”. Despite their fears of data being misused, people fail to take “easy and inexpensive steps to protect it” (Barth & de Jong; 2017). One possible explanation is offered by Bush et al. (2020), who state that people are simply “poor at assessing future risks”. Instead, the “present bias” comes into play: people tend to choose immediate gains over future benefits.

What options do we have left?

To live in a modern world means to share information. In the age of technology, it is nearly impossible to opt out without facing major disadvantages or living an isolated life. As Solove (2020) argues, in today’s world, “if people really wanted to keep all their information concealed, they’d have to live in a shack in the woods”. The real question is: are we living in a post-privacy world? And if not, to what extent do we have to give up our desire for privacy if we choose to participate in today’s society?

Clearly, we do not have to give up all our social activities in order to protect our privacy. Forget the black-and-white mentality. “Privacy” online is not about keeping all our information away from the world and from everyone; it is about sharing it selectively and being able to make sure it is not misused by others. It is about “modulating boundaries and controlling data flow” (Solove; 2020), which requires a maximum level of transparency. Therefore, countries need to reconsider their privacy regulations as more and more data is collected daily, and as the power of digital companies grows to an extent previously unseen.

The symbiosis between society and the internet seems perfect: both sides benefit. The internet, social networking sites and countless technologies are an integral part of our everyday life, helping us to manage our day in a world where we have to master more things in less time. In exchange, we are “paying” with our information and our very own thoughts, which make us who we are. Zuboff takes this to the extreme when she warns of a future in which we, as human beings, are no longer the product being targeted, but merely a means to an end: an “abandoned carcass” left behind once the information we give away has been collected and fabricated (Zuboff; 2019). It is up to each of us to evaluate to what extent there might be truth in that.
