
We’re currently facing a privacy paradox. We, the public, are becoming increasingly aware of the risks of sharing our data online, but we continue to share it, regardless. In fact, evidence suggests that, despite the Cambridge Analytica scandal and recent major data breaches, we’ve become even more disposed to sharing data in exchange for “better” and cheaper services.

An Experian study from January revealed that 70 percent of consumers globally “are willing to share more personal data with the organizations they interact with online, particularly when they see a benefit such as greater online security and convenience.” And a January survey conducted by the Center for Data Innovation came to a similar conclusion, finding 58 percent of Americans are “willing to share their most sensitive personal data” (i.e. biometric, medical and/or location data) in return for using apps and services.

That survey further revealed that a number of users would accept even more trade-offs if they resulted in extra benefits. For instance, 70 percent of Americans initially said they wouldn’t permit a mobile app to collect biometric data for no obvious purpose, yet this figure dropped by 8.7 and 19.8 percentage points, respectively, when sharing such data let them sign in more easily to their accounts or protected them against hackers.

There is, then, still a readiness on the part of many people to view their data privacy as negotiable, as something that can be sold off given the right inducements, even despite Cambridge Analytica, Equifax, Aadhaar, Starwood/Marriott, and other data-related breaches or scandals. This conclusion is supported by other recent surveys and research. A global study published in May 2018 by the Global Alliance of Data-Driven Marketing Associations (GDMA) and the UK DMA, for example, found that 77 percent of people in 10 nations (including the US, the UK, Spain, France, Germany, and the Netherlands) are either “pragmatic or unconcerned about sharing their data.”

In other words, most people in these 10 nations are either completely unconcerned (accounting for 26 percent of the total) about handing over personal data, or they’re “pragmatic” (51 percent), meaning they say they’ll give up personal data “on a case-by-case basis, dependent on the benefits.” Such proportions come despite respondents to the same study claiming they’re “very concerned” about online privacy, with 82 percent of Americans reporting a level of concern between seven and 10 (out of 10).

It’s certainly puzzling that this concern over privacy coexists with a willingness to continue sharing personal data. And rather than decreasing in the wake of recent high-profile breaches, the willingness appears only to have grown, at least among certain segments of national populations.

For example, the same GDMA report also noted that, since 2012, the proportion of Brits describing themselves as “unconcerned” about sharing their personal data has grown by nine percentage points, from 16 percent to 25 percent. Similarly – and perhaps more astonishingly – the number of “data fundamentalists” (“those who are unwilling to provide personal information even in return for service enhancement”) has decreased in the UK, falling from 31 percent in 2012 to 25 percent in 2017. On a global level, the report also reveals that the “data unconcerned” are more likely to be of a younger generation, while “data fundamentalists” are likely to be older (e.g. in Germany, 58 percent of those aged 18-24 are “unconcerned,” compared to 34 percent of all adults). This suggests that as younger generations raised in a data economy reach maturity (and as older generations pass away), the proportion of the general population willing to hand over its data might grow considerably.

Such trends are troubling, especially given everything we now know about the vulnerability of personal data.

“Many people who are not tech savvy will give up privacy for convenience and better services, not realizing the consequences as long as nothing happens to their personal accounts,” says Ahmed Banafa, an electrical engineering professor and cybersecurity expert at San José State University. Banafa agrees there is something of a privacy paradox today and points to Facebook’s unrelenting growth as an indicator of this. “Just look at the results of Facebook last quarter and you will see an increase in the number of people engaged in Facebook activities or associated apps like WhatsApp, Instagram, and Messenger.”

Banafa points to a behavioral and psychological explanation: We can be disturbed by reports of data breaches while still optimistically believing such breaches won’t affect us. Tim Mackey, a cybersecurity expert with Synopsys, gives similar reasoning: “We are acutely aware of data breaches impacting our personal data,” he says, “but until and unless it becomes personal, the problem is largely an academic one which is safely ignored.”

There are also economic explanations for the privacy paradox, as Marc Rotenberg, president of the Electronic Privacy Information Center (EPIC), points out. “First, the question [of whether consumers are at fault for giving away more personal data] reflects a profound misunderstanding of the Internet economy,” he says. “Consumers do not have meaningful choices. And they know this. So it is entirely rational that they do not waste time with privacy policies or privacy settings.”

Those who work directly within the data industry back this up. Ryan Faber, co-founder of Bloom, a blockchain-based digital ID and credit scoring platform, says that simply wanting to give up less data isn’t enough to avoid the privacy paradox. “It’s less of a question of whether people are willing,” he says. “For many, giving up your data is simply required to interact in the modern world. This is true at work and also to engage in basic services. Even worse, sometimes, their information is taken without their knowledge or consent. For example, in the case of the credit bureaus, nothing you can do will protect you from these businesses profiting off of the back of your daily habits. You can’t opt out.”

That consumers lack meaningful choices is borne out by research. Pretty much every website on the internet requires its users to accept cookies (which track how individual websites are used), and a December 2017 study from cybersecurity firm Ghostery found that 79 percent of all websites globally also track their users’ movements online, even when those users are browsing elsewhere. Similarly, a 2017 study from researchers at Stony Brook University, the University of Massachusetts, and elsewhere discovered that just over 70 percent of smartphone apps report personal data to third-party tracking companies.

As these figures and accounts indicate, there are few corners of the web to which users can turn if they want to make a concerted effort to guard their data. What’s more, the situation is made even worse by another aspect of the privacy paradox: By sucking up so much user data, platforms like Facebook and Google make the services they offer tangibly more personalized and rewarding than those offered by rivals, which in turn makes people even more willing to give up their personal data.

“Where I’m the most concerned is when services start to use that information that they collect about you to prioritize one particular metric of improvement: engagement,” says David O’Brien, an assistant research director at the Berkman Klein Center for Internet & Society at Harvard University. “When they try to use your data against you, to get you to keep using their services. They just feed you the headlines they know you’re going to click on. That doesn’t quite feel right, at least to me.”

The result is a dangerously vicious circle. Users generally believe they’ll receive better services (and even, in some cases, better security) if they hand over more data, yet it’s clear the increasing accumulation of user data creates greater cybersecurity risks. In addition to the Cambridge Analytica scandal (in which the data of 87 million Facebook users was shared with the digital consulting firm), 2018 saw data breaches at Aadhaar (1.1 billion people affected), Starwood/Marriott (500 million), Exactis (230 million), Under Armour (150 million), Quora (100 million), MyHeritage (92 million), and others. And what’s important to point out about such breaches is that, in recent years, they’ve grown increasingly large as more of us have been sharing more of our data.

Indeed, of the 20 biggest data breaches of all time (see table below), all but two happened in the last decade, while 14 happened in the last five years. Clearly the global economy’s addiction to data carries serious risks. Advances in cybersecurity can’t completely remove such risks, says Annelie van Milink, a data privacy expert at PA Consulting. “No matter how advanced and secure new technologies become, hacking mechanisms (fueled by a growing dark web of financially or politically motivated individuals) will catch up and breaches will continue to happen,” she says. “The more the digital world grows, the more important it becomes for businesses to keep on top of the latest threats and vulnerabilities and put in place measures to mitigate the likelihood of a breach.”

One way out of this paradox could be greater transparency, as EPIC’s Marc Rotenberg argues. “The real privacy paradox is that transparency is required for effective privacy protection,” he says. “That is why real privacy laws, such as the GDPR, impose transparency obligations on companies and establish access rights for those whose personal information can be collected.”

By imposing a legal framework in which companies have to obtain explicit user consent to gather personal data, in which they have to clearly show what kind of data is being gathered, and in which they also have to offer users the chance to delete their personal data, the EU’s General Data Protection Regulation should in theory reduce the quantity of data we permit the world’s organizations to gather and store. “GDPR obviously has an impact on consumer choice because it establishes a much higher standard for consent,” Rotenberg affirms. “That was the core of the recent decision concerning Google.”

Another key ingredient of this solution would be improved consumer education, since greater transparency isn’t going to mean all that much if people aren’t aware enough to take advantage of it. “Data has now surpassed oil as the world’s most valuable resource, so the data economy is not going away anytime soon,” says Ashley Leonard, a data expert and CEO of Verismic Software, a cloud management software company in California. “Educating end users and investigating shadowy data companies like Cambridge Analytica is the only way forward.”

But even if greater regulation and education would make a big difference, some experts aren’t sure whether the US will see substantial legislative developments anytime soon. Congress has been more willing to discuss data legislation recently (e.g. Marco Rubio’s newly introduced online privacy bill), David O’Brien points out, but he adds that data harvesting has likely become too economically important for any radical or significant regulation to be possible in America. “I tend to be a little more on the cynical side than the optimistic side,” he says. “That’s in large part because we really do have a data-based economy, at least in the consumer-facing tech sector. And that’s the lifeblood of it, and so once you start to meddle with it, you really are playing with a lot of equities at the policy level. So I’d be surprised if we are able to come to some sort of consensus in Congress over what the best pathway is.”

Time will tell if Congress is able to come to a consensus over Marco Rubio’s privacy bill. Introduced on January 16, it would require the Federal Trade Commission to draw up rules for tech companies to follow based on the Privacy Act of 1974, which in turn required federal agencies to let members of the public see their records upon request and prohibited most disclosures of personal information without consent. Marc Rotenberg has praised the bill, calling it a “very good proposal,” yet it’s worth pointing out that the bill was introduced largely to preempt a privacy law passed in California that’s comparable in scope to the GDPR. Given that the likes of Google and Amazon had already been pushing for several months for a bill similar to Rubio’s to be introduced, it’s likely this new bill is considerably lighter in touch than its Californian predecessor and, if passed, will probably do little to reduce the massive amounts of data Americans still hand over to tech companies on a daily basis.

Simon Chandler is a freelance tech journalist. His areas of expertise include AI, virtual reality, social media, big data, cybersecurity, and cryptocurrencies. He has written for such outlets as Wired, the Daily Dot, The Sun, TechRadar, the Verge, Cointelegraph, Cryptonews, and MakeUseOf.
