This article is part of a VB special issue. Read the full series here: How Data Privacy Is Transforming Marketing.
Shortly after GDPR took effect in 2018, Apple began running privacy-focused advertisements, and it has since released several more in the same vein, each taking a different angle to showcase its enhanced security features.
Using privacy as a marketing asset was a smart move, according to Estelle Masse, Europe legislative manager and global data protection lead at Access Now, a data privacy advocacy organization that defends the digital rights of users worldwide.
“Privacy is actually a commercial advantage,” Masse said. “Companies need to move beyond thinking it’s part of an annoying compliance checklist. It can be a competitive advantage for you and build trust for your users.”
As other companies scrambled to comply with enhanced privacy regulations while preserving their marketing data strategies, Apple embraced privacy as a centerpiece of its marketing. The company proved privacy could be an asset rather than the liability it became for its Silicon Valley neighbor, Facebook (now Meta), which spent 2018 mired in the Cambridge Analytica data privacy scandal.
The divergent approaches to privacy taken by these two tech giants may have foreshadowed another problem: a privacy divide that keeps widening between consumers who can afford products and devices with strong privacy protections and those who cannot.
Accessing data privacy comes at a cost
The contrast between the high cost of Apple devices and Facebook's free model, in which users themselves are the commodity being sold, paints a picture of the price consumers pay to protect their data and what it costs them if they cannot afford it.
“Privacy should not be a luxury,” said Masse. “We need to see a lot of the privacy features created by Apple, or similar tools, replicated in more affordable products and devices.”
Apple is making products that contribute to protecting privacy, in particular by limiting what other companies can know about us, she explained, but cautioned that Apple doesn’t always apply these standards to itself.
“Apple has made it extremely easy for us as customers to reject ads from other apps and services, and with it, they help us protect our privacy,” Masse said. “Apple should not try to benefit from this feature to then serve us with their own ad services or tracking. Those should be turned off by default in all Apple products and apps.”
Expecting consumers to spend more time and money to have autonomy over their own data isn’t a great way to treat customers, argues Daniel Weitzner, director of MIT’s Internet Policy Research Initiative and principal research scientist at its Computer Science and Artificial Intelligence Lab (CSAIL).
“I give Apple a huge amount of credit for setting high expectations for the apps in their app store and the third-party devices that they interact with,” Weitzner said, “… But I worry that what we’ve done is put a lot more burden on the user to have a sense of privacy protection. Some of the costs are very direct. You have to pay more for a more privacy-protective smartphone, or you have to deny yourself access to certain kinds of savings for free services.”
Data privacy for the powerful?
Masse’s point raises the question: Have robust privacy protections become a luxury rather than a basic option for consumers?
It’s a question, in fact, that has been asked for years. In 2017, Amanda Hess, internet and pop culture journalist, wrote in The New York Times: “Now that our privacy is worth something, every side of it is being monetized. We can either trade it for cheap services or shell out cash to protect it. It is increasingly seen not as a right, but as a luxury good.”
A Morgan Stanley research report released in 2021 found that 81% of individuals feel they have little or no control over the data collected about them. Just as with the digital divide, people from lower socioeconomic backgrounds may not have the resources to take advantage of privacy protections from every angle, and may be less likely to shell out extra cash for advanced protections.
Some experts argue that individuals can either pay more for services with privacy protections built in, or educate themselves for free on how to take control of their privacy online by turning off cookies, asking apps not to track, scanning lengthy terms and conditions documents or using a VPN.
Others disagree and say that socioeconomic factors contribute to issues around data privacy.
“I think privacy in terms of data should be a fundamental right,” said Rafal Los, head of services and GTM at security solutions company ExtraHop.
However, he admitted that it can be hard to advocate for a right that, at times, few people seem to care about.
“It seems like people are willing to trade their passwords for a Snickers,” he said.
Los added that he has a difficult time agreeing that there is a widening privacy gap where protections are more accessible to those who are more affluent.
“Kim Kardashian is just as dumb with her privacy as anybody, like the barista at Starbucks. It’s just not something people think about unless they’ve had a problem with it,” he said. “… Maybe I’m wrong, but I don’t think there’s a correlation between being wealthier or more affluent, or being better educated and caring about your privacy more … In practice, I just don’t see it.”
Either way, others say it simply isn’t fair to put the responsibility for managing individual privacy on consumers alone. Consumers do care, they say, but often feel powerless against the large companies whose services they need, resigning themselves to clicking through consent screens just to use whatever app or website the moment requires.
“I’ve done some of the empirical work that supports the argument that people do care,” said Jennifer King, Ph.D., privacy and data policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence. “I certainly think there can be educational holes there.”
Bearing the burden of data privacy
King pointed out that individuals of low socioeconomic status, like many of us, may have access to technology but may not have the knowledge to protect their privacy from every possible angle. They may use location services, for example, or click “agree” without fully knowing what is at stake.
“My own research and others’ has demonstrated that people fundamentally don’t understand the trade-offs in many cases,” she said.
Weitzner agreed, pointing out that the burden on the everyday person to control their privacy is too great. He noted that consumers have to agree to give up data to participate in everyday life, such as getting a credit card, securing a mortgage or applying for a job.
“Most people are in a position where they’re forced to trade their personal data for things that they want or even need,” Weitzner said. “So I think it’s true, if you’re prepared to spend a lot of time and effort and extra money, you can put some distance between yourself and the whole kind of profiling process that goes on — but I think it’s really hard for most people in any practical sense… we have to work too hard to get privacy today, and that’s not right.”
Tough challenges for marketers
Companies that aren’t Apple, of course, can’t simply incorporate robust privacy protections without figuring out how to still market to potential customers. Shoppers send a mixed message: Research from BCG and Google shows that two-thirds of consumers want customized ad content, yet half report being uncomfortable sharing their data to receive such personalization.
Still, with many regulations already in place and more on the way, no marketing organization will be able to ignore data privacy – whether or not their customers have the ability to pay for more privacy-centric products and tools. So, where do marketing teams go from there?
Just as privacy comes with a price for consumers, companies are shelling out money as well as they work to get up to speed on compliance with laws like GDPR or CCPA.
In fact, a report from McKinsey predicts that companies that fail to develop privacy solutions and rework their marketing strategies to comply can expect to spend 10 to 20% more on marketing and sales just to see the same returns.
Enterprise organizations governed by GDPR have had to make hard pivots in their strategies, and it hasn’t been easy. Preparing ahead of legislation as much as possible is ideal, according to Dan Peden, strategy director at performance marketing agency Journey Further.
“We’ve seen marketing efficiencies drop … they’re being asked by their businesses to get more for less budget, or get more for the same budget or hit aggressive targets that came out of COVID,” he said.
Without a lot of data, he said, that becomes harder and harder to do, because marketers end up targeting more generally and falling back on more traditional marketing methods.
“We use a lot of holdout testing, which is then designed to look at masses and whether we’re improving marketing — or whether the limited data that we have is actually the right people that we’re trying to reach,” Peden said.
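The holdout testing Peden describes withholds a campaign from a randomly selected control group and compares outcomes between the two groups. A minimal sketch of that comparison, with illustrative numbers and a hypothetical function name not taken from the article:

```python
def holdout_lift(exposed_conversions: int, exposed_size: int,
                 holdout_conversions: int, holdout_size: int) -> float:
    """Estimate the incremental lift of a campaign versus a held-out control group."""
    exposed_rate = exposed_conversions / exposed_size    # conversion rate with the campaign
    holdout_rate = holdout_conversions / holdout_size    # baseline rate without it
    # Relative improvement over the baseline (holdout) rate
    return (exposed_rate - holdout_rate) / holdout_rate

# Illustrative numbers only: 5% conversion with the campaign vs. 4% without
print(f"{holdout_lift(500, 10_000, 400, 10_000):.0%}")  # prints "25%"
```

Because the holdout group is randomized rather than tracked individually, the measurement works even as person-level data becomes scarcer.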
What marketers need to do — now
To remain successful, the same McKinsey report recommends marketers stay vigilant about what’s coming next regarding privacy regulations and work now to demonstrate that privacy protections are a priority. The report notes that trust is key: When a consumer trusts a company, they are twice as willing to share their data as when they don’t.
On top of that, the hurdles organizations face may depend on their sector. McKinsey found that highly regulated industries like healthcare and financial services are already trusted by consumers. Companies in those sectors have privacy regulations baked in and won’t have to work as hard to build that trust. Companies in technology, travel, transportation, media and entertainment, however, have more work to do, as these are the industries consumers report trusting the least with their data.
As privacy regulations continue to evolve in the U.S., an investment of time, resources and capital should be expected for enterprises in any sector. That means marketers need to get prepared.
“The biggest thing you need to get ready for is your auditing,” Peden said. “Understanding what data you hold, where it comes from, how you store it and how long you store it for — that was a big undertaking for a lot of businesses in the EU.”
For the time being, he added that marketers can prepare by “getting used to not having as much data and it being less personalized, less trackable, and then moving back towards more traditional methods for tracking — so, surveys, polls, pre- and post-surveys, and holdout testing.”
While it’s difficult to predict what is ahead, Weitzner hopes companies will see the growing need to aid consumers in protecting their privacy and will make it easier to do so. He suggests looking back may actually help marketers as regulations continue to unfold.
“In the early days of the Internet, we had to face the challenge of figuring out how to provide people assurance that you could safely use your credit card numbers, for example, online and it was far from a foregone conclusion in the late 90s,” he said. “But it worked out because we figured out the right kind of trust equation. I think now we have to kind of do it again, looking at much more intensive use of personal data, and providing more detailed accountability while giving people a sense of trust.”