How much is your personal data worth? Will photos you post on Facebook or your Foursquare check-in data get you into trouble in five years’ time? In one of the standout talks at this week’s O’Reilly Strata Summit, author and Boing Boing editor Cory Doctorow explained why people undervalue their privacy and how data-driven companies exploit this mis-pricing of privacy.
The privacy bargain we make with tech companies usually involves giving up some personal data in return for a free service, as with Facebook or many mobile applications.
Doctorow argues that it’s hard for people to assign a value to personal data when the full consequences of giving up that data are still unknown. How do you determine whether the privacy bargain is a fair one?
“It’s hard to get worked up about things where the failure and the deed are separated by a long way,” said Doctorow. “It’s the same reason that people start smoking.”
He insists data-driven companies such as Facebook actively exploit users by soliciting as much data as possible. “Facebook trains you to undervalue your privacy. These companies are [full of] social scientists now and those people have read their Skinner (the American behaviorist), have read their Adler (founder of the school of individual psychology) and they understand intermittent reinforcement.” In exchange for posting status updates, photos and other information, Facebook users are intermittently rewarded with attention from people they care about. This mechanism can have addictive qualities similar to gambling.
“Eli Pariser, who wrote The Filter Bubble, told me someone at Facebook explained to him that they know that men who have female friends who post photos of themselves, spend more time on the site,” reports Doctorow. “They know that women who see their friends post photos, upload photos in response. So if a man who used the site a lot then dropped off, they look for women in his social group, show them pictures of their girlfriends, the women post pictures back and then the men stay on. This is not the bargain.”
Another form of social manipulation practiced by tech companies involves search results and news feeds.
“The algorithms by which things like Facebook decide what to show you and what to hide are totally opaque. There’s this kind of weird, big lie about how an algorithm is not a form of editorial control. Google will say ‘we have organic search results’ in contrast with what Alta Vista used to do, where they would take payment to put a result first. It’s ‘organic’ because it’s done with math, but actually it’s editorial by another name. All the companies that do editorial by algorithm claim that there’s something about math that makes it free of bias and will.”
Tech companies often do not offer clear or easy privacy choices to users. Facebook constantly changes its privacy settings to push the default towards more public data, and its Byzantine custom privacy settings are bewildering for a new user. “Complexifying a proposition is usually there to stop you from finding out whether the deal is good,” comments Doctorow.
With mobile applications, the choice is often between giving the application all the data it requests or not installing it at all. “Imagine apps that let you iterate through privacy decisions when they arise, not making a lot of a priori decisions,” explains Doctorow. “Apps that start from a presumption of privacy, and when your privacy settings interfere with your stated desire to access a service, in that moment you are prompted to make the decision.”
More generally, Doctorow says we need simpler cookie managers: “One of the things you can do is give people meaningful choices in their browsers. That would be way more useful to me than giving them hard-to-enforce, impossible-to-audit privacy legislation.”
He also thinks that the way we approach educating children about privacy is flawed. “We have this weird contradiction in our school system where all the grown-ups in the school spend all their time wagging their fingers at kids saying ‘Get off the Facebook, every disclosure you make is something precious that you lose forever,’ but ‘I’m spying on every click you do, spying on every IM you send, spying on all your Facebook conversations’ — just like a parent who has three fags in his mouth and says ‘You shouldn’t smoke because it’s bad for you,’” he says.
“We could start by teaching kids to jailbreak every device, break every firewall, to do all the things that will make them good at privacy. It’s a learned skill. If kids can compete to see who can divulge the least information to the grown-ups in their lives, we will, by definition, get kids who are better at not divulging information than kids who are punished every time they try to prevent grown-ups from looking at their information.”
One of the reasons that we undervalue our personal data seems to be that the threat is not visceral and concrete. “In technology we often have this core problem of taking a fairly abstract social harm and rendering it concrete,” concludes Doctorow. “I think science fiction is rubbish at predicting the future, but it can create narratives that become part of our discourse. Imagine it’s 1947 and Orwell hasn’t written 1984 yet, and you’re trying to explain to someone why you don’t want to be electronically surveilled.”