Facebook is in hot water for manipulating users’ emotions in a mass study that explored the contagious effects of positive and negative news stories. The Wall Street Journal reports that the well-publicized experiment was just the tip of the iceberg for Facebook’s vast research data science team. “There’s no review process, per se,” Andrew Ledvina, a Facebook data scientist from 2012 to 2013, told the Journal. “They’re always trying to alter people’s behavior.”
But before we break out the pitchforks, it’s important to remember that Facebook does discover really useful information.
For instance, in one of the first large-scale experiments, a University of California, San Diego researcher discovered that “I voted” badges could increase civic engagement on election day. Facebook badges alone can boost voter turnout by 2.2 percent, about a fourth of the effect of face-to-face canvassing. Canvassing, however, is much (much) more expensive and labor-intensive.
For campaigns on a budget looking at how to spread a civic message, Facebook could be a good deal.
To be sure, the messages that pass through Facebook can save lives. Back when Facebook decided to add organ donor status as a sharable message, over 13,000 people signed up on the first day, and registration continued at double the normal rate for some time after. Figuring out how to message this issue properly could mean more signups, and hence, lives saved.
Loneliness & suicide
Loneliness is a modern disease; depression and suicide, the modern epidemics. Those cut off from the world often find solace online, and Facebook data scientists have discovered clues to identifying lonely people in the texts and behaviors of its users.
“Users who consume greater levels of content,” finds one academic study from Facebook, report lower levels of offline social activity “and increased loneliness.” Binge-viewing of content is an important warning sign.
The military is interested in using this kind of information to detect depression and suicide risk in veterans. The Durkheim Project will collect social media and mobile data from a voluntary group of veterans to see if researchers can spot the early warning signs of mental illness, and eventually intervene to prevent the worst outcomes.
Facebook can be an incredibly useful tool for anyone apartment hunting or in need of parenting advice. And its data scientists have been busy studying which kinds of requests are more likely to receive a response, and hence get prioritized in the news feed.
“The chance of getting a response was substantially higher for factual knowledge and recommendation requests, and the chance for favor/requests substantially lower, as compared to mobilizations overall,” reports one Facebook team.
The implications of this are important: a call to action might perform better if rephrased as a request for recommendations, asking whether anyone knows someone willing to help out a cause.
These kinds of practical tips make Facebook all the more useful for bridging online distribution with offline help. And they only emerge when Facebook conducts research.
A more transparent Facebook
University of Washington law professor Ryan Calo has recommended the creation of “Consumer Subject Review Boards,” which would review the research of private companies. They would be akin to the Institutional Review Boards (IRBs) already standard at every major university.
I met Professor Calo last week at the Atlantic’s Aspen Ideas Festival; he later wrote to me, “I think Facebook would have fared better under this regime because they would have had a set of established criteria as well as a record of when and why it was approved.”
With consumer review boards, Facebook could be much more transparent about its research and be assured that someone was monitoring it for potentially harmful studies.
We want Facebook to conduct research, but it can’t just toy with people’s emotions in secret without being accountable for the consequences.