Imagine if an agency came into your neighborhood and started inquiring about each person’s gender, race, religion, and moral beliefs; their political affiliations, social likes and dislikes; who their friends were and what they talked about; and the most intimate details of their relationships with loved ones. Then, after gathering and analyzing these data, the agency sold the information to businesses, foreign governments — anyone who might be interested in using it in any way they wished, with no questions asked.
Then, consider what would happen if some sinister players started using these data to incite violence, spread hatred, or rig elections. They are provided with enough information at a granular level to identify, say, Muslims with extreme views, or Christians who feel marginalized, or homemakers unhappy with their marriage. These troublemakers are also able to send false information to groups. They could do this at a very low cost, without having to disclose their identity or motivation.
I am describing what Facebook makes possible. The United Nations has accused Facebook of playing a “determining role” in stirring up hatred and genocide against the Rohingya Muslim minority in Myanmar. “It has substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public,” said Marzuki Darusman, chairman of the UN Independent International Fact-Finding Mission on Myanmar.
Facebook also enabled the data firm Cambridge Analytica to acquire 50 million user profiles in the U.S., which were reportedly used to help Donald Trump’s presidential campaign spread misinformation. Facebook data may also have been used to influence the Brexit vote in Britain, as well as regional elections in India.
To be clear, Facebook isn’t scouting neighborhoods for information or knowingly supporting malicious actions. Its automated tools do the gathering, and the company may simply be blind to the consequences of their use.
We gain a lot from the global exchange of information that Facebook makes possible. It has brought loved ones closer and made the world a smaller place. But Facebook’s ability to recognize faces in photos more accurately than humans can is worrying. It can learn all about us from the comments we post, the news stories we read, and the pages we “like.” Even if we don’t tell Facebook where we were on a particular evening, or who our friends are, it can “see” the photos other people have posted and learn what it needs.
Facebook also owns WhatsApp and harvests data from the app. Using the sensors on smartphones, WhatsApp theoretically has the ability to keep track of our location and activity levels and to know who we are meeting, where, and how long we have been with them. In an exchange of emails, the company indicated it does not track location beyond the country level and does not share contacts nor messages, which are encrypted, with Facebook. Per a WhatsApp spokesperson, “WhatsApp cares deeply about the privacy of our users. We collect very little data and every message is end-to-end encrypted. Contrary to recent reporting, we are not keeping track of the friends and family you have messaged, and we do not monitor your real-time location.”
But WhatsApp did confirm it is sharing user phone numbers, device identifiers, operating system information, control choices, and usage information with the “Facebook family of companies.” So that leaves open the question as to whether Facebook could then track those users at a granular level, even if WhatsApp doesn’t.
Big Brother in George Orwell’s 1984 could have only dreamt of having the information that Facebook has.
Facebook has become a huge part of public, civil, and private life. UN investigator Yanghee Lee said about Myanmar, “…everything is done through Facebook … [but] I’m afraid that Facebook has now turned into a beast, and not what it originally intended.”
Technology has given us many gifts. But it is increasingly being manipulated in ways intended to promote the makers’ profit over individual and collective wellbeing. The good of internet platforms is now being offset by flaws invisible to most users. Social media is being weaponized in the name of profit.
This is happening because Facebook and other internet platforms are consciously turning their users into addicts to make their products and advertising more valuable. They combine propaganda techniques with addiction strategies perfected by the gambling industry.
They provide value to users while creating filter bubbles that reinforce pre-existing beliefs in ways that make those beliefs more extreme and inflexible, causing many users to reject new information and even facts.
This is why governments need to stringently regulate Facebook. France ordered WhatsApp to stop sharing user data with parent company Facebook. Others must do the same. And they must force Facebook to crack down on hate speech — with heavy fines for every single violation.
Not just Facebook’s, but all data must be protected. A good start is Europe’s General Data Protection Regulation, going into effect in May, which requires companies to get unambiguous consent from users to collect data, to clearly disclose how personal data are being used, and to spell out why the data are being collected. Governments must also ban any form of political advertising and the sale of data to third parties.
This is not a matter of protectionism. It is about freedom and democracy itself. Technology is making amazing things possible. But it also has a dark side. We have to balance the risks with the rewards.
Editor’s note: VentureBeat got in touch with Facebook regarding the assertions in this story. The company provided background information, which we included, but otherwise declined to comment.
Vivek Wadhwa is Distinguished Fellow at Carnegie Mellon University Engineering at Silicon Valley and author of The Driver in the Driverless Car: How Our Technology Choices Will Create the Future.