As WhatsApp scrambles to find technical solutions to some of the problems its service has inadvertently caused in developing markets, India’s government has proposed one of its own: bring traceability to the platform so false information can be traced to its source. But WhatsApp indicated to VentureBeat over the weekend that complying with that request would undermine the service’s core value of protecting user privacy.
“We remain deeply committed to people’s privacy and security, which is why we will continue to maintain end-to-end encryption for all of our users,” the company said.
The request for traceability, which came from India’s Ministry of Electronics & IT last week, was more than a suggestion. The Ministry said Facebook-owned WhatsApp would face legal actions if it failed to deliver.
“There is a need for bringing in traceability and accountability when a provocative/inflammatory message is detected and a request is made by law enforcement agencies,” the government said Friday. “When rumours and fake news get propagated by mischief mongers, the medium used for such propagation cannot evade responsibility and accountability. If they remain mute spectators they are liable to be treated as abettors and thereafter face consequent legal action,” it added.
India is WhatsApp’s largest market, with more than 250 million users. The country is struggling to contain the spread of fake news on digital platforms. Hoax messages and videos on the platform have incited multiple riots, costing more than two dozen lives in the country this year alone.
Allowing message tracing, though, would likely undo the privacy and security that WhatsApp’s one billion users worldwide expect from the service.
Bringing traceability and accountability to WhatsApp would mean breaking end-to-end encryption on the platform, the company told VentureBeat. WhatsApp encrypts all the texts and media files that users exchange with each other. As a result, the company does not have the technical means to read the content of an exchange between two or more users.
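To illustrate why an end-to-end encrypted relay cannot read the traffic it carries, here is a toy sketch of the underlying idea: two parties run a Diffie-Hellman exchange so that only public values cross the wire, then encrypt with the shared key they each derive locally. This is a simplified illustration only, not WhatsApp's actual implementation (which uses the Signal protocol with Curve25519 and a double-ratchet design); the cipher below is a toy keystream and should never be used in real systems.

```python
import hashlib
import hmac
import secrets

# Toy Diffie-Hellman parameters (illustrative only; real systems use
# vetted elliptic curves such as Curve25519, as the Signal protocol does).
P = 0xFFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9A63A3620FFFFFFFFFFFFFFFF
G = 2

def dh_keypair():
    """Generate a private exponent and the matching public value."""
    priv = secrets.randbelow(P - 2) + 2
    pub = pow(G, priv, P)
    return priv, pub

def shared_key(priv, other_pub):
    """Derive a 32-byte symmetric key from the DH shared secret."""
    secret = pow(other_pub, priv, P)
    return hashlib.sha256(secret.to_bytes((P.bit_length() + 7) // 8, "big")).digest()

def keystream_xor(key, data):
    """Encrypt/decrypt by XOR with an HMAC-derived keystream (toy cipher)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Each user generates a key pair; only the PUBLIC halves cross the wire.
alice_priv, alice_pub = dh_keypair()
bob_priv, bob_pub = dh_keypair()

# Both sides derive the same symmetric key without ever transmitting it,
# so the server in the middle never holds the key.
k_alice = shared_key(alice_priv, bob_pub)
k_bob = shared_key(bob_priv, alice_pub)
assert k_alice == k_bob

# All the relay server sees is this ciphertext.
message = "meet at noon".encode()
ciphertext = keystream_xor(k_alice, message)
print(ciphertext != message)                        # server cannot read it
print(keystream_xor(k_bob, ciphertext) == message)  # recipient can
```

The point of the sketch is structural: because the symmetric key exists only on the two endpoints, a traceability mandate cannot be satisfied by the server inspecting content; it would require changing what the endpoints send, which is what the company means by "breaking end-to-end encryption."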
Moreover, privacy advocates from across the globe have long expressed the need for end-to-end encryption on instant messaging services. When WhatsApp flipped the switch to provide its billion users encryption by default, it received quite a bit of praise.
The Indian government, whose political parties are themselves heavy WhatsApp users, has remained vague on what sort of access it is looking for. Matthew Green, assistant professor of computer science at the Johns Hopkins Information Security Institute, told VentureBeat that traditional investigative techniques — asking individuals on the receiving end of a communication about the sender’s identity (a group often contains dozens of people, if not more) — tend to work pretty well.
“So the questions that come up from my perspective are: What exactly is law enforcement looking for here? What precisely about the current state of encryption is making this hard? Can the investigative capabilities they require be achieved without breaking encryption? Or is the goal something more powerful, like the ability to proactively filter for specific keywords? That last would be a very significant request,” Green said.
VentureBeat has asked the Ministry of Electronics & IT to elaborate on its request and has yet to hear back.
With the violence that has resulted from the spread of fake news on the platform, however, it is clear WhatsApp needs to do more. So far the company has rolled out a feature to help users determine when a message they have received is part of a forward chain. It is now testing imposing a limit on how many times a user can forward a message.
The criticism WhatsApp is receiving comes as its parent company, Facebook, is itself in hot water in many nations over the spread of misinformation. Earlier this month, Facebook said it would soon work with fact-checkers and threat intelligence agencies in India, Myanmar, and Sri Lanka to review and delete, in real time, messages spreading false information that could cause harm.
Because of how WhatsApp is built, that approach cannot be replicated at a similar scale on the messaging service. (On Facebook, people willfully share their updates with friends, friends of friends, and the entire world.) In a statement, a WhatsApp spokesperson said the platform needs other parties to participate in helping it curb these problems.
“To tackle the challenges posed by misinformation we need action by government, civil society, and technology companies. Over the last month we’ve made several changes to WhatsApp including new controls for group admins and limits on forwarded messages,” a company spokesperson said, adding that it has also launched a digital literacy campaign to educate users.
India has over 400 million internet users and more than 300 million smartphone users, according to industry estimates. Many of these users have come online for the first time only in recent years; they are often naive about the scope of abuse on internet services and tend to believe everything they see online.
As part of its attempt to address these problems, WhatsApp, the most popular smartphone app in India, has been running newspaper ads in the country (as well as in several other markets) for roughly a year, advising people to be thoughtful about what they choose to share with friends and family on the platform.
Manish Singh is a technology reporter based in New Delhi, India. His work has appeared on CNBC, The Outline, Mashable, and CNET.