Facebook is hoping that the Federal Trade Commission and U.S. Attorney General will bear ultimate responsibility for cracking down on rogue apps, according to comments yesterday from Tim Sparapani, the company’s D.C.-based head of public policy.
He also said the platform now hosts 700,000 apps, 150,000 more than previously reported.
“Who’s in charge of making sure these applications do exactly what they say they’re supposed to do? The vast, vast majority — 99.99 percent — of the applications on our platforms, on Apple’s and Google’s, are going to do exactly what people expect. But some will have bugs. Some will get hacked. And some of them, unfortunately, will be run by people who will try to manipulate data and try to take it and use it in ways that consumers won’t expect.
So who’s in charge? Is it the developer? Is it the FTC? Is it the attorney general? Is it Facebook, Apple and Google? And these are questions that haven’t been quite answered yet. If you look at the way the law is currently structured, we think the FTC and the attorney general have a responsibility to make sure that those very few applications which might not meet consumer expectations are being policed.
There’s an open question as to whether the FTC and attorney general are looking into this and we think they should at least begin to consider how they’re going to police it because the other opportunity is to force Google, Apple, Facebook, Salesforce — companies which are platform companies — to have some sort of enforcement mechanism.
And that’s not a role we want to be in. We want to set the parameters. We want to give our users a good experience. But it’s very difficult for one company to tell other companies — especially 700,000 of them — exactly what they should do and how they should control their interactions with other consumers.”
One of the very interesting questions confronting companies like Apple, Facebook, Twitter and Google is how they police ecosystems involving hundreds of thousands of applications every single day.
Each company has its own style of regulation. Apple exerts upfront control with its review process, rejecting apps if they crash, use private application programming interfaces or don’t work as advertised. But it has weaker power to police apps once they have met its approval. Google, on the other hand, is permissive at first, but can shut down apps that violate its policies even if a user has already downloaded them.
Facebook’s enforcement has evolved in response to spam and unanticipated abuse since the platform’s inception three years ago. Its style resembles Google’s: it cracks down on apps after it discovers violations.
However, the speed and scale with which all of these ecosystems have flourished raises a number of legal questions. Facebook may act in good faith, but with 700,000 applications and just 1,400 employees, its whack-a-mole approach can’t possibly discover every violation.
There are plenty of seemingly innocuous apps, like quizzes, that could take user data (which they can now retain indefinitely under Facebook’s new policies) and use it in unscrupulous ways, and they would have no incentive to reveal how they manipulate that data. This is unlike apps that blatantly steal passwords or violate the company’s policies, which Facebook will shut down.
While there isn’t a prominent case of this sort on Facebook yet, one does exist on the open web. For example, RealAge, whose ads permeate millions of web pages with a quiz promising to tell consumers their “real age,” is actually a front for drug marketers, according to The New York Times. More than 27 million people have filled out its detailed medical questionnaires, unknowingly revealing data that pharmaceutical companies pay for.
Who is responsible for watching out for cases like these? Should platform companies have the same kind of “intermediary liability” that protects ISPs, web hosting companies and search engines when their users upload illegal content?
Judging by the FTC’s recent interventions with other platform companies like Google (such as its six-month inquiry into the company’s acquisition of AdMob), it seems the regulatory agency doesn’t have the resources to understand the dynamics of these complex ecosystems, let alone handle enforcement for hundreds of thousands of apps. So it’s unnerving to see the buck being passed around.