A union representing Uber drivers in the U.K. has filed a second legal challenge against the ride-hailing giant in Europe. This time, the union is arguing that Uber’s alleged “robo-firing” practices contravene Article 22 of the EU General Data Protection Regulation (GDPR), which seeks to protect individuals from automated decision-making.
The action has been filed in the District Court of Amsterdam, where Uber’s European HQ is located. It was initiated by three U.K.-based former Uber drivers with the backing of the App Drivers & Couriers Union (ADCU), a U.K. trade union for drivers and couriers who work for app-based companies. Also backing the action is the Worker Info Exchange, a nonprofit organization that aims to help gig economy workers. A fourth driver from Portugal is joining the challenge, with support from the International Alliance of App Based Transport Workers (IAATW).
This is the latest in a line of high-profile cases that have thrust algorithms into the spotlight, alleging bias and a lack of accountability. Last year, for example, Goldman Sachs hit the headlines following allegations of gender discrimination in the algorithms used to determine credit limits for the Apple credit card.
This is the ADCU’s second such case in the last few months, after it sued Uber on behalf of drivers seeking data from alleged “secret profiling.” The group argues that Uber withholds key data it uses to exert “management control” over drivers, such as metrics it uses to monitor drivers’ performance. That case is due to be heard on December 16.
The latest challenge, which will run in parallel to the first, uses GDPR regulatory provisions to counter automated decision-making in which humans have minimal oversight. Article 22 of the GDPR states:
The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
The ADCU’s latest case alleges that Uber kicked drivers off its platform based on algorithms and accused them of “fraudulent activity” without offering any avenue for appeal. The filing acknowledges Uber’s argument that it uses “specialized employees” to assess account deactivations. But the drivers’ lawyer, Anton Ekker, argues that Uber has not “further substantiated that this constitutes meaningful human intervention” or that the employees have been trained to understand “how the artificially intelligent system works.” The document reads:
In particular, Uber has not demonstrated that the employees involved in automated decision-making:
- Have a meaningful influence on the decision, which means, among other things, that they must have the “authority and competence” to oppose this decision
- “Weigh” and “interpret” the recommendation of Uber’s artificial intelligence system, considering all available data and taking into account additional factors
- Can predict how the “output” of the system will change if the “inputs” are adjusted
- Are able to determine which input contributed the most to a specific output
- Are able to determine when the output is incorrect
The court filing notes that in each of the four cases, the drivers were “dismissed after Uber said its systems had detected fraudulent activity,” charges the drivers deny. Moreover, the document alleges that Uber has never provided the drivers with any evidence to support its claims. The ADCU says it believes Uber is actually concealing performance-related deactivations behind the allegations of fraud, including situations in which a driver may log out of the Uber app when demand is low and surge pricing is not in place.
This is where the two separate legal challenges overlap, as both cases hinge on allegations that Uber doesn’t give its drivers access to the type of data, including performance metrics, that its automated systems use to make decisions.
“We contend that Uber has automated the process of deactivating drivers to such an extent that it is subject to abuse and error — as is the case with the claimants — and that there has been no meaningful human intervention,” Worker Info Exchange director James Farrar told VentureBeat. “In essence, these are robo-sackings without proper review or right of appeal. As such, that is a violation of the law under GDPR.”
In a statement sent to VentureBeat, Uber said that it “provides requested personal data and information that individuals are entitled to.” The company added: “We will give explanations when we cannot provide certain data, such as when it doesn’t exist or disclosing it would infringe on the rights of another person under GDPR. As part of our regular processes, the drivers in this case were only deactivated after manual reviews by our specialist team.”
The ADCU is asking other Uber drivers and couriers who have been impacted by deactivations to register and join a “potential future action.” It has also launched a crowdfunding campaign that seeks to raise £20,000 ($26,000) in the next three weeks to fund the legal action.
This article was updated to include a comment from Uber.