It’s no secret that facial recognition algorithms — in particular Rekognition, Amazon’s cloud-based image analysis service — have recently become the subject of increased scrutiny.
In a letter addressed to Amazon CEO Jeff Bezos in June, 19 groups of shareholders expressed reservations over the company’s decision to provide Rekognition to law enforcement in Orlando, Florida, and the Washington County (Oregon) Sheriff’s Office, joining the American Civil Liberties Union, Amazon employees, academics, and more than 70 other groups in protest. And in July, after the ACLU demonstrated Rekognition’s susceptibility to error, a trio of Democratic Congressmen raised concerns that the technology would “pose [a danger] to privacy and civil rights.”
Offering a counterpoint to the raft of negative press, Amazon on Thursday published a case study highlighting the ways that Rekognition, which launched as part of Amazon Web Services (AWS) at the company’s re:Invent conference in November 2016, is being used as a force for good.
Marinus Analytics, a big data analytics company founded in 2014, is employing artificial intelligence tools including Rekognition to help find human trafficking victims and reunite them with their families, Amazon wrote.
Specifically, it provides agencies with tools that assist with identifying and finding victims of sexual trafficking. One such service, Traffic Jam, has a facial recognition feature — FaceSearch — that leverages Rekognition to search through millions of records in seconds. Another taps Rekognition’s character recognition algorithms to detect words and phrases in images, extract them, and convert them into machine-readable text, which it organizes and collates in a searchable format.
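The last step of that pipeline, organizing extracted text into a searchable format, can be sketched with a minimal inverted index. This is an illustrative example with hypothetical data, not Marinus Analytics’ actual code:

```python
from collections import defaultdict

def build_index(docs):
    """Build a simple inverted index: word -> set of document IDs."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def search(index, query):
    """Return IDs of documents containing every word in the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = set(index.get(words[0], set()))
    for word in words[1:]:
        results &= index.get(word, set())
    return results

# Hypothetical OCR output, keyed by ad ID
ads = {
    "ad-001": "call 555 0101 new in town",
    "ad-002": "new arrivals call now",
}
index = build_index(ads)
print(search(index, "call new"))  # both ads contain "call" and "new"
```

A production system would add normalization, fuzzy matching, and a proper search backend, but the core idea — turning free-form extracted text into a structure queryable in milliseconds — is the same.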
Amazon said that FaceSearch helped a detective match a two-year-old photograph to a sex trafficking victim and surface a phone number registered in her name, leading to the identification of 20 victims in total and landing the perpetrator behind bars in just three months. In another instance, it helped California investigators find a missing 16-year-old girl who was sold online for sex.
“Without Traffic Jam, investigators are left to sift through thousands of online ads manually. This means they sit at their computer, with a picture of the victim taped to their screen, and compare every photo they see online in the hope that they might find a match,” Emily Kennedy, president and cofounder of Marinus Analytics, said in a statement. “Using AI technology, like Amazon Rekognition, this critical task can now be done with more accuracy and within seconds as compared to days, which is so important in cases where detectives have limited time to find the victim before he or she is moved to the next city.”
In an email to VentureBeat, Amazon drew attention to a second organization, the nonprofit Thorn, which uses Rekognition and other AWS products to identify and rescue children who have been sexually abused. Through its Spotlight tool, which sifts through hundreds of thousands of child sex trafficking ads each day, Thorn has identified 5,894 sex trafficking victims and helped to recover 103 of them to date.
Julie Cordua, CEO of Thorn, said that in all, machine learning algorithms have reduced investigation time by 65 percent.
“Abusers have hijacked the most advanced technology to exploit children — selling children online for sex, circulating abuse images and videos, and engaging in live-streaming abuse. AWS has chosen to be a part of the solution — partnering to leverage their solutions to help find exploited children faster and stop abuse,” she said in a statement.
The case studies are part of a broader campaign by Amazon to push back against Rekognition’s critics. In June, Amazon Web Services general manager Matt Wood wrote that the service’s image and video analysis capabilities were “materially benefiting” society by “preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families, and building educational apps for children” and by “enhancing security through multi-factor authentication, finding images more easily, or preventing package theft.”
“There has been no reported law enforcement abuse of Amazon Rekognition,” he wrote. “There have always been and will always be risks with new technology capabilities. Each organization choosing to employ technology must act responsibly or risk legal penalties and public condemnation. AWS takes its responsibilities seriously.”
The public relations effort, however, will have a hard time winning over critics who contend the software is susceptible to bias.
Earlier this month, the ACLU demonstrated that Rekognition, when fed 25,000 mugshots from a “public source” and tasked with comparing them to official photos of members of Congress, falsely identified 28 of those members as criminals. A disproportionate share of the false matches (38 percent) were people of color.
An Amazon spokesperson told VentureBeat that the ACLU’s test was likely skewed by poor calibration. It used a confidence threshold (the minimum similarity score a comparison must reach to count as a match) of 80 percent, lower than the 95 percent Amazon recommends for law enforcement applications.
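The effect of that setting is easy to see with a toy sketch. The similarity scores below are hypothetical, not actual Rekognition output; the point is simply that lowering the threshold from 95 to 80 percent admits weaker matches, including potential false positives:

```python
def matches_above(scores, threshold):
    """Return (candidate, score) pairs whose similarity meets the threshold."""
    return [(name, s) for name, s in scores if s >= threshold]

# Hypothetical similarity scores from a face comparison
scores = [("mugshot-17", 97.2), ("mugshot-42", 88.5), ("mugshot-90", 81.3)]

print(matches_above(scores, 95))  # only the strongest candidate passes
print(matches_above(scores, 80))  # two weaker candidates also pass
```

In other words, the threshold trades recall against precision: a lower value surfaces more candidates for an investigator to review, while a higher one suppresses borderline matches of the kind the ACLU’s test turned up.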
But a growing body of research suggests that on the whole, facial recognition systems tend to readily acquire prejudices from the datasets of images on which they’ve been trained.
A study conducted in 2011, for example, found that facial recognition systems developed in China, Japan, and South Korea had more difficulty distinguishing between Caucasian faces than between East Asian faces. And a separate study in 2012 showed that facial algorithms from vendor Cognitec performed 5 to 10 percent worse on African Americans than on Caucasians.
More recently, a House oversight committee hearing on facial recognition technologies revealed that algorithms used by the Federal Bureau of Investigation to identify criminal suspects are wrong about 15 percent of the time. The system deployed by London’s Metropolitan Police, meanwhile, produces as many as 49 false matches for every hit.
Rick Smith, CEO of Axon, one of the largest suppliers of body cameras in the U.S., was recently quoted as saying that facial recognition isn’t accurate enough for law enforcement applications.
“[They aren’t] where they need to be to be making operational decisions off the facial recognition,” he said. “This is one where we think you don’t want to be premature and end up either where you have technical failures with disastrous outcomes or … there’s some unintended use-case where it ends up being unacceptable publicly in terms of long-term use of the technology.”
In spite of Amazon’s increasingly vocal critics, Orlando this month decided to renew an agreement with Amazon to use Rekognition as part of a test involving volunteers from the city’s police force. And Washington County used it to build a smartphone app that allows deputies to run mugshots against a database of 300,000 faces for matches.
“An identification — whether accurate or not — could cost people their freedom or even their lives,” the ACLU said in a statement accompanying the results of its test. “Congress must take these threats seriously, hit the brakes, and enact a moratorium on law enforcement use of face recognition.”