Amazon is selling facial recognition technology to U.S. law enforcement agencies, the American Civil Liberties Union (ACLU) reported today.
Emails obtained by the ACLU through freedom of information requests show that the company worked with the city of Orlando, Florida, and the Washington County Sheriff’s Office in Oregon to deploy Rekognition, an AI facial recognition platform that can parse databases of millions of people. Law enforcement agencies in California, Arizona, and other states have also expressed interest in adopting the technology.
In most cases, Amazon charged as little as $400 for setup and “just a few dollars” each month thereafter.
Rekognition, despite being cheaper than competing systems from NEC and other traditional vendors, is superior in several respects. It can track up to 100 people in real-time camera footage, according to the company’s marketing materials, even when their faces aren’t visible or they’ve exited and reentered the scene. Furthermore, it can supply context to frames by analyzing facial attributes to determine things like age range and emotional state.
“[Amazon Rekognition] is continually trained on new data to expand its ability to recognize objects, scenes, and activities to improve its ability to accurately recognize,” Amazon says. “When analyzing video, you can also identify specific activities happening in the frame, such as ‘delivering a package’ or ‘playing soccer.’”
This month, Ranju Das, a software developer on the Rekognition project, told attendees at a developer conference in Seoul, South Korea that the technology could track the mayor of Orlando through public cameras around the city.
“City of Orlando is a launch partner of ours,” Das said. “They have cameras all over the city. The authorized cameras are then streaming the data […] we are a subscriber to the stream, we analyze the video in real time, search against the collection of faces they have.”
In an email statement to VentureBeat, Amazon said that it requires customers to “be responsible” when they use Amazon Web Services and Rekognition. “When we find that AWS services are being abused by a customer, we suspend that customer’s right to use our services,” an AWS spokesperson said. “Amazon Rekognition is a technology that helps automate recognizing people, objects, and activities in video and photos based on inputs provided by the customer. For example, if the customer provided images of a chair, Rekognition could help find other chair images in a library of photos uploaded by the customer.”
Orlando is piloting the facial recognition technology to target suspected criminals in footage from the city’s surveillance systems. Washington County built a smartphone app that allows deputies to search a database of 300,000 mugshots for matches, which it used to identify and arrest a suspect who stole more than $5,000 from local stores.
In a statement provided to the New York Times, the Washington County Sheriff’s Office said that it wasn’t using Amazon’s facial recognition system for real-time tracking or with footage from body cameras — only to identify suspects in criminal investigations. The Orlando Police Department also told the publication that it hadn’t deployed Rekognition for surveillance, and that it was only testing the service.
In the U.S., there are no laws that bar law enforcement from using real-time facial recognition, but the technology — and the use of artificial intelligence for surveillance purposes — remains controversial.
A recent House oversight committee hearing on facial recognition technologies revealed that the algorithms used to identify matches are wrong about 15 percent of the time, and evidence suggests that those algorithms are susceptible to racial biases. A 2011 study found that systems developed in China, Japan, and South Korea had more trouble distinguishing between Caucasian faces than between East Asian faces. In a separate study conducted in 2012, facial recognition algorithms from vendor Cognitec performed 5 to 10 percent worse on African Americans than on Caucasians.
Then there’s the issue of privacy. More than 130 million Americans, many of whom have never committed a crime, are in state and federal facial recognition databases, according to the Center on Privacy & Technology at Georgetown Law. Some of those databases are managed by the Federal Bureau of Investigation’s Next Generation Identification program, details of which weren’t disclosed to the public until five years after its launch in 2010.
“Once a dangerous surveillance system like this is turned against the public, the harm can’t be undone,” Nicole Ozer, technology and civil liberties director for the ACLU of California, said in a statement. “Particularly in the current political climate, we need to stop supercharged surveillance before it is used to track protesters, target immigrants, and spy on entire neighborhoods.”
Update at 7 p.m. Eastern: Added a statement from Amazon.