The City of Orlando, Florida today confirmed that a pilot program involving the use of Amazon’s Rekognition facial recognition technology has come to an end. The announcement comes a week after the American Civil Liberties Union (ACLU) and 89 other groups protested the sale of the system to law enforcement.
“Staff continues to discuss and evaluate whether to recommend continuation of the pilot at a further date,” the Orlando Police Department wrote in a statement issued jointly with the city. “At this time, that process is still ongoing and the contract with Amazon remains expired.”
But the city and police department left open the possibility that they might pursue a contract at a future date.
“The City of Orlando is always looking for new solutions to further our ability to keep our residents and visitors safe,” the joint statement said. “Partnering with innovative companies to test new technology — while also ensuring we uphold privacy laws and in no way violate the rights of others — is critical to us as we work to further keep our community safe.”
An Amazon spokesperson told VentureBeat that the program was a pilot and “had a discernible end date.”
“We did a professional services engagement with the City of Orlando …,” they said. “That this engagement ended was expected and is not news.”
News that Amazon had supplied U.S. law enforcement with facial identification technology broke in May, following a six-month investigation by the ACLU into dealings between Amazon Web Services (AWS) — the division within Amazon that oversees Rekognition — and the Orlando Police Department and Oregon’s Washington County Sheriff’s Office.
The Washington County Sheriff’s Office continues to use Rekognition to identify criminal suspects, a public information officer told the New York Times. Emails obtained by the ACLU show that deputies have also used it to find unconscious and deceased citizens, as well as witnesses who weren’t suspected of any crimes.
In a letter addressed to Amazon CEO Jeff Bezos last week, Amazon shareholders warned that Rekognition could be abused and that increased public scrutiny threatened to weigh on the company’s stock.
“While Rekognition may be intended to enhance some law enforcement activities, we are deeply concerned it may ultimately violate civil and human rights,” the shareholders wrote. “We are [also] concerned the technology would be used to unfairly and disproportionately target and surveil people of color, immigrants, and civil society organizations … [and that] sales may be expanded to foreign governments, including authoritarian regimes.”
In an email statement provided to VentureBeat last month, Amazon said it requires customers to “be responsible” when they use Amazon Web Services and Rekognition. “When we find that AWS services are being abused by a customer, we suspend that customer’s right to use our services,” an AWS spokesperson said. “Amazon Rekognition is a technology that helps automate recognizing people, objects, and activities in video and photos based on inputs provided by the customer. For example, if the customer provided images of a chair, Rekognition could help find other chair images in a library of photos uploaded by the customer.”