(Reuters) — Britain’s most senior police officer on Monday called on the government to create a legal framework for police use of new technologies, such as artificial intelligence.
Speaking about live facial recognition, which police in London started using in January, London police chief Cressida Dick said she welcomed the government’s 2019 pledge to create a legal framework for the police use of new technology involving AI, biometrics, DNA, and other elements.
“The best way to ensure that the police use new and emerging tech in a way that has the country’s support is for the government to bring in an enabling legislative framework that is debated through Parliament, consulted on in public, and which will outline the boundaries for how the police should or should not use tech,” Dick said.
“Give us the law and we’ll work within it,” she added.
Dick rejected evidence that facial recognition algorithms are racially discriminatory, meaning their accuracy rates vary depending on the skin color of the person being scanned.
“We know there are some cheap algorithms that do have ethnic bias but, as I’ve said, ours doesn’t, and currently the only bias in it is that it shows it is slightly harder to identify a wanted woman than a wanted man,” she said.
The London police’s facial recognition technology is provided by NEC, a Japanese company.
(Reporting by Elizabeth Howcroft, editing by Guy Faulconbridge.)