Building a custom computer vision model isn’t easy for those who lack the data science expertise, but Amazon hopes to streamline things with a feature launching ahead of its re:Invent 2019 conference in Las Vegas. Amazon Web Services (AWS) today announced that Rekognition, its cloud-based software-as-a-service computer vision platform, will soon gain custom labels that’ll allow customers to craft object-detecting systems for specific use cases.

Starting December 3, AWS customers will be able to use Rekognition to train models with small sets of labeled images (as few as 10), enabling them to detect esoteric items like “turbochargers” and “transmission torque converters” in photographs of, say, machine parts. Unlike traditional approaches, which require training computer vision models from scratch on large corpora, Rekognition’s custom labels tap transfer learning techniques to achieve what Amazon claims is “state-of-the-art” performance.
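At the API level, kicking off such a training run amounts to pointing Rekognition at a manifest of labeled images in S3. The following is a minimal, hypothetical sketch of the request shape — it only assembles the keyword arguments that would be passed to boto3's Rekognition client (e.g. `client.create_project_version(**req)`), so it runs without an AWS account; the ARN, bucket, and manifest names are invented placeholders.

```python
def build_training_request(project_arn, version_name, bucket,
                           manifest_key, output_prefix):
    """Assemble kwargs for a Custom Labels training run
    (boto3: rekognition.create_project_version)."""
    return {
        "ProjectArn": project_arn,
        "VersionName": version_name,
        # Where Rekognition writes training results and evaluation data.
        "OutputConfig": {"S3Bucket": bucket, "S3KeyPrefix": output_prefix},
        # A Ground Truth-style manifest listing the labeled images --
        # per Amazon, as few as 10 per label can suffice, since the
        # service fine-tunes pretrained base models (transfer learning).
        "TrainingData": {"Assets": [{
            "GroundTruthManifest": {
                "S3Object": {"Bucket": bucket, "Name": manifest_key}}}]},
        # Let the service split off a test set automatically.
        "TestingData": {"AutoCreate": True},
    }

# Hypothetical placeholders throughout:
req = build_training_request(
    project_arn=("arn:aws:rekognition:us-east-1:123456789012:"
                 "project/machine-parts/1575000000000"),
    version_name="v1",
    bucket="my-training-bucket",
    manifest_key="manifests/turbochargers.manifest",
    output_prefix="training-output/",
)
print(sorted(req))
```

Because the heavy lifting (algorithm selection, transfer learning, evaluation) happens inside the service, the customer-facing surface is little more than this one request plus a polling call to check training status.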

According to Amazon, customers including VidMob and the National Football League’s NFL Media are already using custom labels to generate metadata tags and provide searchable facets for various content creation teams. “[Custom labels] significantly improves the speed in which the team can search for content and enables them to automatically tag elements that previously required manual efforts,” an Amazon spokesperson told VentureBeat via email.

“AWS customers can now easily train high-quality custom vision models with a reasonably small set of labeled images,” wrote Rekognition senior product manager Anushri Mainthia in a blog post. “Doing this requires no ML experience, and with only a few lines of code customers can access Amazon Rekognition’s easy-to-use fully managed Custom Labels API that can process tens of thousands of images stored in Amazon S3 in an hour.”
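The “few lines of code” the blog post refers to amount, in practice, to a single SDK call per image against the trained model. As a minimal, hypothetical sketch (the ARN, bucket, and key below are invented, and the real call requires an AWS account and the boto3 SDK), the request for analyzing one S3-hosted image looks roughly like:

```python
def build_detect_request(project_version_arn, bucket, key,
                         min_confidence=50.0):
    """Assemble kwargs for boto3's rekognition.detect_custom_labels()."""
    return {
        "ProjectVersionArn": project_version_arn,  # the trained model
        "Image": {"S3Object": {"Bucket": bucket, "Name": key}},
        "MinConfidence": min_confidence,           # drop weak detections
    }

# Hypothetical placeholders throughout:
req = build_detect_request(
    project_version_arn=("arn:aws:rekognition:us-east-1:123456789012:"
                         "project/machine-parts/version/v1/1575000000001"),
    bucket="my-photos-bucket",
    key="parts/engine-01.jpg",
    min_confidence=70.0,
)

# With boto3 installed and a trained model deployed, the call itself
# would look roughly like:
#   import boto3
#   client = boto3.client("rekognition")
#   resp = client.detect_custom_labels(**req)
#   for label in resp.get("CustomLabels", []):
#       print(label["Name"], label["Confidence"])
```

Batch processing of the “tens of thousands of images stored in Amazon S3” that Mainthia mentions would simply loop this call over the objects in a bucket.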

It’s worth noting that rival Microsoft offers a comparable capability in Azure Custom Vision, which enables developers to build computer vision models tailored to specific domains. Google similarly supports bespoke model training in Cloud AutoML Vision Object Detection, which lets users train object detectors on their own labeled data.

Amazon claims that what sets custom labels apart is its automated machine learning component, which determines the best algorithm and builds a custom model on the customer’s behalf. Still, the feature doesn’t so much advance the state of the art as bring Amazon’s offering up to speed with rivals’. What custom labels’ launch does signal is the degree to which the company considers Rekognition (which debuted in 2016 and can also recognize celebrities, track people moving through a video, and analyze facial attributes) core to the growth of its AI-as-a-service business.

Unfortunately for Amazon, Rekognition is perhaps the most controversial of its AI-driven products. In a summer 2018 test, the American Civil Liberties Union (ACLU) demonstrated that Rekognition falsely matched 28 members of Congress with mugshot photos. In January, MIT researchers published a study finding that Rekognition failed to reliably determine the gender of darker-skinned faces in certain scenarios. And hundreds of Amazon employees have signed open letters protesting the service’s sale to law enforcement.

Amazon, for its part, disputes the ACLU’s claims and the results of the MIT study, and it says it’s working to improve Rekognition’s accuracy by making funding available for research projects and staff through its AWS Machine Learning Research Grants program. Separately, CEO Jeff Bezos said in September that Amazon’s public policy team is actively developing facial recognition regulations.
