AI startup Pymetrics today announced it has open-sourced its tool for detecting bias in algorithms. Available for download on GitHub, Audit AI is designed to determine whether people defined by a specific statistic or trait are favored or disadvantaged by an algorithm at a statistically significant, systematic rate — an adverse impact that often falls on groups underrepresented in the data set.
The new tool can audit a variety of algorithms, including those made to predict whether a person will pay back a loan or to assign a credit score to people with no banking history.
“We’ve crafted it so it can take the output of virtually any machine learning technique,” Pymetrics lead data scientist Lewis Baker told VentureBeat in an interview. “If you can copy a repo [on GitHub], you can use Audit AI.”
Audit AI is designed to detect bias, but removal of any imbalance in an algorithm is up to the creator.
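Pymetrics has not detailed Audit AI's internals here, but a bias check of this kind can be sketched with two common tests: the "four-fifths rule" (flag any group whose pass rate falls below 80% of the best-performing group's) and a two-proportion z-test for statistical significance. The function name, thresholds, and group labels below are illustrative assumptions, not Audit AI's actual API:

```python
import math

def adverse_impact(pass_counts, totals):
    """Compare each group's pass rate against the highest-passing group.

    pass_counts / totals: dicts mapping group name -> number passed / number
    evaluated, e.g. {"group_a": 80, "group_b": 48} and {"group_a": 100, ...}.
    """
    rates = {g: pass_counts[g] / totals[g] for g in totals}
    ref = max(rates, key=rates.get)  # highest-passing group as the baseline
    report = {}
    for g, rate in rates.items():
        if g == ref:
            continue
        ratio = rate / rates[ref]  # four-fifths rule: suspect if < 0.8
        # Two-proportion z-test: is the gap in pass rates significant?
        n_ref, n_g = totals[ref], totals[g]
        pooled = (pass_counts[ref] + pass_counts[g]) / (n_ref + n_g)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n_ref + 1 / n_g))
        z = (rates[ref] - rate) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
        report[g] = {
            "ratio": ratio,
            "p_value": p_value,
            "flagged": ratio < 0.8 or p_value < 0.05,
        }
    return report
```

On a hypothetical screen where group A passes 80 of 100 applicants and group B passes 48 of 100, the ratio is 0.6 and the gap is highly significant, so group B would be flagged; removing the imbalance would still be up to the algorithm's creator.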
Audit AI was first built for internal use by Pymetrics to identify bias in custom-built algorithms the company creates for customers to determine whether a person is a good fit for a job.
To create those algorithms, Pymetrics invites the top performers in a specific role at a company to play a series of games measuring roughly 90 behavioral, emotional, and cognitive traits, such as a person's willingness to take risks or ability to solve challenges, multitask, and stay focused amid distractions.
Once that algorithm is created, it can be used to compare the performance of job applicants with top performers within a company. But in the course of its work, Pymetrics found that top performers at some companies can be overly represented by a single, homogeneous demographic group.
Pymetrics, whose customers include more than 60 companies such as Accenture, LinkedIn, Tesla, and Unilever, has job applicants play cognitive games rather than answer direct questions, because questions like "Do you take risks?" tend to be answered differently by men and women.
“We look at what traits make that population [top performers] unique, and sometimes those traits might be predictive not of job performance but the homogeneity of the people who went through it,” Baker said. “And so we use Audit AI to make sure that we don’t overweight any traits that are actually more predictive of a certain demographic group.”
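One way to express the check Baker describes is to compare, for each trait, how strongly it correlates with the performance score versus with a demographic label, and flag traits that track demographics more closely than performance. This is a minimal sketch of that idea, not Pymetrics' actual method; the function names and the point-biserial-via-Pearson shortcut are assumptions:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def suspect_traits(traits, performance, group):
    """Flag traits more correlated with a 0/1 demographic label than with
    the performance score.

    traits: dict mapping trait name -> per-person scores
    performance: per-person performance scores
    group: per-person 0/1 demographic labels
    """
    flagged = []
    for name, values in traits.items():
        r_perf = abs(pearson(values, performance))
        r_group = abs(pearson(values, group))
        if r_group > r_perf:
            flagged.append(name)
    return flagged
```

A trait that perfectly separates the two demographic groups but only loosely tracks performance would be flagged and could then be down-weighted before the model is used on applicants.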
Pymetrics chose to open-source Audit AI, product lead Priyanka Jain told VentureBeat, to help others using machine learning and algorithms.
“As creators of technology, we feel really strongly it’s our responsibility to build AI that is creating a future that we all want to live in, and if we have a way to help other creators of technology continue to build that future as well, it’s our responsibility to share it,” she said.
The release of Audit AI follows Microsoft’s announcement last week that it’s working on a tool for bias detection in algorithms.