During a summer internship with a large Kansas City corporation, college student Louis Byrd was unexpectedly called into the HR office.
Although the incident occurred about 10 years ago, it has never left his mind, Byrd said.
“I’ll never forget this,” he said. “HR told me that the people on my team had complained that my hair, which was in dreadlocks at the time, was inappropriate for the workplace. … I responded, ‘I don’t say anything when people come to work with a wrinkled shirt or wet hair. Why should my hair — something that is natural to me — be a problem?’”
We all have unconscious bias, Byrd said. Unfortunately, the majority of corporate America’s biases are rooted in making white people feel more comfortable, he added.
“There’s just no other way to put it,” said Byrd, CEO of Awari, a new software-as-a-service platform that identifies workplace biases.
“As a black man, I always felt that I had to step up my game in a corporate environment,” he continued. “When the dress code was polo and slacks, I would wear a shirt and tie. My story is common. This is an issue that black people and people of color tend to go through in the workplace. But if it wasn’t for my experience in the corporate world, I wouldn’t be where I am today.”
In the years since that college internship, conversations about race, bias and privilege have become more common, Byrd said.
And now, he said, technology can objectively identify specific types of workplace bias — and help prevent situations like the one he faced.
He knows this because he created it, he said.
Byrd co-founded Awari, launched in April: a software-as-a-service platform that uses proprietary artificial intelligence and natural language processing to identify age, race and gender bias in the workplace.
“Companies can directly upload their HR records and performance reviews,” Byrd said. “The application can cross reference for a pattern within the documents and then identify a person or team’s specific bias, whether that be preferential or negative.”
The technology requires a sample size of about 150 performance reviews for the sentiment analysis to be accurate, he said. Preferential bias toward a group is just as harmful as holding a negative bias, Byrd said.
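The cross-referencing Byrd describes can be pictured, in very simplified form, as sentiment scoring of review text followed by a comparison of average scores across groups. Awari's actual model is proprietary; the tiny lexicon, group labels, and sample reviews below are purely illustrative assumptions, not the company's method or data.

```python
# Illustrative sketch only: score review text with a toy word lexicon,
# then compare mean sentiment per group to surface a scoring gap.
from statistics import mean

# Hypothetical sentiment lexicon (a real system would use a trained model).
POSITIVE = {"excellent", "strong", "reliable", "leader", "creative"}
NEGATIVE = {"abrasive", "difficult", "unprofessional", "weak", "late"}

def sentiment(review: str) -> int:
    """Count positive words minus negative words in a review."""
    words = review.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def group_means(reviews):
    """reviews: list of (group_label, review_text) pairs.
    Returns the mean sentiment score for each group."""
    scores = {}
    for group, text in reviews:
        scores.setdefault(group, []).append(sentiment(text))
    return {g: mean(s) for g, s in scores.items()}

# Fabricated sample reviews for illustration.
sample = [
    ("group_a", "excellent work and a strong leader"),
    ("group_a", "reliable and creative contributor"),
    ("group_b", "strong results but abrasive in meetings"),
    ("group_b", "reliable yet difficult to work with"),
]
print(group_means(sample))
```

A persistent gap between group averages — in either direction, since Byrd notes preferential bias is also harmful — is the kind of pattern such a tool would flag, though a production system would need far more reviews (Awari cites roughly 150) and a real language model rather than a word list.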
“The goal is to get people in neutral,” he said. “Awari is literally a proverbial flashlight. It doesn’t necessarily mean that a person is a racist or a sexist or anything like that. All it’s doing is showing when someone is exhibiting bias. … I don’t believe it’s possible to eliminate bias individually — but you can from a system, and that’s the goal.”
Most companies’ diversity and inclusion efforts are focused on recruiting when retaining is the true issue, Byrd said.
“I was doing a lot of research and learned how performance reviews can really impact a person’s career,” he said. “Seventy percent of Americans quit their jobs not because they aren’t being paid enough or because they don’t like what they do, but because they don’t feel appreciated. This is often a cultural or unconscious bias problem.”
The goal of Awari is to create an office culture in which diverse groups feel comfortable, Byrd said. The platform is expected to launch to the market by the end of the year.
“Our main focus is going to be finding believers,” he said. “This application is for companies that want to be proactive, not reactive. That’s just the reality. A lot of companies say they are working on diversity, but really it’s just a catchphrase or a buzzword for them. To truly work on diversity, you first need transparency.”
In growing the company, one challenge Byrd has faced is companies’ fear that their unconscious bias data might be leaked, casting their businesses in a negative light. Byrd said the technology is secure, adding that Awari is for forward-thinking companies open to improving their cultures.
“Some feedback we’ve received is that companies don’t want this data,” Byrd said. “They’ve said it’s better for a company to live in ignorance and bliss. But the bias exists, whether you want to face it or not.”
In addition to providing data, Awari plans to also offer diversity consulting for clients who want to implement intentional changes.
“Many companies currently take a wide-blanketed approach to unconscious bias training and try and hit every single aspect of it in a very short time frame,” Byrd said. “That’s why it’s not really sticking. You have to be intentional and specific about the types of training you put into place.”
Byrd is confident that Awari targets a need in the marketplace. He is, however, less assured that Kansas City firms will be willing to jump on board with the technology, he said.
“If I had to be honest with you, Kansas City has this whole ‘Kansas City nice’ mentality that is a lot of talk and less about progress,” Byrd said. “In my seven years of branding consulting experience, I have had better luck finding clients outside of Kansas City. … I tried to start with a nucleus here and then build out, but the opposite approach has worked for me. For whatever reason, I can’t figure out how to break through the Kansas City business community.”
This perspective might be because of unconscious bias of his own, Byrd admitted. Regardless, he challenges Kansas City to do better.
“My advice to the people of Kansas City is to be willing to work with people outside the comfort zone of their network,” Byrd said.
This story originally appeared on Startlandnews.com. Copyright 2017