This week, VentureBeat published a collection of predictions about where machine learning is heading in 2020 from industry leaders like PyTorch creator Soumith Chintala, IBM Research director Dario Gil, Nvidia machine learning research director Anima Anandkumar, and Google AI chief Jeff Dean.
Each expert shared insights about subfields they think will make strides in the year ahead, like multitask learning and semi-supervised learning, and everyone seemed to agree that Transformer indeed transformed natural language AI in 2019. But they also coalesced around their shared hope that the AI field will continue to change for the better in 2020.
One person who spoke at length with VentureBeat about how the AI field can evolve in the year ahead is Celeste Kidd, director of the Kidd Lab at the University of California, Berkeley. She told us she hopes neural networks lose their reputation for being black boxes, and that more people in machine learning develop realistic opinions of what babies can learn compared to neural nets. But she also talked about the lack of women in machine learning — and the sexual harassment.
Kidd was among the women named Time Person of the Year in 2017 for their role in the #MeToo movement, and last month she gave the opening keynote address at NeurIPS, the largest AI research conference in the world.
In her speech, Kidd took a deep dive into what machine learning practitioners should know about the human mind — how people form beliefs and how they can be quickly led to believe falsities when content recommendation AI maximizes for engagement.
She also talked about her own experience with sexual harassment and the need to dispel a myth common among men in machine learning: that being alone with a female colleague can lead to unfounded sexual harassment allegations and the end of their careers. When that fear leads to missed opportunities for women in the field, Kidd said, even well-intentioned people with no desire to inflict harm can end up damaging women's careers.
The speech ended with a standing ovation, which was uncharacteristic for a machine learning research conference.
Misperceptions held by men in machine learning are not something anyone expects to hear addressed in a NeurIPS keynote, she said, but she felt she had to raise the subject given the opportunity to speak with so many people who are directly responsible for decisions made at their companies or for training female students at universities.
In 2018, analysis by Element AI found that the number of women who authored papers published at major AI research conferences like NeurIPS remains below 20%, while a 2019 Nesta report on gender diversity in AI found that less than 30% of AI research published on arXiv in the U.S. had a female author. Some countries, like the Netherlands, surpass 40% female authors, but no nation achieves gender parity.
Bringing more women into machine learning research requires taking sexual harassment seriously and exposing predators, Kidd said, because she believes that contributes to the leaky tech pipeline. She also stressed that for the average person, it’s not a single dramatic event, but more often a parade of seemingly small events — what she called “death by 1,000 paper cuts” — that push women out of the field.
The day after she gave her speech, Kidd said she tried without success to reach the poster session at the conference but kept being stopped by men and women — men offering thanks for her calling this fear misguided, and women sharing stories about social events with peers.
“You learn just as much from your peers, if not more than you do from your mentors,” she said. “So when you have a lab treating a woman as otherly, if you’re not treating her the same, she doesn’t get the same access to all of the informal training opportunities that exist, all the opportunities that everybody else in the lab has for learning from their peers.”
Inviting women to be part of social outings and dispelling the misperception that mentoring women will lead to sexual harassment allegations and the end of men's careers is important. But getting rid of serial predators, Kidd believes, is critical to achieving parity and closing the AI research gender gap. Resisting the Pence Rule — the idea that men should avoid being alone with a female colleague unless their wife or others are present — could also help.
“If you set up a rule like ‘I’m only going to meet with women with the door open’ [or] ‘I’m only going to meet with women when there’s somebody else present,’ you’re introducing a systemic inequity that means that she doesn’t get as much access to your mentorship as somebody that doesn’t have to have those particular circumstances in place,” Kidd said.
One thing that stood out from the interview: It’s not just an individual who loses out when a woman is pushed out of the industry or a persistent gender gap emerges in machine learning research. It’s a loss for the machine learning industry as well. And as AI spreads to all corners of business and society, that means everyone loses.
Thanks for reading,
Senior AI Staff Writer
VentureBeat