

What happens when you don’t know why a smart system made a specific decision? AI’s infamous black-box problem is a big one, whether you’re an engineer debugging a system or a person wondering why a facial-recognition unlock feature doesn’t perform as accurately on you as it does on others.

In this episode of The AI Show, we talk about engineering knowability into smart systems. Our guest, Nell Watson, chairs the Ethics Certification Program for AI Systems for the IEEE Standards Association and serves as vice-chair of the Transparency of Autonomous Systems working group. She is also on the AI faculty at Singularity University, an X-Prize judge, and the founder of AI startup QuantaCorp.
