In this age of machine-learning-artificial-intelligence-driven blah blah blah, the folks at YouTube have decided that to win the battle against violent and racist content they must rely more on good old-fashioned human beings.
In a pair of blog posts today, the company elaborated on its strategy for stemming the rising tide of unsavory video content that has turned services such as YouTube, Facebook, and Twitter into bottomless cesspools of fake news, terrorist propaganda, and Nazi-fueled rage.
Over the summer, YouTube trumpeted investments in machine learning designed to find content that violates the company’s terms of service. That effort will certainly continue.
But YouTube CEO Susan Wojcicki wrote that the machine learning tools will now be complemented by expanded use of carbon-based lifeforms.
“We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018,” Wojcicki wrote. “At the same time, we are expanding the network of academics, industry groups, and subject matter experts who we can learn from and support to help us better understand emerging issues.”
She applauded the efforts of YouTube’s bipeds in recent months, as the platform has come under greater scrutiny and heavy criticism.
“Human reviewers remain essential to both removing content and training machine learning systems, because human judgment is critical to making contextualized decisions on content,” she wrote. “Since June, our trust and safety teams have manually reviewed nearly 2 million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future.”
In addition to algorithms and Homo sapiens, YouTube is introducing new comment moderation tools. The company will also be more aggressive in shutting down some comments altogether. And YouTube is rewriting its advertising tools to more tightly restrict which content on the platform is eligible for its advertising programs, she said.
Overall, the goal is for machine learning to work in tandem with people, she noted, with the former surfacing dubious content for rapid review.
“We are taking these actions because it’s the right thing to do,” she wrote. “Creators make incredible content that builds global fan bases. Fans come to YouTube to watch, share, and engage with this content. Advertisers, who want to reach those people, fund this creator economy. Each of these groups is essential to YouTube’s creative ecosystem — none can thrive on YouTube without the other — and all three deserve our best efforts.”