As the U.S. presidential election season swings into high gear, the amount of synthetic media targeting politicians is soaring. According to a new report from brand protection startup Creopoint, the number of manipulated videos shared online grew 20 times over the past year.
While celebrities and executives continue to be targets of deepfakes and other altered media, the company said 60% of doctored videos it found on its platform took aim at politicians. The videos ranged from goofy content, such as a deepfake that placed U.S. President Donald Trump in a scene from the movie Independence Day, to more insidious content designed to make former Vice President Joe Biden appear to be disoriented.
Because these videos use a wide range of techniques, from AI-driven deepfakes to more basic selective editing, they can be hard for platforms like YouTube, TikTok, and Twitter to detect, even as these companies develop more powerful AI to scour content. For that reason, Creopoint CEO Jean-Claude Goldenstein said he expects the current presidential election to be remembered as the “fake-video election.”
“There is a lot more than you think,” Goldenstein said. “And it’s alarming.”
In recent months, some social media companies have taken more public steps to identify, and in some cases remove, doctored videos. Twitter, for instance, placed the label “manipulated” on a video shared by President Trump and another shared by his social media team.
But the surge of manipulated videos continues to overwhelm social media platforms, Goldenstein said. While companies such as Google, Facebook, and Twitter say they are investing in AI and machine learning to combat this issue at scale, Goldenstein believes such algorithmic approaches are doomed to fail.
He argued that detection algorithms cannot be fed enough training data quickly enough to keep pace with new fakes. In part, that's because, at the high end, the tools for creating deepfakes are advancing rapidly and becoming widely available. But the number of ways people manipulate videos is also expanding, adding further challenges.
Synthetic media includes such simple tricks as relabeling videos to give them a more sinister tone or changing the playback speed to make the subject appear slow-witted or disoriented. Not only do these videos aim to discredit or humiliate their subjects, but they also undermine people's trust in video itself. Goldenstein pointed to the story of a GOP congressional candidate who published a report insisting the video of Minneapolis police killing George Floyd was a deepfake.
Goldenstein does think AI can play a role in the fight against fake news, albeit with limitations. The report draws on the company's own work, which involves helping executives and brands protect their reputations by monitoring online content through text mining and other tools. The company also holds a patent for a system to "contain the spread of doctored political videos."
Creopoint uses AI to find domain experts in countless fields and add them to a database. When it finds videos that have been potentially manipulated, it signals relevant members of this network, who act like a SWAT team to review and identify possible manipulations. Goldenstein argues that making better use of human expertise is a critical part of augmenting the work being done by AI and moderators on the various platforms.
“I’m concerned about what’s about to hit us in the coming weeks,” he said. “The technology to make these videos is growing much faster than the solutions.”