YouTube has long been criticized for the opaque nature of the Google-owned platform’s algorithms. While we don’t really understand how it all works, we do know that it relies on a real-time feedback loop to suggest new videos to watch, with the suggestions differing for each viewer based on their individual viewing activity.

Studies have suggested YouTube plays a big role in fueling conspiracy theories, spreading misinformation, and “radicalizing” individuals, although the company has apparently made some efforts to muzzle the monster it created. But the upcoming U.S. presidential election offers an opportunity to investigate how ideas spread online, which is why Mozilla today lifted the lid on a new research initiative designed to illuminate how YouTube algorithms actually work.

Regrets

With RegretsReporter, a browser add-on for Firefox and Chrome, Mozilla is inviting users to submit reports whenever they are recommended “harmful” videos. Once installed, the add-on places a “frowning” RegretsReporter icon in the browser bar; clicking it works only when a YouTube video is active on screen.

Above: RegretsReporter add-on

Clicking the icon will surface a video-viewing history from that session and allow users to hit a “report video and history” button.

Above: RegretsReporter: Report a video

Before submitting the report, users are asked to provide information about what they found regrettable about the video.

Above: RegretsReporter form

Mozilla said it hopes to gain insights into the frequency and severity of harmful content recommendations and the ways viewer habits contribute to the pattern.

Value

Of course, RegretsReporter’s value will depend on the quality and quantity of data it garners. According to Mozilla, each report will include the user’s YouTube browsing history for up to five hours before the report is submitted — including which YouTube pages were visited and how they were reached. Users can manually delete from their history specific videos they don’t want to submit, though this would presumably compromise the integrity of the data.

It’s pretty clear YouTube recommendations are based on viewing patterns dating back much further than five hours, which raises questions about how insightful this research is likely to be — even if it does manage to achieve meaningful scale in terms of installations. Mozilla acknowledges this shortcoming but believes the data is valuable enough to work with.

“Many of our research questions can be studied without knowledge of a user’s full viewing data,” Mozilla senior campaigner Brandi Geurkink told VentureBeat. “For example, are there identifiable patterns in terms of frequency or severity of reported content? And does the report frequency increase for users after they send their first report? That said, we know that the insights we glean won’t be comprehensive. We aim to identify areas where more examination is needed and then build momentum to enable that deeper examination.”

There is also the thorny issue of data privacy. In addition to the data it collects with every form submission, Mozilla’s privacy T&Cs note that the add-on will gather “periodic, aggregated information” about a user’s YouTube usage, such as how often they visit YouTube and for how long — but will not track what they watch or search for on the platform.

So it comes down to how motivated people are to shine a light on YouTube’s murky AI smarts, and how much data they are willing to share.

Campaigns

RegretsReporter is part of Mozilla’s ongoing campaign against YouTube. This effort has included research and recommendations as part of Mozilla’s inaugural “YouTube regrets” campaign last year. Among Mozilla’s suggestions was the idea that YouTube should open up more of its data and commit to working with independent researchers.

“We were shocked by the number of responses to our initial YouTube Regrets campaign, which indicates that this is a widespread problem,” Geurkink added. “We hope that this will translate into thousands of downloads of the extension.”

Based on its findings, Mozilla said it may work with other researchers, journalists, policymakers, and “even engineers within YouTube” on solutions to YouTube’s algorithm problem. And Mozilla plans to share the results of its research with the public — though it has not offered a timeline for publication.

“It’s hard to say [when Mozilla will publish results] without first an understanding of how users will use the extension and an average rate of regrets,” Geurkink continued. “But we will be performing regular analysis from the time that we launch, rather than waiting a fixed period of time. So we plan to share findings soon after we uncover them.”

A Google spokesperson responded to Mozilla’s latest campaign, stating that the goal of its recommendation system is to “connect users with content they love on any given day” and that it regularly pushes out updates to reduce the visibility of “borderline content.” The spokesperson also confirmed that Google is already “exploring options” to work with external researchers.

“We are always interested to see research on these systems and exploring ways to partner with researchers even more closely,” the spokesperson said. “However, it’s hard to draw broad conclusions from anecdotal examples, and we update our recommendations systems on an ongoing basis to improve the experience for users. For example, over the past year alone, we’ve launched over 30 different changes to reduce recommendations of borderline content. Thanks to this change, watch time this type of content gets from non-subscribed recommendations has dropped by over 70% in the U.S.”

Update 7:50 a.m. Pacific: Included a comment from Google.