YouTube’s AI serves up most videos that viewers wish they hadn’t seen, study says

YouTube has more than 2 billion monthly visitors, and more than 1 billion hours of video are watched there every day.

Angela Lang/CNET

YouTube’s almighty recommendations surfaced most of the videos that a crowdsourced army of volunteers said they regretted watching, according to a study released Wednesday by Mozilla based on “regret” reports from YouTube users. Of the videos people said they regretted, 71% were recommended by YouTube’s artificial intelligence. YouTube also recommended videos that were later pulled down for breaking its own rules, and people in countries where English isn’t the main language reported regrettable videos at a higher rate than people in English-speaking countries, the report said.

YouTube said its own surveys find users are satisfied with its recommendations, which generally direct people to authoritative or popular videos. The company added that it can’t properly assess Mozilla’s definition of “regrettable” or the validity of its data, and it noted that it works constantly to improve its recommendations, including 30 changes in the past year to reduce recommendations of harmful videos.

Google’s massive video site is the world’s biggest source of online video. It reaches more than 2 billion viewers every month, and people watch more than 1 billion hours of video there every day. For years, YouTube has touted its algorithmic recommendations as driving more than 70% of the time people spend watching YouTube. But Mozilla’s report provides a peek into some of those recommendations’ possible shortcomings.

About 9% of recommended “regrettable” videos — a total of 189 videos in this study — were later taken down from YouTube. Videos can be removed for a variety of reasons, including breaking rules against offensive or dangerous content or infringing copyrights. Sometimes the person who posted the video takes it down. But the study confirmed YouTube removed some videos for violating its policies after having recommended them.

“That is just bizarre,” Brandi Geurkink, Mozilla’s senior manager of advocacy and coauthor of the study, said in an interview Tuesday. “The recommendation algorithm was actually working against their own like abilities to…police the platform.”

YouTube — like Facebook, Twitter, Reddit and many other internet companies that give users a platform to post their own content — has wrestled with how to balance freedom of expression with effectively policing offensive or dangerous material posted there. Over the years, YouTube has grappled with misinformation, conspiracy theories, discrimination, hate and harassment, videos of mass murder, and child abuse and exploitation, all at an unprecedented global scale.

Mozilla’s study, for example, found that YouTube videos containing misinformation were the most frequently reported as regrettable. And the rate of regretted videos was 60% higher in countries where English isn’t a primary language, particularly Brazil, Germany and France.

The study is based on voluntary reports sent through a special RegretsReporter extension that Mozilla developed for the Chrome and Firefox web browsers. Tens of thousands of people downloaded the extension, and 1,662 submitted at least one report on a YouTube video they regretted watching, for a total of 3,362 reports from 91 countries between July 2020 and May 2021.
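RegretsReporter is built on the standard WebExtension APIs that Chrome and Firefox share. As a rough sketch only (this is not Mozilla’s actual code; the endpoint URL and report fields below are invented for illustration), such an extension can pair a script running on the YouTube page with a background script that forwards each report:

```ts
// Hypothetical sketch of a regret-reporting WebExtension.
// Not Mozilla's RegretsReporter code; the endpoint and field
// names here are invented for illustration.

// Shape of one report (invented fields).
interface RegretReport {
  videoUrl: string;   // URL of the regretted video
  videoTitle: string; // page title at the time of the report
  reportedAt: string; // ISO 8601 timestamp
}

// Content script: runs on youtube.com pages and sends a report
// to the background script when the user triggers it.
function reportCurrentVideo(): void {
  const report: RegretReport = {
    videoUrl: location.href,
    videoTitle: document.title,
    reportedAt: new Date().toISOString(),
  };
  chrome.runtime.sendMessage(report);
}

// Background script: receives reports and forwards them to a
// hypothetical collection endpoint over HTTPS.
const COLLECTOR_URL = "https://collector.example.org/reports";

chrome.runtime.onMessage.addListener(
  (report: RegretReport, _sender, sendResponse) => {
    fetch(COLLECTOR_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(report),
    })
      .then(() => sendResponse({ ok: true }))
      .catch(() => sendResponse({ ok: false }));
    return true; // keep the message channel open for the async reply
  }
);
```

In a real extension, the two scripts would live in separate files declared in manifest.json; Firefox exposes the same calls under its browser namespace.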

The study has several limitations. The people reporting these regrettable videos aren’t a random sample — they’re volunteers whose willingness to participate may mean they’re not representative of YouTube’s audience as a whole. The report acknowledges that limitation, as well as the fact that many factors may affect whether a volunteer reports a particular video and that the concept of a regrettable video may differ among volunteers. 

The study is also based solely on regret reports filed through desktop browser extensions, which excludes any viewing on mobile devices or connected TVs. Mobile devices alone account for more than 70% of time spent watching YouTube.

What’s next

Mozilla’s report makes several recommendations for YouTube, for lawmakers and for people like you. 

For individual YouTube viewers, Mozilla recommended you check your data settings for YouTube and Google and consider reviewing your “watch” and “search” history to edit out anything that you don’t want influencing your recommendations.

YouTube and other platforms should set up independent audits of their recommendation systems, Mozilla said. It also called for more transparency about borderline content and greater user control over how personal data contributes to recommendations, including the ability to opt out of personalization.

YouTube said it welcomes more research and is exploring options to bring in external researchers to study its systems. 

Mozilla also recommended that policymakers require YouTube and others to release information and create tools so researchers can scrutinize their recommendation algorithms. And regulations should ensure platforms account for the risks of designing and running automated systems that amplify content at scale.

Mozilla is a software company best known for the Firefox web browser. Google, YouTube’s parent company, is also one of Mozilla’s biggest sources of revenue, through royalties Google pays to make its search engine the default in Firefox in many regions of the world.
