
YouTube says viewers are spending less time watching conspiracy theory videos. But many still do.

Video giant says users spent 70 percent less time watching videos pushing miracle cures.

December 3, 2019 at 12:00 p.m. EST

YouTube said Tuesday that its policies and enforcement have reduced the amount of time viewers spend watching videos that advance conspiracies and other debunked theories, as the leading video site responds to criticism that it has failed to police such content.

The Google-owned company said it had cut by 70 percent the average time U.S. viewers spend watching videos it deems “borderline” content, such as those peddling miracle medical cures or flat-earth conspiracy theories. The announcement follows a change to YouTube’s algorithm, announced in January, that sought to limit how often its software recommends videos espousing fringe views.

But in a blog post Tuesday, the company didn’t release the underlying figures, such as how much time viewers still spend watching those videos. Nor did it say whether it had reduced the number of times the videos are clicked on in the first place, or provide global figures.


Many viewers of such content subscribe to channels that regularly peddle it. Ivy Choi, a Google spokeswoman, declined to comment beyond the blog post.

“There will always be content on YouTube that brushes up against our policies, but doesn’t quite cross the line,” YouTube said in the blog post. “We’ve been working to raise authoritative voices on YouTube and reduce the spread of borderline content and harmful misinformation.”

As part of those efforts, YouTube said it was pushing users toward videos from more-reliable news sources, pointing to Fox News and the Brazilian radio outfit Jovem Pan as examples. The company said that for searches on ongoing news events such as Brexit, 93 percent of the top 10 recommended videos come from creators YouTube deems “high-authority.” It didn’t disclose the sample size behind that figure, how many people click on the top videos for a given search, or in what order.


YouTube has historically given wide latitude to creators in the name of free speech, although it is legally permitted to prohibit whatever content it wishes. It does not permit hate speech but defines that narrowly as content that promotes violence or hatred of vulnerable groups.

In a “60 Minutes” interview aired Sunday, YouTube chief executive Susan Wojcicki cited videos suggesting people should discriminate in hiring on the basis of race as an example of content warranting removal, but said videos that simply espouse racial superiority would be allowed.

Silicon Valley firms have been struggling with how to police their sites, particularly as the U.S. presidential election heats up. Twitter, for example, has banned all political advertisements, while Facebook allows political ads that may contain false or misleading claims. YouTube hasn’t taken a clear position on the issue, but Wojcicki said the company had removed some ads related to President Trump.


Congress could help untangle the thicket of varying policies by passing laws that require transparency from YouTube and other tech firms, said Jeffrey Chester, executive director of the Center for Digital Democracy, a digital rights organization. “You can’t put the future of democracy into the hands of companies that are dependent on advertisers for their business,” he said.

To help direct viewers to more-reliable information, YouTube said it has been showing users snippets of text news articles that it verifies as accurate, particularly following breaking news events, or displaying “information panels” that provide additional context. That type of information will appear to viewers watching videos pushing people to eschew vaccines, according to the blog post.

YouTube said it relies on a number of factors to determine reliability, including how long a given video is watched, how many times it is clicked on, and its likes and dislikes. It also relies on about 10,000 contract workers around the world who review content; their decisions help train its software to automate the process.

Some of those workers have complained that the company doesn’t always listen when they flag content, saying YouTube applies a double standard.