
Autoplay On: YouTube Tolerated Toxic Content For Engagement, Report Says

A man holds a laptop computer with a YouTube logo on it at YouTube Space LA offices Wednesday, Oct. 21, 2015, in Los Angeles. (Danny Moloshok/AP)

With Meghna Chakrabarti

Bloomberg reports that YouTube staffers wanted to curb the spread of toxic and incendiary videos on the site. But executives wouldn’t do it. We look at accusations that YouTube prioritizes engagement over safety.




Guests

Mark Bergen, reporter for Bloomberg Technology covering Google/Alphabet. (@mhbergen)

Guillaume Chaslot, former Google engineer who worked on a YouTube project to compute recommendations. Researcher at the University of East Paris and an adviser at Center for Humane Technology. (@gchaslot)


More than 1 billion hours. That's how long people spend watching YouTube on a daily basis.

Of those 1 billion-plus hours, 700 million are spent watching videos that YouTube's algorithm recommends.

So what kinds of videos is YouTube recommending?

Use the tool AlgoTransparency to get a better idea.

From The Reading List

Bloomberg: "YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant" — "A year ago, Susan Wojcicki was on stage to defend YouTube. Her company, hammered for months for fueling falsehoods online, was reeling from another flare-up involving a conspiracy theory video about the Parkland, Florida high school shooting that suggested the victims were 'crisis actors.'

"Wojcicki, YouTube’s chief executive officer, is a reluctant public ambassador, but she was in Austin at the South by Southwest conference to unveil a solution that she hoped would help quell conspiracy theories: a tiny text box from websites like Wikipedia that would sit below videos that questioned well-established facts like the moon landing and link viewers to the truth.

"Wojcicki’s media behemoth, bent on overtaking television, is estimated to rake in sales of more than $16 billion a year. But on that day, Wojcicki compared her video site to a different kind of institution. 'We’re really more like a library,' she said, staking out a familiar position as a defender of free speech. 'There have always been controversies, if you look back at libraries.'

"Since Wojcicki took the stage, prominent conspiracy theories on the platform—including one on child vaccinations; another tying Hillary Clinton to a Satanic cult—have drawn the ire of lawmakers eager to regulate technology companies. And YouTube is, a year later, even more associated with the darker parts of the web.

"The conundrum isn’t just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive 'library,' generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread."

The Guardian: "How an ex-YouTube insider investigated its secret algorithm" — "YouTube’s recommendation system draws on techniques in machine learning to decide which videos are auto-played or appear 'up next.' The precise formula it uses, however, is kept secret. Aggregate data revealing which YouTube videos are heavily promoted by the algorithm, or how many views individual videos receive from 'up next' suggestions, is also withheld from the public.

"Disclosing that data would enable academic institutions, fact-checkers and regulators (as well as journalists) to assess the type of content YouTube is most likely to promote. By keeping the algorithm and its results under wraps, YouTube ensures that any patterns that indicate unintended biases or distortions associated with its algorithm are concealed from public view.

"By putting a wall around its data, YouTube, which is owned by Google, protects itself from scrutiny. The computer program written by Guillaume Chaslot overcomes that obstacle to force some degree of transparency."

The Verge: "YouTube reportedly discouraged employees from reporting fake, toxic videos" — "For years, YouTube ignored its employees’ pleas to address and take down toxic videos in a quest to boost engagement, reports Bloomberg. According to more than 20 former and current YouTube staffers, employees would offer proposals to curb the spread of videos that contained disturbing, extremist content and/or conspiracy theories, but leadership was reportedly more interested in boosting engagement than heeding those warnings.

"One proposal offered a way to keep content that was “close to the line” of violating policies on the platform but remove it from the recommended tab. YouTube rejected that suggestion in 2016, a former engineer said, and instead continued to recommend videos regardless of how controversial they were. According to employees, the internal goal was to reach 1 billion hours of views a day.

"'I can say with a lot of confidence that they were deeply wrong,' the engineer told Bloomberg. (YouTube implemented a policy in January 2019 that’s similar to what he initially suggested.)"

Stefano Kotsonis produced this hour for broadcast.

This program aired on April 8, 2019.
