YouTube’s battle against Covid-19 misinformation is causing collateral damage as the world’s largest online video service struggles to pick up on nuances of an increasingly complex and political topic.
Since January, California pulmonologist Roger Seheult has posted regular medical lectures about the novel coronavirus on his YouTube channel, MedCram. His audience jumped to more than 700,000 subscribers. But as the virus spread in the spring, YouTube deleted five of the MedCram clips, including two about the controversial drug hydroxychloroquine and one about remdesivir, an experimental Covid-19 treatment developed by Gilead Sciences Inc.
Seheult appealed the decisions. Viewer complaints flooded in; one fan started a petition demanding that YouTube stop "censoring" the footage. To Seheult, YouTube's new rules seemed to be applied without explanation. "It's like you're in a hockey game," he said. "And you keep getting called for penalties, but you don't know what the penalties are."
According to YouTube, the referee made a mistake.
“With the massive volume of videos on our site, sometimes we make the wrong call,” a YouTube spokesman said on Wednesday after Bloomberg News reached out for comment. “When it’s brought to our attention that a video has been removed mistakenly, we act quickly to reinstate it.” The five MedCram videos are back on the site now.
The incident is another flash point in the debate over the role of internet gatekeepers like YouTube and its parent, Alphabet Inc.'s Google. On Wednesday, U.S. President Donald Trump railed against social-media companies after Twitter Inc. fact-checked some of his tweets. He is also preparing an executive order that could limit liability protections for the largest user-generated networks: Twitter, Facebook Inc. and YouTube.
At the same time, Democratic lawmakers have chastised YouTube for not removing conspiracy theories about the pandemic swiftly enough. In a letter to House Intelligence Chairman Adam Schiff, released on Thursday, YouTube Chief Executive Officer Susan Wojcicki wrote that the company's automated systems detected a majority of "dangerous or misleading" videos about the virus. "The complex nature of misinformation online presents a number of challenges for platforms such as YouTube, and I welcome your suggestions as to what we can do better," she added.
In addition to this political pressure, YouTube has been inundated with virus videos, even as the pandemic has disrupted the contract workforce that handles its content moderation.
“YouTube has a really tough job,” said Kyle Allred, MedCram’s co-founder and producer. “But the reality is: YouTube’s the second biggest search engine in the world. If our videos aren’t on YouTube, we don’t have the benefit of reaching as many people.”
During the pandemic, YouTube has moderated virus videos aggressively after years of a more laissez-faire approach. The company has pulled thousands of clips for promoting misleading information or advice that conflicts with public health agencies. To filter footage, YouTube relies on viewer flagging, automated software and legions of human moderators.
Even at the best of times, the system can be heavy-handed or too lenient. In March, the company said it had to limit its use of human moderators because of remote-work constraints, and would lean more on machines to make decisions.
YouTube’s challenge has grown even harder as medical videos pour onto the site and the debate about the pandemic response evolves from a mostly scientific discussion into a political fight.
Trump has said he took hydroxychloroquine for about two weeks, and he has promoted it as a possible coronavirus therapy, despite an outcry from medical professionals about its unproven efficacy and potential side effects.
At times, YouTube has filtered out sham science, such as videos promoting fake cures for Covid-19. More controversially, it has acted against doctors departing from public health advice. In April, YouTube removed videos by two doctors in Bakersfield, California, who used their YouTube channel to call for an end to social-distancing policies.
Seheult, the pulmonologist, said his videos are nothing like that. Instead, MedCram clips dissect medical studies and early research, known as preprints, related to the virus. Seheult narrates the findings and statistics in low-key footage that, thanks to the pandemic, now draws a large audience.
Seheult appears to have been caught by YouTube's algorithms scanning footage about Covid-19 cures: all five removed videos focused on potential treatments. YouTube says it relies on medical advisers and public agencies for guidance on how to handle videos about health issues. Yet the scientific consensus on some Covid-19 treatments is still taking shape. A study in The Lancet medical journal, released last week, linked hydroxychloroquine with increased risk of death and heart ailments.
The YouTube spokesman declined to say why the MedCram videos were initially removed, beyond noting that it was a mistake.
The team behind MedCram is happy to have its work back on the world's biggest video site, but remains frustrated by the minimal communication from YouTube. "We're grateful to have our website MedCram.com, where we don't have to worry about censorship," said Allred.
©2020 Bloomberg L.P.