A Quarter of the Most Popular Covid-19 YouTube Videos Are Misleading: Study
May 13, 2020

A new study looking at the most-viewed YouTube videos about the coronavirus has found that one quarter of them, together accounting for more than 60 million views, contained misleading information about the virus.
The study, which was conducted by Canadian researchers and published in the journal BMJ Global Health on Thursday, comes as misinformation and conspiracy theories continue to circulate online and emanate from the White House, providing fuel for anti-lockdown protests.
"Our findings are concerning because they show the vast reach of misinformation, which has tremendous potential for harm," lead author Heidi Li wrote in an email. "These harms include destructive behaviours that undermine a successful collective management of this pandemic."
After searching YouTube for "coronavirus" and "Covid-19," the researchers collected the most popular videos and filtered out any that were not in English, ran over an hour, lacked audio or visual content, were duplicates, or were unrelated to the virus. They ended up with 69 videos, together accounting for more than 250 million views. These videos were sorted into categories: consumer (no credentials or institutional affiliations), professional (accredited or affiliated with a non-profit or health organization), entertainment news, network news, internet news (the authors give the example of the YouTube channel Science Insider, run by Insider media), government, newspaper, and educational. Each video was then rated for its factual content.
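The screening step amounts to a simple exclusion filter, and it can be restated in executable form. The Python sketch below is purely illustrative: the Video fields, the title-based duplicate check, and reading "over an hour" as a 60-minute cutoff are assumptions made for the example, not code or data from the study.

```python
from dataclasses import dataclass

@dataclass
class Video:
    # Hypothetical metadata fields for illustration only; the study's
    # screening was not published as code.
    title: str
    language: str           # e.g. "en"
    duration_minutes: float
    has_audio: bool
    has_video: bool
    is_about_covid: bool

def passes_screen(video: Video, seen_titles: set[str]) -> bool:
    """Apply the exclusion criteria the study describes: drop videos
    that are not in English, run over an hour, lack audio or visual
    content, are duplicates, or are unrelated to the virus."""
    if video.language != "en":
        return False
    if video.duration_minutes > 60:        # assumed reading of "over an hour"
        return False
    if not (video.has_audio and video.has_video):
        return False
    if video.title in seen_titles:         # crude duplicate check (assumption)
        return False
    if not video.is_about_covid:
        return False
    seen_titles.add(video.title)
    return True

# Usage: filter a list of candidate videos down to the study sample.
candidates = [
    Video("Covid-19 explained", "en", 12.0, True, True, True),
    Video("Noticias del virus", "es", 8.0, True, True, True),  # excluded: not English
]
seen: set[str] = set()
sample = [v for v in candidates if passes_screen(v, seen)]
print(len(sample))  # -> 1
```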
According to the study, 27.5 percent of the videos, 19 of the 69, contained misleading, non-factual information. The worst offenders were videos in the entertainment news, internet news, and consumer categories. None of the professional or government videos included in the study were found to contain non-factual or misleading information.
The study notes that non-factual videos included "statements consisting of conspiracy theories, non-factual information, inappropriate recommendations inconsistent with current official government and health agency guidelines and discriminating statements." In an email, Li said that videos in the consumer category were more likely than others to contain such remarks.
YouTube did not respond to a request for comment.
Li said the findings highlight the need for YouTube to adjust its algorithm to recommend more factual content, but that public health agencies also have a role to play: they could partner with popular creators, for example, to make more engaging videos that draw viewers' interest.
But the massive scale of social media platforms, paired with lax moderation, means that fighting misinformation is a lopsided battle.
"In an ideal world, social media platforms should take more responsibility for content uploaded," Li wrote. "This is an unrealistic expectation given the billions of users uploading information every second across the globe."