Election misinformation continues staying up on YouTube
If you visited Google’s YouTube in the days after the election last week, you may have found a video of an anchor from One America News Network declaring victory for President Trump with the unsubstantiated claim that Democrats were “tossing Republican ballots.”
Or a clip of Mr. Trump stating on his own YouTube channel that if all the “legal votes” were counted, he would win easily. Or a video claiming that Real Clear Politics, a political news site, had “rescinded” its call that President-elect Joseph R. Biden Jr. was projected to win Pennsylvania.
All of these videos were baseless and spread misinformation challenging the validity of the election’s outcome. Yet those videos are still available on YouTube while being shared widely on Facebook and other social media platforms.
This is not a glitch or an oversight in YouTube’s policy enforcement. Instead, YouTube has said its policies are working as intended. Its light-touch approach differs from Twitter and Facebook, which have cracked down on misinformation about the election and have more prominently labeled content that they deem to be misinformation.
“Disinformation is being spread on YouTube, but they’re not being transparent at all about how they’re dealing with it,” said Lisa Kaplan, the founder of Alethea Group, a company that helps fight election-related misinformation. “YouTube has been able to stay out of the limelight because of the nature of the platform.”
Video is harder to analyze than text, Ms. Kaplan said, and videos are not shared in the same way that Facebook posts and tweets are shared.
Before the election, YouTube said it would not permit misleading election information, but limited that mainly to the procedures around voting — how to vote, who was eligible to vote or be a candidate, or any claims that could discourage voting. The policies did not extend to people expressing views on the outcome of a current election or the process of counting votes — meaning, in effect, that videos spreading misinformation about the vote’s outcome would be permitted.
“The majority of election-related searches are surfacing results from authoritative sources, and we’re reducing the spread of harmful elections-related misinformation,” Andrea Faville, a YouTube spokeswoman, said in a statement. “Like other companies, we are allowing discussions of the election results and the process of counting votes and are continuing to closely monitor new developments.”
The company removes content that violates its policies, she said, especially any content that seeks to incite violence. She declined to say how many videos YouTube had removed.
YouTube’s actions are opaque. Its most powerful tool is an algorithm that has been trained to suppress so-called borderline content — videos that bump up against its rules but don’t clearly violate them — from appearing high in search results or recommendations. But YouTube does not disclose which videos are designated as borderline, so people have to guess whether the company is taking action.
Even if YouTube takes steps to make a video harder to find on its site, that does not prevent users from sharing it widely elsewhere. As a result, many YouTube videos have found new life on Facebook. The video spreading falsehoods about Real Clear Politics rescinding its projection of Mr. Biden winning Pennsylvania had about 1.5 million views on YouTube and had been shared 67,000 times on Facebook as of Tuesday afternoon, according to BuzzSumo, a web analytics tool.
YouTube has marked some videos as ineligible for advertising. For the video from One America News, which ran last Wednesday, YouTube said it removed ads because the content undermined “confidence in elections with demonstrably false information.” That left YouTube in the awkward position of acknowledging the potentially harmful effects of the video while continuing to host it.
YouTube has also labeled all election-related videos. On Saturday, it changed the label from a warning that the election outcome may not be final to a statement that “the AP has called the Presidential race for Joe Biden,” with a link to a Google page with the results. YouTube said it had displayed this information panel “billions of times.”
Source: The NY Times