On Monday, cable outlet One America News Network posted two videos to its YouTube account titled “Trump won.” The clips echoed several others telling viewers, falsely, that U.S. President Donald Trump was re-elected and that the vote was marred by fraud.
YouTube added a label noting that the Associated Press called the election for Joe Biden. But the world's largest online video service didn't block or remove the content. That approach differs from that of Twitter Inc., which has hidden conspiratorial election posts behind warnings.
A few months ago, YouTube released a detailed policy prohibiting manipulated media and voter suppression, but left one gap: Expressing views on the election is OK. The result has been an onslaught of videos aiming to undermine the legitimacy of the election, according to online media and political researchers. Some of this material has spread on other social networks. And several clips, like the two OANN videos on Monday, ran advertisements, profiting from a Google policy that lets content framed as news reporting or talk shows cash in.
“YouTube saw the inevitable writing on the wall that its platform would be used to spread false claims of election victory and it shrugged,” said Evelyn Douek, a lecturer at Harvard Law School who studies content moderation and the regulation of online speech.
One YouTube video claiming evidence of voter fraud in Michigan has more than five million views. Another posted by Trump was selectively edited to appear as if Biden is endorsing voter fraud. That has over 1.6 million views. One of the OANN clips was watched 142,000 times in seven hours on Monday, while the other got 92,000 hits in that time.
Some recent misleading videos came from YouTube accounts that had already promoted unproven claims about mail-in voting months ahead of the election, according to a study by Media Matters for America, which monitors conservative misinformation. President Angelo Carusone said he showed the findings to the company’s policy team in September and got no answer.
“It was clear they had a problem,” Carusone said. “And they didn’t do anything about it.”
On content moderation, YouTube has largely followed the lead of its owner. If Google sees a web page with disreputable or harmful information, it will typically rank it lower in search results. But the company doesn’t remove this content altogether. YouTube treats videos the same way — tweaking its software to recommend misleading videos less often and make them harder to find. When people look for election news, YouTube tries to place clips from established news outlets at the top.
“Our systems are generally working as intended and we are surfacing news organizations prominently in search results and recommendations,” said Farshad Shadloo, a YouTube spokesman. “The vast majority of elections-related queries are surfacing authoritative sources.”
The company has removed some election-related videos, but won’t say which ones or how many. Most YouTube videos with “borderline content and harmful misinformation” get traffic from other outside sources, specifically other social-media platforms, Shadloo added.
YouTube also introduced a label for videos about mail-in voting in September, but its system for enforcing the rules took time to roll out, Shadloo noted. On Saturday, the company added another label on some election-related clips, telling viewers that "Robust safeguards help ensure the integrity of elections and results." That came with a link to a U.S. government web page debunking disinformation about the 2020 vote.
Before the election, a research group called the Election Integrity Partnership analyzed how each major user-generated digital platform would handle posts that delegitimize the election. YouTube was the only one lacking a plan for posts like “the election is rigged” or those that claimed victory prematurely, the group found. Text is easier to moderate than video, particularly long clips like many of those on YouTube addressing the election.
Both Facebook Inc. and Twitter said they would put fact-check labels on posts from candidates claiming victory prematurely. Twitter has been the more active enforcer, hiding or labeling many tweets from Trump since the morning of Nov. 4. Facebook has applied the following text beneath some Trump posts: “The U.S. has laws, procedures, and established institutions to ensure the integrity of our elections.”
In August, YouTube said it would not allow videos that suppress voting, incite violence or falsely claim that "mail-in ballots have been manipulated." However, YouTube's broadcasters have been given relatively wide latitude, and a steady drumbeat of videos has come out in the past week casting doubt on the election result.
On Nov. 5, Trump posted portions of a speech to Google’s video site, including a short clip where he said: “If you count the legal votes, I easily win. If you count the illegal votes, they can try to steal the election from us.” Shadloo said the company kept this video up because it permits footage “expressing views on the outcome of the current election.”
That clip has about 54,000 views, low by YouTube’s standards. But similar content has reached much larger audiences. A video from YouTuber Steven Crowder touting evidence of voter fraud in Michigan has more than five million views. Mike Huckabee, a onetime presidential candidate, posted a clip about the election on Nov. 5 titled “The Art of the STEAL” that has over 493,000 views. (On Friday, he changed the title to “There’s Something ROTTEN In Biden’s Basement.”)
Those two videos ran Google ads. In a statement, a company spokeswoman said Google has removed ads from videos pushing claims that could undermine trust in the election. For instance, it pulled ads from one channel that told viewers Trump had won. But clips featuring testimonials from people alleging voter fraud, or making partisan claims such as that the Democrats or Trump are trying to steal the election, are allowed to run ads, according to Google.
As the vote count continued on Nov. 5, Trump posted the selectively edited Biden video to his account for the second time. In the clip, Biden is discussing election security, but it was cut to appear as if he is endorsing voter fraud. YouTube determined the video was not manipulated in a way to pose “serious risk of egregious harm,” the company said.