YouTube Is Finally Starting To Crack Down On Right-Wing Conspiracy Theorists
IT'S ABOUT TIME

A week after a video promoting the false conspiracy theory that one of the Parkland shooting survivors was a "crisis actor" hit No. 1 on YouTube's trending list, the video-sharing platform is making an effort to crack down on conspiracy theories and hate speech. But there are still plenty of obstacles — many of them algorithmic, some of them psychological — to preventing the spread of dangerous conspiracies online. Here's what's going on. 

Alex Jones, Who Has Long Promoted Conspiracies About The Sandy Hook Shooting, Finally Got His First Warning Last Week

Last Friday, CNN reported that the Alex Jones Channel, InfoWars' biggest YouTube account, had received a warning from YouTube for a video it posted claiming that Parkland shooting survivor David Hogg was a crisis actor. According to YouTube's community guidelines, an account that receives three strikes within a three-month period is permanently banned from the platform.

That video focused on David Hogg, a strong voice among survivors of the mass shooting at Marjory Stoneman Douglas High School. The attention has given him a powerful platform — but it has also made him the subject of demonstrably false conspiracy theories that claim he's so skilled as a public speaker that he must be a paid actor.

On Wednesday, YouTube removed the video from InfoWars' page for violating its policies on harassment and bullying. The video was titled, "David Hogg Can't Remember His Lines In TV Interview."

[CNN]

Today, Jones Got A Second Warning And A Two-Week Freeze On His Account

On Wednesday, The Hill reported that Infowars had received a second strike and a two-week freeze on uploading new content.

The channel said it received an alert from YouTube on Tuesday morning, saying Infowars received a second strike on a video about the Parkland, Fla., high school shooting and will temporarily be unable to upload new content.

"This is the second strike applied to your account within three months. As a result, you're unable to post new content to YouTube for two weeks," the alert said. "If there are no further issues, the ability to upload will be automatically restored after this two week period."

[The Hill]

After receiving the warning, Jones tweeted at Hogg, the subject of his slanderous video, inviting him to come on Jones' show and urging him to "support the 1st Amendment and not let them use you to end free speech."

Update, 5:30 PM: YouTube now tells Bloomberg that some content has been mistakenly taken down by new moderators, without saying whether that includes InfoWars' video. "As we work to hire rapidly and ramp up our policy enforcement teams throughout 2018, newer members may misapply some of our policies resulting in mistaken removals," said a YouTube spokesperson. "We'll reinstate any videos that were removed in error."

At The Same Time, YouTube Finally Took Down The Channel Of A Notorious, Murderous Neo-Nazi Group

As YouTube tries to limit Parkland-related conspiracy theories, it has also permanently banned Atomwaffen Division, a neo-Nazi group. Last week, ProPublica reported on Atomwaffen's chat logs following the murder of a college student, prompting calls from the Anti-Defamation League for YouTube to ban the group's channel.

[T]he ban comes days after a flurry of media attention on Atomwaffen and YouTube's inaction around the group, and on Tuesday the Anti-Defamation League asked YouTube to remove offending videos immediately.

"This account has been terminated due to multiple or severe violations of YouTube's policy prohibiting hate speech," a banner on an Atomwaffen YouTube channel reads as of Wednesday. All of the account's videos have been removed; it was previously hosting around a dozen propaganda and other videos. The videos had been online since between June and October last year.

[Motherboard]

YouTube's Algorithm Serves Up All Kinds Of Conspiracies To Viewers Who Stumble Upon Fake News

Meanwhile, a researcher has found that YouTube's algorithm has inadvertently created a network of interrelated conspiracy videos, sending unwitting viewers down a rabbit hole of fake news. Jonathan Albright, research director at the Tow Center for Digital Journalism at Columbia University, pulled the "next up" recommendations for several hundred videos he found using the search term "crisis actors" and then mapped the 9,000 results. Here's one of his snapshots of the network.

[Network visualization: Jonathan Albright]

As you can see, YouTube's algorithm will eventually expose you to conspiracy theories about the Illuminati, Pizzagate and Hollywood pedophilia rings once you start watching videos about so-called crisis actors. Albright writes, 

Every time there's a mass shooting or terror event, due to the subsequent backlash, this YouTube conspiracy genre grows in size and economic value. The search and recommendation algorithms will naturally ensure these videos are connected and thus have more reach.

In other words, due to the increasing depth of the content offerings and ongoing optimization of YouTube's algorithms, it's getting harder to counter these types of campaigns with real, factual information.

[Jonathan Albright via Medium]
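For the technically inclined, here is a minimal sketch in Python of what a crawl like Albright's might look like. It's an illustration under stated assumptions, not his actual code: fetch_up_next is a hypothetical stand-in for whatever scraper or API call supplies a video's recommendation list, and networkx is just one convenient way to hold the resulting graph.

```python
# A sketch of the crawl: start from seed videos found via a search term,
# pull each one's "next up" recommendations, and record them as directed
# edges in a graph that can then be visualized as a network map.
from collections import deque

import networkx as nx


def fetch_up_next(video_id):
    """Hypothetical helper: return the list of video IDs YouTube
    recommends alongside `video_id` (via a scraper or API call)."""
    raise NotImplementedError


def build_recommendation_graph(seed_ids, depth=1):
    """Breadth-first crawl of "next up" recommendations out to `depth` hops."""
    graph = nx.DiGraph()
    frontier = deque((vid, 0) for vid in seed_ids)
    seen = set(seed_ids)
    while frontier:
        video_id, level = frontier.popleft()
        if level >= depth:
            continue
        for rec in fetch_up_next(video_id):
            graph.add_edge(video_id, rec)  # edge: video -> its recommendation
            if rec not in seen:
                seen.add(rec)
                frontier.append((rec, level + 1))
    return graph
```

Seeded with a few hundred "crisis actors" search results and crawled one hop out, an approach like this plausibly yields a graph on the scale of Albright's 9,000 results.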

Social Media Algorithms Still Can't Tell The Difference Between Genuine Shares And Hate-Shares

Another researcher, from New Media Frontier, looked at the spread of the Hogg "crisis actor" conspiracy theory on Twitter and found that it was at least partially amplified by people who were horrified by the theory — just the latest example of how social media algorithms are unable to distinguish between people sharing content because they like it and people sharing content because they hate it.

People outraged by the conspiracy helped to promote it — in some cases far more than the supporters of the story. And algorithms — apparently absent the necessary "sentiment sensitivity" that is needed to tell the context of a piece of content and assess whether it is being shared positively or negatively — see all that noise the same.

This unintended amplification created by outrage-sharing may have helped put the conspiracy in front of more unsuspecting people.

[Wired]
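To see the problem in miniature, here is a toy example in Python (emphatically not any platform's real ranking code). The hand-labeled stance values stand in for the "sentiment sensitivity" the quoted analysis says is missing: a raw share counter scores outraged debunkings exactly like endorsements.

```python
# Toy example: the same three shares scored two ways. The stance labels
# (+1 for an endorsement, -1 for an outraged debunk) are hand-assigned
# here; inferring them automatically is the hard, unsolved part.
shares = [
    ("this proves he's an actor", +1),        # believer amplifying
    ("this crisis-actor smear is vile", -1),  # outraged debunker
    ("how is this garbage trending?", -1),    # outraged debunker
]

raw_count = len(shares)                      # sentiment-blind score: 3
stance_weighted = sum(s for _, s in shares)  # stance-aware score: -1

print(f"raw shares: {raw_count}, stance-weighted: {stance_weighted}")
```

A ranking system that only sees the raw count treats all three shares as a signal to boost the post, which is how outrage-sharing ends up widening a conspiracy theory's reach.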

And YouTube's Decision To Take Down Conspiracy Theories May Only Fuel More Conspiracies

While YouTube's bans on conspiracy theories and hate speech will probably restrict these noxious ideas' spread in the long run, the Guardian points out that in the short run YouTube may inadvertently give conspiracy theorists more ammunition for the idea that they're being censored for telling the truth.

[R]ecent efforts to restrict offensive posts have shone a harsh light on a seemingly intractable problem of the modern conspiracy theory epidemic: that censoring the content can reinforce and enhance false beliefs and that there is no easy way to change the mind of a conspiracy theorist…

"If you believe your institutions are conspiring and then you expose it and then they ban your speech, how could you not think that that's part of it?" said Joseph Uscinski, a University of Miami professor and conspiracy theory expert. If Jones and Infowars continue to face YouTube censorship, he added, "it will convince his fans that he's on to something."

[The Guardian]

L.V. Anderson is Digg's managing editor.
