Surging Twitter antisemitism unites fringe, encourages violence, officials say

Current and former federal officials are warning that a surge in hate speech and disinformation about Jews on Twitter is uniting and popularizing some of the same extremists who have helped push people to engage in violent protests including the Jan. 6, 2021, attack on Congress.

The officials are predicting that Twitter will contribute to more violence in the months ahead, citing the proliferation of extreme content, including support for genocidal Nazis by celebrities with wide followings and the reemergence of QAnon proselytizers and white nationalists.

Since billionaire entrepreneur Elon Musk bought Twitter just over a month ago, he has slashed more than half the staff, including most of the people who made judgment calls about what counts as impermissible slurs against religious or ethnic groups.

Musk announced a broad amnesty for most previously banned accounts and has personally interacted with fringe activists and white nationalists on the site in the weeks since he assumed ownership. Other actors have experimented with racist and antisemitic posts to test Musk’s limits as a self-declared “free speech absolutist.”

Even before Musk’s takeover, some Twitter users were encouraging confrontations with transgender people and others who were falsely depicted as “groomers,” or predators who sexually target underage victims. But the new wave of antisemitism has reached millions of people in just days, brought new followers, and helped galvanize a broader coalition of fringe figures.

“This type of escalation and hate and dehumanization, the hatred of the Jewish population — it’s a really directed target. Violence is inevitable,” said Denver Riggleman, a former Air Force intelligence officer who later served as a Republican member of Congress and then on the staff of the House committee investigating the Jan. 6 attack on the Capitol.

The Department of Homeland Security on Wednesday warned that domestic terrorists were maintaining “a visible presence online in attempts to motivate supporters to conduct attacks,” citing increased risks for racial and religious minorities and gays and transgender people, as well as government institutions. “Recent incidents have highlighted the enduring threat to faith-based communities, including the Jewish community,” it said.

The bulletin made clear that online and offline conduct often reinforce each other in a cycle of escalation. The recent shootings at a Colorado gay bar drew praise online, encouraging potential copycat strikes, it said. Likewise, a New Jersey man was arrested last month after publishing an online manifesto for attacks on synagogues, and a second man was caught with a gun after tweeting about plans to “shoot up a synagogue and die.”

“The idea that there is a difference between online chatter and real-world harm is disabused by a decade of research,” said Juliette Kayyem, a security business founder and former assistant DHS secretary. Open expression on Twitter “re-socializes the hate and rids society of the shaming that ought to occur regarding antisemitism,” she said.

Most alarming to Joel Finkelstein, co-founder of the nonprofit Network Contagion Research Institute, has been the unification and elevation of voices little heard since the Capitol attack.

The NCRI has been tracking various indicators that show antisemitism is on the rise, including fast-multiplying Twitter references to the New World Order, a bogus theory that features cosmopolitan elites, sometimes explicitly Jews, wrecking institutions and values in multiple nations to exert more control.

But Finkelstein said he has seldom seen anything as dramatic as what happened when the rapper and producer Ye, formerly Kanye West, returned to Twitter and posted clips from an appearance on the internet show of bankrupt fabulist Alex Jones, during which Ye said, “There’s a lot of things that I love about Hitler.”

It wasn’t just that Ye said something to his 32 million followers on Twitter that even most Nazis keep to themselves. It was that he let the banned Jones and the racist Nick Fuentes tweet from his account, and that Ye gained followers since proclaiming on Twitter that he was going “Death Con 3 on Jewish people.” That remark got his Twitter account restricted, but not before it had caught the world’s attention.

“Kanye is using antisemitism to popularize a list of actors who have been censored for a long time,” Finkelstein said. “Trolls are climbing over the walls to start new accounts. This is a bonanza.”

Musk acknowledged on Twitter that impressions of hate speech spiked Thursday after the Alex Jones incident but said they had been trending down before that. He complained that the raw number of offensive tweets is misleading, however, because it treats tweets that no one sees the same as those with millions of views.

He did not say how Twitter counted hate speech, though, and researchers said it was unlikely to include hateful conspiracy theories or coded language.

Some groups that in the past had a direct line to Twitter’s Trust and Safety team said that they are getting fewer responses to their complaints. The Anti-Defamation League said the proportion of tweets it reports that lead to a suspension or other action has fallen by half, to 30 percent.

Finkelstein said his group has stopped reporting anything, because all of its contacts at Twitter are gone.

Multiple members of Twitter’s long-standing committee of outside safety advisers, including the ADL, said they did not know whether the committee would be disbanded or whether its members would choose to resign.

The overarching problem, Finkelstein said, is that racism and antisemitism work to draw attention. Extreme views get engagement from supporters, critics and observers. Engagement translates into profit. Others will jump on the trend.

That’s why every other major social network has embraced content moderation, Kayyem said.

“Whether it’s the individual or the group dynamics, they are feeding off this crap and this hate — that is the reason why content moderation was created in the first place,” she said. “Content moderation wasn’t invented because they wanted everyone to be nice, it was created because of the realization that these kinds of attitudes, if allowed to foster in society, lead to violent conduct.”

But not everyone thinks political, racial or religious violence is bad.

One of those rejoining Twitter, 10 years after he was banned, is Andrew Anglin, editor of the Daily Stormer, for years one of the best-known openly racist and fascist publications.

In a leaked style guide, Anglin once explained that his goal was recruiting new neo-Nazis, and that blaming Jews was the best way to do it.

“As Hitler said, people will become confused and disheartened if they feel there are multiple enemies,” Anglin wrote in the guide. “As such, all enemies should be combined into one enemy, which is the Jews.”

On Friday, Twitter’s software recommended Anglin’s revived account under “who to follow” to everyday users, including comedic writer K. Thor Jensen, who shared a screenshot with The Washington Post.

“I do a little monitoring of the far right for comedy purposes but have never Googled him or anything and had no idea he was reinstated on the platform,” Jensen said. “It’s just insane that the algorithm would push him at ANYBODY.”
