National Security | Military Ethics | Global Far-Right Extremism | Counter-Terrorism | Antisemitism

Twitter’s obligation during crisis and war – opinion

It is incumbent on Elon Musk to ensure that Twitter does not incentivize antisemitic viewpoints and conspiracy theories.

On October 7, as Israeli Jews celebrated Simchat Torah (one of the holiest days on the Jewish calendar), the Palestinian terrorist organization Hamas launched a bloody attack from the Gaza Strip against Israel by land, sea, and air.

Terrorists went from house to house in Israeli villages close to the border, killing, raping, and torturing men, women, and children. Among the victims were many elderly people, including Holocaust survivors and people suffering from dementia and other ailments.

Mothers were shot to death as their babies were decapitated. Non-Israeli nationals from the US, France, Germany, Thailand, Mexico, Spain, and elsewhere were not spared.

Following the initial Hamas attack, Israel launched Operation Iron Swords, which led to a new battlefield over international public opinion on social media. While misinformation exists on many social media platforms, such as TikTok and Instagram, one of the most prominent examples in the current conflict is Twitter.

Twitter is part of big tech, a term that refers to the largest and most dominant technology companies in their respective sectors. Examples include Google, Amazon, Meta, and Apple. What has characterized big tech companies is that they are usually more responsive to the public's demand for greater regulation.

Those who consume content from alt-tech platforms (such as Gab or MeWe) usually understand that much of that content originates from fake news sources and disinformation, while placing greater trust in information they consume from big-tech platforms.

Twitter’s failure to combat misinformation 

Alt-tech refers to technology platforms and services outside the mainstream technology ecosystem. It is very popular among adherents of extremist ideologies, such as the far Right, because a lack of regulation allows conspiracy theories, such as Holocaust denial, to spread more easily.

By enabling disinformation and fake news, as we observe in this war, Twitter has failed. It has behaved more like an alt-tech platform.

ELON MUSK’S purchase of Twitter represents a change in the social media ecosystem. The hi-tech billionaire has fired over 80% of Twitter’s staff in pursuit of his vision of free speech, thereby weakening the online content monitoring that previously existed on the platform. Musk has reopened previously blocked accounts that had violated the platform’s terms of service, such as those of former president Donald Trump and Ye (formerly known as Kanye West).

Prior to Twitter’s acquisition by Musk, users had to request verification in the form of a blue check mark. Their accounts had to be authentic, notable, and active, while needing to meet the criteria of being a government figure or entity, entertainer, athlete, company, activist, content creator, news organization, or journalist. Spam or parody accounts would not be tolerated, while Twitter users who violated terms of service would be banned.

By ending the blue check verification system on April 1, any user who subscribes to a premium account can now receive a blue check mark for a minimum of three dollars a month. While users are not required to display their real names or images on their profiles, the new terms of service also say that “you may not pose as someone who doesn’t exist to mislead others about who you are or who you represent.”

This contradictory policy has allowed disinformation on Twitter to be monetized during the war. Some users, like Angelo John Gage (aka “Lucas Gage”), have dedicated their timelines to promoting the worst antisemitic tropes and conspiracies imaginable, such as the claims that Jews control the media and that six million Jews were not murdered during the Holocaust.

Gage and other users’ denial of antisemitic atrocities extends to the Hamas rampage on October 7, where he dehumanized Jewish people (including those who were kidnapped, tortured, raped, and murdered) as part of a “satanic religion.” His post has garnered over 241,000 views, more than 5,000 likes, and over 1,000 reposts.

Another user, Jackson Hinkle, posted that the Israeli newspaper Haaretz claimed that the number of Israelis murdered on October 7 was really 900, not “nearly 2,000,” and that 50% of them were Israeli soldiers. He also claimed that there was “no evidence” that Hamas burned any Israelis and that there were “no beheaded babies.” Haaretz responded by posting that Hinkle’s post “contains blatant lies about the atrocities committed by Hamas” and “has absolutely no basis in Haaretz’s reporting.”

Yet because fake news spreads so quickly on Twitter, Hinkle’s original post received an estimated five million views and over 100,000 likes (versus Haaretz’s correction, which received 1.1 million views and fewer than 4,000 likes).

After a backlash against the spread of Hinkle’s disinformation, Musk announced that posts with a “community note” would no longer be eligible for monetization. While this is a positive development, it does not change the fact that users continue to reshare fake news items like these in very high numbers.

This war has highlighted the need for better moderation and regulation on Twitter, especially when most people consider it a reliable platform. It is incumbent on Elon Musk to ensure that Twitter does not incentivize antisemitic viewpoints and conspiracy theories. Twitter needs to have better checks and balances, given that it has the power to determine who is victorious in the battle for public opinion.

View in The Jerusalem Post

Liram Koblentz-Stenzler, PhD

Dr. Liram Koblentz-Stenzler is a scholar and practitioner with a wealth of experience in the fields of counterterrorism, antisemitism studies, and global far-right extremism. She is a senior researcher and head of the Global Far Right Extremism Desk at the International Institute for Counter-Terrorism (ICT), Reichman University, Israel, and a lecturer at Yale University. She advises security agencies, technology companies, other organizations, and communities to achieve a better understanding of the language, global connections, and action patterns of right-wing extremists in order to prevent acts of terrorism, incitement to violence, and antisemitism.
