Twitter Hate Speech Accounts Exploited Israel’s War in Gaza to Grow Four Times Faster

An Israeli army battle tank on the border of Palestinian territory. - Credit: Jack Guez/AFP/Getty Images

Last month, a judge tossed a lawsuit from Elon Musk's X (formerly Twitter) against the Center for Countering Digital Hate, an anti-extremism watchdog, that alleged the group had financially harmed the company by reporting on violent and hateful speech proliferating across its platform. The ruling stated in no uncertain terms that Musk and X were trying to punish the CCDH for exercising First Amendment rights.

Following that court victory, the organization is once again sounding the alarm about X’s lack of sensible moderation, releasing another trove of data that shows how 10 toxic influencers have capitalized on Israel’s months-long siege of Gaza, rapidly expanding their audiences and raking in money by spreading antisemitic and Islamophobic content. The CCDH found that these accounts, which have together amassed 4 million new followers since the Oct. 7 attack on Israel by Hamas militants, grew four times faster in the months directly following the attack than they had in the months immediately prior. All benefited from the algorithmic boost that comes with paid blue-check verification, and despite repeated community guideline violations, not one has been suspended.


“None of the accounts deserved the enormous visibility they received by cynically goading, upsetting, and terrorizing others into emotional responses,” writes Imran Ahmed, CEO of the CCDH, in an introduction to the new report, titled “Hate Pays: How X accounts are exploiting the Israel-Gaza conflict to grow and profit.” Ahmed adds that the influencers’ fiercest critics also gave them greater reach and exposure because “even ‘negative’ engagement counts as engagement on social media platforms, increasing their attractiveness to the algorithms that control what content gets promoted in timelines and what does not.”

The result, Ahmed warns, is “a race to the bottom of hate and sensationalism with the most profitable ad next to it.” Indeed, the CCDH research shows that a number of major advertisers, from Oreos to the National Basketball Association to Starlink (owned by Musk’s own SpaceX) are having their promotional materials shown in proximity to hateful posts, despite X’s assurances of brand safety and promise that ads will not “serve around potentially objectionable content.” X did not respond to a request for comment on the nonprofit’s findings.

Rolling Stone previously reported on a number of the influencers tracked in the CCDH report who have used nominal support for Palestine as cover to disseminate antisemitic content, including Ryan Dawson, once banned from the platform but reinstated under Musk, and former UFC fighter Jake Shields, who together openly dabble in Holocaust denial on X. Others quick to exploit the slaughter in Gaza to demonize Jewish people were self-described “raging antisemite” Keith Woods, known to have welcomed white nationalist Nick Fuentes onto his YouTube show; anonymous account @CensoredMen, which originally existed to support misogynist manosphere celebrity Andrew Tate as he faces rape and human trafficking charges in Romania; and Jackson Hinkle, a prolific purveyor of misinformation whose account picked up a staggering 2 million followers in the four months after the Oct. 7 attacks.

The second biggest account in this latest analysis belongs to Dr. Anastasia Maria Loupis, based in Denmark, with 1.2 million followers. She infuses her anti-vax content with a heavy dose of antisemitism, repeatedly implying that mass vaccination is the agenda of a powerful group of Jews that control national and global institutions. But since the outbreak of war in the Middle East, she has mostly shifted from arguments against pandemic safety measures to posts about Israel and Palestine, reaping massive engagement as a result. Like Hinkle, Dawson, Shields and @CensoredMen, she has enabled the premium “subscription” feature, allowing followers to pay a monthly fee for additional exclusive content.

Of course, anti-Muslim influencers have been rewarded as well. One is Dr. Eli David, an Israeli computer scientist and co-founder of healthcare and cybersecurity companies. He, too, was formerly mired in Covid-19 conspiracism but now tweets his support for Israel while likening Palestinians to rats and warning that Europe is being taken over by Muslims and “jihadists.” Likewise, @RadioGenoa — not affiliated with any Italian radio station — posts disinformation intended to stoke fear of a supposed invasion of Europe by Muslim immigrants. The two accounts have nearly 600,000 followers apiece, a significant proportion of those gained since Oct. 7.

The last two hate-fueled feeds the CCDH studied were "Way of the World," an equal-opportunity racism account just as likely to post offensive memes about Black people or Jews, often while railing against immigration, and the account of failed U.S. Senate candidate Sam Parker, who has shared conspiracist content about a cabal of Jews supposedly running the world, and invoked the canard that the Rothschild banking family sits at the top of this hidden organization.

“It is clear from the results of a number of studies by CCDH and others that the new leadership of X has not discouraged antisemites, and instead has effectively welcomed them, and accounts which seek to spread hate,” the report concludes. “The increase in followers and reach for the accounts identified in this report is, we believe, a direct result of the policy changes made under owner, Elon Musk, and CEO, Linda Yaccarino.”

Musk hasn't merely enabled antisemitism, though; he sometimes wades into such rhetoric himself. He once encouraged followers to follow an antisemitic account for updates on the Gaza war, and he has effectively endorsed ideas akin to the "Great Replacement" conspiracy theory, many versions of which posit that Jews are deliberately orchestrating mass immigration of non-white populations into the West.

Overall, the report presents a bleak portrait of a platform "failing to uphold its duties to keep users safe," incentivizing hate speech with the rewards of increased engagement and profit while continuing to serve ads alongside harmful material that violates X's own policies but is not removed when reported. In fact, these accounts are also eligible for X's ad-revenue sharing, which pays out based on organic impressions of the ads that appear in the replies to their extremist and racist posts.

"It's tilted in favor of hate and lies," Imran Ahmed writes of social media in his summary of this research. "Those preaching tolerance and goodwill have to ice-skate uphill to keep up." And in his time as the owner of X, Musk has done nothing to reverse that dynamic — quite the opposite. In the absence of any corporate responsibility, the CCDH can only recommend that regulators and lawmakers demand accountability and transparency, while advertisers and users can reevaluate whether they want any part of this destructive online economy.

Musk, meanwhile, can complain about this damning appraisal all he wants, as he typically does when the sheer volume of malignant discourse on X is laid out in the starkest terms. The only difference is that this time, he probably won’t bother to launch a frivolous lawsuit about it.
