There is no evidence that what Albert saw on Twitter inspired him to post his own messages, which appeared on Discord, according to court documents, and his attorney did not respond to a request for comment. But former employees and online researchers say physical attacks in the United States have followed Twitter spikes in some categories of hate speech, particularly anti-Semitic and anti-gay slurs and rhetoric.
New research to be released later this month by the Network Contagion Research Institute, a disinformation tracker, suggests a link between real-world incidents and variations on the word “groomer,” which is often aimed at gay people and falsely suggests they are adults seeking to seduce children. Gay people are no more likely to be predators than straight people, although polls show a significant minority of the population believes otherwise.
Pre-Musk, Twitter had classified the word “groomer” as hate speech. But usage started to rise not long after Musk said he would buy the platform, and it has risen repeatedly since then, often following real-world incidents like the deadly Colorado gay club shooting.
“Over the last three to four months, we’ve seen an increase in anti-LGBTQ incidents, and you can see a statistical correlation between these real-world incidents and the increased use of the term ‘groomer’ on Twitter,” said Alexander Reid Ross, a Network Contagion analyst who shared the findings with The Washington Post. He did not say that use of the term led to the violence.
The second-biggest spike in tweets containing the word “groomer” came just after Musk took control of Twitter. The largest, reaching more than 4,000 a day, came in late November, shortly before a record seven anti-gay attacks in a single day were recorded by the Armed Conflict Location and Event Data Project, a nonprofit tracker of global political violence, Ross said.
While hate speech that precedes incidents raises the possibility that it inspires violence, hate speech that follows them also matters, experts say. Calling the victims of violence “groomers” fuels “highly pressurized, toxic discourse that condemns the victims and thus warrants further activity,” Ross said.
Musk plays a role not only by relaxing speech policies and cutting moderation staff, but also through his personal choices about whom he engages with on the platform, researchers say.
Recent anti-Semitic incidents have included direct references to the rapper Ye, whom Musk welcomed back to the platform after his suspension from Instagram. On his return to Twitter, Ye tweeted a pledge to go “death con 3 On JEWISH PEOPLE.”
Even after Musk suspended Ye again, tweets made references to Jewish “privilege” or “supremacy,” according to Joel Finkelstein, director of Network Contagion and a senior fellow at Rutgers University.
An assailant who attacked a man in New York City’s Central Park last month shouted “Kanye 2024” along with anti-Semitic remarks, police said. In November, vandals spray-painted “Kanye was rite,” along with swastikas, on tombstones at a Jewish cemetery in Illinois.
Hate crimes against Jews in New York rose from nine in September to 45 in November, accounting for more than half of the city’s bias incidents, according to New York Police Department statistics.
White nationalists and some Black Americans have sometimes reinforced each other, Finkelstein said. Neo-Nazi groups posted memes on imageboards portraying Ye as a heroic new Hitler, while Cynthia McKinney of Georgia, a Black Green Party activist who served six terms in Congress, tweeted that 2022 was “the year of #TheNoticing, the year gaslighting finally stopped working!” That hashtag, driven by hardcore anti-Semites on Twitter and the imageboard 4chan, refers to a supposed discovery that some Jews hold positions of influence. McKinney did not respond to requests for comment.
Finkelstein has seen the same patterns before, including during a conflict between Israel and Hamas in May 2021. A team of analysts from Network Contagion and elsewhere examined 6 billion tweets and Reddit posts and recently found that the volume of tweets using human rights language was a better predictor of both U.S. street protests and anti-Semitic incidents than the actual fighting in the Middle East.
“We found that in addition to fighting, there’s a huge spike in words like colonialism and apartheid, and then there’s incidents,” said Aviva Klompas, CEO of Boundless, a nonprofit that also worked on the study. “Then you see the long tail of that weaponized language, and the incidents keep coming.”
Twitter keeps both legitimate debate and hate alive longer and spreads them more widely, Finkelstein said: “Wars in the world are being waged online as well, and social media has become the weapon to expand it from a local conflict to a clash of civilizations.”
In addition to firing most of Twitter’s trust and safety team and dissolving its outside safety advisory council, Musk has reinstated accounts that fueled extremism and tweeted an image of Pepe the Frog, the alt-right’s mascot.
He also went out of his way to criticize former trust and safety chief Yoel Roth, who resigned after the November midterms and criticized Musk’s habit of deciding content rules on the fly. Musk seized on an old Roth tweet linking to an article about a criminal verdict against a teacher who had sex with an 18-year-old student and added, “this explains a lot,” amid an effort to portray himself as a major foe of child sexual abuse imagery and Roth as someone who had let it slide on Twitter, prompting hordes of Twitter users to call Roth a “groomer.”
While Musk has said “incitement to violence” remains grounds for suspension, suggestions that Roth should be killed stayed on the site even after being reported by a longtime researcher who uses the Twitter handle @conspirator0.
Roth fled his house after accounts tweeted images at him, including one of a man feeding another into a wood chipper, its caption pairing “how else are you going to get off your ass?” with a plural slur. Other images included an uncaptioned firing squad and containers of bullets, one marked “box of pills that cure pedophilia.”
Some of the tweets and accounts were deleted a day later, the researcher said, but similar replies remained up. Roth has put his house up for sale and moved out, according to someone in contact with him. He declined to comment.
Musk’s new head of trust and safety, Ella Irwin, did not respond to an email seeking comment.
Roth was also singled out in tweets from the influential @LibsofTikTok account, which is run by activist Chaya Raichik and has 1.7 million followers.
The account has long crusaded against transgender medical care for young people at hospitals. Its focus on Boston Children’s Hospital in August preceded threats against doctors there, while a Wisconsin school it criticized over an investigation into the bullying of a transgender student was temporarily closed in May because of bomb threats and harassment.
Her spotlight has also been followed by the Proud Boys and other violent groups showing up to protest parades and other events.
In a report last month, the Task Force Butler Institute, an anti-extremism nonprofit, identified 281 LibsofTikTok tweets posted between April and November that mentioned a specific event, location or person. In 66 of those cases, reports of digital harassment or real-world incidents followed, including death threats and bomb threats. Several times, organizers canceled events in response.
Before the Musk acquisition, complaints about LibsofTikTok sometimes resulted in individual tweets being deleted or in week-long suspensions, including twice in September.
After Musk’s takeover, there have been no such suspensions, and Musk has personally interacted with the account, convincing some critics that complaining is pointless. “It makes no sense,” said activist Erin Reed, who follows the account closely. Asked for comment, Raichik responded by accusing The Post of inciting violence against her.
Favorite topics for the account include drag shows and book readings, especially those involving minors. In November, it called attention to an upcoming performance at the Sunrise Theater in Moore County, N.C.
Minutes after the December 3 show started, the lights went out. Two separate electrical substations had been disabled by gunfire, leaving 40,000 people without power for days. The FBI is investigating and has not said whether the blackout was aimed at the show.
Rumman Chowdhury, Twitter’s former director of machine learning ethics, transparency and accountability, said the escalation of hate speech and violence was a predictable result of Musk’s decisions, but still deeply troubling.
“It is certainly very shocking. It’s very sad to see this thing that so many of us cared for and built has been decimated bit by bit,” she told The Post. “It’s very hard to see where it’s going and how bad it’s going to get.”
Cat Zakrzewski contributed reporting.