As the World Cup begins, Twitter has failed to remove 99% of reported racist tweets directed at footballers.

Anti-hate-speech campaigners say 43 players were the targets of the reported posts, which remain online, raising fears of further abuse during the World Cup.

Twitter has failed to take down racist tweets targeting football players, including tweets using the N-word, monkey emojis and calls for players to be deported.

According to new data, the platform ignored 99 out of 100 reports of racist remarks in the week leading up to the World Cup.

Only one tweet, which repeated a racial slur 16 times, was taken down after being reported on Wednesday. As of this weekend, all the others were still live.

The England stars Raheem Sterling and Bukayo Saka were among the 43 players targeted with abuse in the wake of the Euro 2020 final.

The analysis, carried out by researchers at the Center for Countering Digital Hate (CCDH) and shared with the Observer, covered 100 tweets that were reported to Twitter. Of those, 11 used the N-word against football players, 25 directed monkey or banana emojis at players, 13 demanded their deportation, and 25 told players to "go back to" other nations. A further 13 tweets targeted footballers over their English proficiency.

The findings come at a difficult moment for Twitter and are likely to heighten fears that World Cup players could be targeted.

Thousands of employees have left the company since Elon Musk took it over on October 27. Musk has insisted that the platform's moderation tools remain effective and that he is committed to preventing it from becoming a "free-for-all hellscape."

In an update to the platform's approach to hate speech last week, however, Musk said that "negative/hate tweets" would be "deboosted & demonetized," but not necessarily removed. Users "won't find the tweet unless you deliberately search it out, which is no different from the rest of [the] internet," he added.

It is unclear how this will apply to abuse that names or tags specific people, who are likely to see the post without searching for it.

Every tweet identified in the CCDH's study mentioned football players, either by name or by tagging their Twitter accounts. Many were replies to official tweets from news organisations or football clubs.

They included tweets urging that footballers be deported, comparing them to chimpanzees and other animals, and demanding they "go back to Africa." The tweets were reported through Twitter's in-app reporting feature.

Twitter did not respond to a request for comment. Much of its communications team has been laid off.

According to the content policy currently posted on its website, it is against the platform's rules to "target others with repeated slurs," and tweets may be removed in cases of "severe, repetitive usage of slurs, where the primary intent is to harass."

The policy also forbids the "dehumanisation" of a group of people based on characteristics such as race.
