Social media companies weren’t doing enough “because the pain hasn’t become enough for them”, said Sanjay Bhandari, the chair of Kick It Out, an organisation that supports equality in soccer.
This season, Facebook is trying again. Its Instagram photo-sharing app is expected to roll out new features to make racist material harder to view, according to an internal document obtained by The New York Times. Among them is a tool that will let users hide potentially harassing comments and messages from accounts that either don’t follow them or only recently followed them.
“The unfortunate reality is that tackling racism on social media, much like tackling racism in society, is complex,” Karina Newton, Instagram’s global head of public policy, said in a statement. “We’ve made important strides, many of which have been driven by our discussions with groups being targeted with abuse, like the UK football community.”
But Facebook executives also privately acknowledge that racist speech against English soccer players is likely to continue. “No one thing will fix this challenge overnight,” Steve Hatch, Facebook’s director for Britain and Ireland, wrote last month in an internal note that The Times reviewed.
Some players appear resigned to the abuse. Four days after the European Championship final, Bukayo Saka, 19, one of the black players who missed penalty kicks for England, posted on Twitter and Instagram that the “powerful platforms are not doing enough to stop these messages” and called it a “sad reality”.
Around the same time, Facebook employees continued to report hateful comments to their employer on Saka’s posts in an effort to get them taken down. One that was reported — an Instagram comment that read, “Bro stay in Africa” — apparently did not violate the platform’s rules, according to the automated moderation system. It stayed up.
Much of the racist abuse in English soccer has been directed at black superstars in the Premier League, such as Raheem Sterling and Marcus Rashford. About 30 per cent of players in the Premier League are black, Bhandari said.
Over time, these players have been harassed at soccer stadiums and on Facebook, where users are asked to provide their real names, as well as on Instagram and Twitter, which allow users to be anonymous. In April 2019, fed up with the behaviour, some players and two former captains of the national team, David Beckham and Wayne Rooney, took part in a 24-hour social media boycott, posting red badges on Instagram, Twitter and Facebook with the hashtag #Enough.
A month later, English soccer officials held their first meeting with Facebook — and came away disappointed. Facebook said that “feedback from the meeting was taken on board and influenced further policy, product and enforcement efforts.”
Tensions ratcheted up last year after the police killing of George Floyd in Minneapolis. When the Premier League restarted in June 2020 after a 100-day coronavirus hiatus, athletes from all 20 clubs began each match by taking a knee. Players continued the symbolic act last season and said they would also kneel this season.
That has stoked more online abuse. In January, Rashford used Twitter to call out “humanity and social media at its worst” for the bigoted messages he had received. Two of his Manchester United teammates, who are also black, were targeted on Instagram with monkey emojis — which are meant to dehumanise — after a loss.
Inside Facebook, employees took note of the surge in racist speech. In one internal forum meant for flagging negative press to the communications department, one employee started cataloguing articles about English soccer players who had been abused on Facebook’s platforms. By February, the list had grown to about 20 different news clips in a single month, according to a company document seen by The Times.
English soccer organisations continued meeting with Facebook. This year, organisers also brought Twitter into the conversations, forming what became known as the Online Hate Working Group.
But soccer officials grew frustrated at the lack of progress, they said. There was no indication that Facebook’s and Twitter’s top leaders were aware of the abuse, said Edleen John, who heads international relations and corporate affairs for the Football Association, England’s governing body for the sport. She and others began discussing writing an open letter to Mark Zuckerberg and Jack Dorsey, the chief executives of Facebook and Twitter.
“Why don’t we try to communicate and get meetings with individuals right at the top of the organisation and see if that will make change?” John said, explaining the thinking.
In February, the chief executives of the Premier League, the Football Association and other groups published a 580-word letter to Zuckerberg and Dorsey, accusing them of “inaction” against racial abuse. They demanded that the companies block racist and discriminatory content before it was sent or posted. They also pushed for user identity verification so offenders could be rooted out.
But, John said, “we didn’t get a response” from Zuckerberg or Dorsey. In April, English soccer organisations, players and brands held a four-day boycott of social media.
Twitter declined to comment for this article, but the company said in a blog post about racism on Tuesday that it had been “appalled by those who targeted players from the England football team with racist abuse following the Euro 2020 Final.”
At Facebook, members of the policy team, which sets the rules around what content stays up or comes down, pushed back against the demands from soccer officials, three people with knowledge of the conversations said.
They argued that terms or symbols used for racist abuse — such as a monkey emoji — could have different meanings depending on the context and should not be banned completely. Identity verification could also undermine anonymity on Instagram and create new problems for users, they argued.
In April, Facebook announced a privacy setting called Hidden Words to automatically filter out messages and comments containing offensive words, phrases and emojis. Those comments cannot then be easily seen by the account user and will be hidden from those who follow the account. A month later, Instagram also began a test that allowed a slice of its users in the United States, South Africa, Brazil, Australia and Britain to flag “racist language or activity”, according to documents reviewed by The Times.
The test generated hundreds of reports. One internal spreadsheet outlining the results included a tab titled “Dehumanisation_Monkey/Primate”. It had more than 30 examples of comments using bigoted terms and emojis of monkeys, gorillas and bananas in connection with black people.
‘The Onus Is on Them’
In the hours after England lost the European Championship final to Italy on July 11, racist comments against the players who missed penalty kicks — Saka, Rashford and Jadon Sancho — escalated. That set off a “site event” at Facebook, the kind of internal emergency usually reserved for a major system outage.
Facebook employees rushed to internal forums to say they had reported monkey emojis or other degrading stereotypes. Some workers asked if they could volunteer to help sort through content or moderate comments for high-profile accounts.
“We get this stream of utter bile every match, and it’s even worse when someone black misses,” one employee wrote on an internal forum.
But the employees’ reports of racist speech were often met with automated messages saying the posts did not violate the company’s guidelines. Executives also provided talking points to employees that said Facebook had worked “swiftly to remove comments and accounts directing abuse at England’s footballers”.
In one internal comment, Jerry Newman, Facebook’s director of sports partnerships for Europe, the Middle East and Africa, reminded workers that the company had introduced the Hidden Words feature so users could filter out offensive words or symbols. It was the players’ responsibility to use the feature, he wrote.
“Ultimately, the onus is on them to go into Instagram and input which emojis/words they don’t want to feature,” Newman said.
Other Facebook executives said monkey emojis were not typically used negatively. If the company filtered certain terms out for everyone, they added, people might miss important messages.
Instagram’s chief executive, Adam Mosseri, later said the platform could have done better, tweeting in response to a BBC reporter that the app “mistakenly” marked some of the racist comments as “benign”.
But Facebook also defended itself in a blog post. The company said it had removed 25 million pieces of hate content in the first three months of the year, while Instagram took down 6.3 million pieces, or 93 per cent, before users reported them.
New York Times