One set of rules — Meta told to overhaul moderation system for high-profile users like Donald Trump
Facebook accused of leaving dangerous content online to serve business interests.
Hannah Murphy and Cristina Criddle, Financial Times – Dec 6, 2022 2:13 pm UTC
Meta has been told its treatment of high-profile users, such as former US President Donald Trump, left dangerous content online, serving business interests at the expense of its human rights obligations.
A damning report published on Tuesday by the company's oversight board, a Supreme Court-style body created by the parent company of Facebook, Instagram, and WhatsApp to rule on sensitive moderation issues, has urged the social media giant to make significant changes to its internal system for reviewing content from politicians, celebrities, and its business partners.
The board, which started assessing cases last year, is coordinated by the tech giant's policy chief and former UK deputy prime minister Sir Nick Clegg and issues independent judgments on high-profile moderation cases as well as recommendations on certain policies.
The board was asked to look into the system after The Wall Street Journal and whistleblower Frances Haugen revealed its existence last year, raising concerns that Meta was giving preferential treatment to elite figures.
Clegg also has until January 7 to decide whether to allow Trump back on to the platform following a separate recommendation by the board.
After a lengthy investigation spanning more than a year, the board has demanded that Meta more closely audit who is on the so-called cross-check list and be more transparent about its review procedures.
The report is one of the most in-depth probes yet into moderation issues at Meta, as the independent body, comprising 20 journalists, academics, and politicians, has grappled with concerns that it has little power to hold the company accountable.
It piles further pressure on chief executive Mark Zuckerberg, who last month announced plans to cut 11,000 staff amid declining revenues and growth, to ensure Meta's content is policed fairly.
Meta has already begun to revamp the system. In a blog post on Tuesday, Clegg said it was originally developed to double-check cases where there could be "a higher risk for a mistake" or when "the potential impact of a mistake is especially severe." He added that the company had now developed a more standardized system, with further controls and annual reviews.
It remains unclear how many people are on the secretive list. The Wall Street Journal, which first reported the list, estimated that by 2020, there were 5.8 million users listed. Meta has previously said there were 666,000 as of October 2021.
The system meant that content posted by well-known personalities, such as Trump and Elizabeth Warren, would remain on platforms until human moderators had reviewed them, even if the messages would have been automatically removed had they been posted by an ordinary user.
It would take five days on average for this human review to take place, with the content left on the platform during this time, and in one case, up to seven months, the report found.
Meta's own understanding of the practical implications of the program was lacking, the board said, adding that the company had failed to assess whether the system worked as intended.
The board also accused the company of giving insufficient responses to the investigation, sometimes taking months to respond.
The board referenced a Wall Street Journal report that detailed how Brazilian footballer Neymar posted non-consensual intimate imagery of another person on to his Facebook and Instagram accounts, which was viewed more than 50 million times before removal. According to Meta, this was because of a delay in reviewing the content due to a backlog at the time.
Thomas Hughes, director of the oversight board, said the Neymar incident was one example of how business partnerships could impact moderation processes.
"It opens up concerns… about relationships between individuals in the company and whether that might influence decision-making," he said.
"There was probably a conflation of different interests within this cross-check process," he added.
The report follows previous public tensions between the board and Meta, after the board accused the social media company in September 2021 of withholding information on the system. Many see the board as an attempt to create distance between the company's executives and difficult decisions around free speech.
Meta now has 90 days to respond to the recommendations.
© 2022 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.