How Facebook Practiced Two-Tier Content Moderation

Meta’s Oversight Board has sharply criticized content moderation on Facebook and Instagram. The American firm reportedly practices multi-speed content moderation, favoring VIP accounts and long-standing business partners. In its report, the oversight body calls for the moderation system to be corrected.

Ranking of users

According to Meta’s watchdog, the problem lies in the cross-check program. While the program is supposed to protect human rights, “we found that its structure seemed more related to commercial considerations” in content moderation, the board says. This lack of “transparency” benefits celebrities and companies that make money for Meta.

This VIP list reportedly includes profiles likely to generate revenue for the company, either through formal commercial relationships or by attracting users to Meta’s platforms (Instagram, Facebook, WhatsApp), the Guardian notes.

Under this system, a report against an average user’s post is handled immediately, and the post is quickly deleted. Conversely, a report against the content of a so-called VIP user is processed with much greater latency: Meta waits for a human moderator to intervene, a procedure that leaves a problematic publication exposed for longer.

The Neymar case on the table

“In other words, because of the cross-check system, content identified as violating Meta’s rules remains live on Facebook and Instagram despite being extremely viral and potentially harmful,” the board says in its report. In an article published in September 2021, the Wall Street Journal unveiled the contours of this system, describing a “secret elite […] exempt” from the usual moderation rules.


In 2019, the cross-check program came into play in the Neymar case. Accused of rape, the Brazilian footballer responded by posting videos of his private correspondence with the alleged victim, thereby disseminating nude photos and personal information about her. The viral content was only removed after human intervention, a day later. Meanwhile, the standard sanction of deactivating his account was never applied, according to the newspaper.

“In Neymar’s case, it’s hard to understand how non-consensual intimate images posted on an account with over 100 million followers wouldn’t have been pushed to the top of the list for quick review,” Meta’s Oversight Board now asks. Examples of delayed moderation on favored accounts abound: several VIP profiles have notably shared false information about vaccines, Covid-19, and even Democratic candidate Hillary Clinton.

32 recommendations made

To end this selective impunity, the board is asking Meta to change its content moderation process on Facebook and Instagram. Specifically, the oversight body recommends additional resources to handle the “volume of content requiring further review.” It also requests further explanation of how the cross-check program works, as well as better protection of “expressions important for human rights, and in particular those of public interest.” In total, it makes 32 recommendations.

Nick Clegg, Meta’s president of global affairs, has said for his part that the company will respond to the Oversight Board’s concerns within 90 days.
