For several months, reports have shown that law enforcement officers, particularly police officers, were posting racist messages on the social network Facebook. The question is how the social network responded to this situation.
The behaviour of French police officers who openly expressed racist views did not go unnoticed by the public, as they repeatedly published highly derogatory remarks in Facebook groups. The groups singled out include "TN Rabiot Police Official" and "FDO 22 United", which together count some 17,000 members and whose activity was laid bare by the website StreetPress. As a result, a preliminary investigation was opened for "public incitement to racial hatred" and "public racist insult".
Facebook said: "We removed a lot of content from these groups because it did not follow our rules, and we are grateful to those who brought it to our attention. We will respond to requests for information from the French authorities, in accordance with our cooperation procedures."
The social network reportedly removed all of this content from the groups after some members reported the posts. However, it did not specify whether the deletions came after the press reports that denounced the situation, or whether it had specifically targeted these users and their publications. What we do know is that Facebook's moderators tend to act only on content reported by other users. This means that if the members of a group all share the same ideology and opinions, they may never be troubled by the social network, and the filter-bubble effect built into the platform makes this kind of phenomenon even easier. "To enforce these policies in private groups, we combine reports from users, group administrators and technology," the social network says.
On the technology side, Facebook plans to rely on artificial intelligence to help detect harmful content, a trend very much in vogue across social networks, partly because of the coronavirus pandemic. While the company does not disclose which posts its teams have missed, it asserts: "This technology has made a lot of progress in recent months. Between January and March, more than 88% of the 9.8 million pieces of hate speech we removed from Facebook for violating our rules were proactively detected by our technology, up from 68% over the same period last year."
Regarding one of the groups in question, "TN Rabiot Police Official", the website StreetPress noted that "1,000 people have left", adding later: "Several comments revealed by StreetPress have been deleted, including those relating to the demonstration in support of Adama Traoré."
In addition, Facebook indicated that the majority of the posts shared in these groups do not violate its Community Standards, which can be consulted at the following link: https://www.facebook.com/communitystandards/. This policy applies to both public and private groups. "Some groups may break our rules, and when an entire group violates the Community Standards, we remove it. However, deciding whether an entire group should be removed is often complex, because groups often have tens, hundreds or even thousands of members and publications, many of which do not violate our rules," the social network said.
To ensure that groups do not violate the established rules, Facebook says it does everything possible so that no abuse is tolerated on its platform. It therefore takes into account various factors, from the group's name and description to the content published by its members. If the group's description amounts to hate speech and the majority of its publications go in that direction, the social network may consider imposing sanctions. "If the group doesn't cross that line, it will stay in place, but we will continue to delete individual posts," Facebook said.