Facebook is revising its policies. Today the social network announces changes concerning hate speech and content tied to terrorism, violence and extremism. The changes also apply to Instagram.
Facebook's new policy on hate and extremism
In detail, some wording in the Community Standards changes under the Dangerous Individuals and Organizations section, which now covers terrorism, organized hate, mass or serial murder, human trafficking, and organized violence or criminal activity. As Facebook explains, some of the changes follow from events of recent months, most notably the Christchurch shootings in New Zealand, when the attacker shared the massacre via live stream and it went viral almost immediately.
Facebook relies primarily on automated systems, largely based on content analysis and machine learning algorithms, to identify posts and content to be banned in near real time and without manual intervention. The technology has evolved over time, but it still cannot be considered perfect.
The company then notes that although terrorism is a global problem, the interpretation of the term can vary from place to place. For this reason, Facebook underlines its intention to keep collaborating with organizations around the world so that the necessary and legitimate fight against extremism does not come at the expense of freedom of speech and expression. It will do so by engaging with organizations and community members, collecting feedback and adjusting its approach as needed. A team of 350 people is dedicated to the task.
Recently the topic has also been at the center of a heated debate in Italy, triggered by Facebook's decision to ban CasaPound from the platform for incitement to hatred. The move was grounded in the need to enforce rules established to protect a single global community of more than two and a half billion people.