TikTok, the fast-growing social network that has drawn in everyone from teenagers to politicians, is once again at the center of a delicate question: can the team that moderates its community be accused of behavior that amounts to discrimination?
TikTok: the thin line between moderation and discrimination
It all stems from the leak of internal documents describing how ByteDance, the Chinese company that controls the platform, tries to "protect" those it considers the most vulnerable members of its user base: people with disabilities, queer users, and young people with weight problems. These categories are labeled "Special", and moderators are instructed to intervene so that posts from these users reach only a limited audience.
What is the rationale behind such a practice? The official position is that it stems from a desire to protect those most exposed to bullying. The material obtained by the site Netzpolitik cites "images depicting a subject highly vulnerable to cyberbullying" and users "susceptible to harassment or cyberbullying based on their physical or mental conditions". Each user is assigned a risk level: in the case of disability, for example, it is 4 (presumably on a scale of 1 to 5). The guidelines given to moderators also include a directive to apply the restrictions in cases of "facial disfigurement", "autism" and "Down syndrome".
The content in question is prevented from reaching a wide audience: it stops appearing in other members' feeds after reaching 6,000 or 10,000 views, depending on the tag applied.
As noted at the outset, TikTok defends itself by stating that its intention is to protect the most vulnerable members of its community. First of all, however, one has to wonder how a moderator is supposed to recognize a neuropsychiatric condition from a few-second video shot on a smartphone.
Such a move has all the appearance of a preventive measure designed to eliminate a problem (bullying) at the root, so as to avoid ever having to deal with its consequences. It is an issue other social networks have also had to confront. There are no loopholes or shortcuts: the actions of bullies and haters must be opposed firmly and actively, not by starting from the assumption that the mere possibility of abuse justifies penalizing those identified as potential victims.
The road taken by TikTok strikes us as the quickest way to address a delicate dynamic, but also the most wrong for a global platform growing at the dizzying pace of ByteDance's, which could and should instead treat inclusion and diversity as added values, not a burden to be hidden away. The line separating protection from discrimination is very thin indeed.