An internal TikTok document distributed among its moderators requested that content featuring people with "ugly faces", people who were "fat", or videos recorded in "messy houses" be kept out of the application's "For You" section, where content is recommended to users supposedly based on their tastes and their usage history in the app.
TikTok is once again accused of deliberately not promoting certain content
However, as a group of TikTok users reported a few weeks ago, the rules that guide the video social network, which is very popular among younger users, are more than questionable. According to the internal document, to which The Intercept has had access, people with "strange" body shapes or "many wrinkles" were less likely to have their content promoted than others with greater visual appeal.
A TikTok representative has acknowledged the existence of that document, although he points out that it is no longer in use and that it does not apply to the United States. Apparently, it was employed months ago in "another geographical location" by the company to prevent bullying, he said. However, the leaked document clearly states that if what appears in the image is not "visually appealing" it should not be recommended to users, because it will be uninteresting and they will not end up getting hooked on the application's content.
The truth is that this is not the first time TikTok has faced accusations of biasing the content it recommends to users. For example, it has been criticized for intentionally censoring LGBTQ content in some territories, although the company's response has always been the same: "they were outdated user guidelines that were used months ago, but the application has changed a lot since then." It has also been accused of restricting content related to riots in China, to the Tiananmen Square protests, or to any reference to Tibet or Taiwan.
Knowing how TikTok's algorithm works is all but impossible, since the company, like other social networks, does not offer clear explanations about it. Users have no choice but to test theories, such as the one holding that TikTok is guided by a curious batching policy. It is unknown whether this is really used, but what is known is that the app has its own human moderators, whom it trains according to the country they are in when it comes to selecting which content is displayed and which is not. The interests behind this? They are not known for certain.
UPDATE: TikTok has sent us its official position regarding the internal use guide:
"The Community Rules mentioned appear to be largely the same as, or similar to, what the Guardian already showed in a report last year, and they were removed before that report was published and also before the date on which The Intercept says it accessed the document.
In the past year, we have established Trust and Safety centers in California, Dublin and Singapore, which oversee the development and execution of our moderation policies and are led by industry experts with extensive experience in these areas. Local teams apply the updated Community Rules that we released in January, which are aimed at keeping TikTok a place of open self-expression and a safe environment for both users and creators.
Most of the rules cited by The Intercept are no longer in use, or in some cases appear never to have been in force, but it is correct that for live streaming, TikTok remains especially vigilant about keeping sexual content off the platform."