TikTok's algorithm serves sexual and drug content to minors

Bot accounts posing as minors were shown up to 569 TikToks related to drugs and sex

TikTok's algorithm, the system that personalizes the content in each user's feed on the short-video social network, shows content to minors without restriction. The result is constant, uncontrolled exposure of younger users to TikToks of a sexual nature or with references to drugs.

For context, content-oriented social networks such as YouTube and TikTok use a set of algorithms to infer user preferences and increase interactions and time spent on the platform. In TikTok's case, the main signal for inferring these preferences is how long a user lingers on a given video.
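To make the mechanism concrete, here is a minimal toy sketch of watch-time-based preference scoring. This is purely illustrative and in no way TikTok's actual system; all function names and the scoring rule are assumptions for the example.

```python
# Toy model of watch-time-based preference scoring (hypothetical,
# NOT TikTok's real algorithm): the longer a user lingers on videos
# of a topic, the higher that topic ranks in future recommendations.
from collections import defaultdict

def update_preferences(prefs, video_topic, watch_seconds, video_length):
    """Boost a topic's score by the fraction of the video watched."""
    completion = min(watch_seconds / video_length, 1.0)
    prefs[video_topic] += completion
    return prefs

def rank_topics(prefs):
    """Topics the user lingered on longest rank first."""
    return sorted(prefs, key=prefs.get, reverse=True)

prefs = defaultdict(float)
update_preferences(prefs, "cooking", 3, 30)    # skipped quickly
update_preferences(prefs, "fitness", 28, 30)   # watched almost fully
print(rank_topics(prefs))  # fitness ranks above cooking
```

Under a rule like this, a minor who lingers even briefly on borderline content signals "interest," and the feed compounds that signal, which is exactly the feedback loop the WSJ experiment probed.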

The situation came to light through an investigation by the American newspaper The Wall Street Journal (WSJ), which created a total of 31 bot accounts registered with ages between 13 and 15. Each account was programmed to act on a very narrow set of preferences, including drugs and sexual content.

The results were stark: after only brief exposure to this type of content, the algorithm showed one of the underage bot accounts 569 drug-related videos. Another account was shown more than 100 posts promoting porn sites and sex shops. Other potentially dangerous videos glorified unhealthy lifestyles such as alcoholism and eating disorders.

Nearly 2,800 posts labeled as adults-only were shown to these accounts, despite TikTok having several policies meant to prevent overexposure of adult videos to underage users. Creators can mark their TikToks as exclusive to people over the age of 18, yet many videos recorded in the experiment carried this restriction and were shown to the minor accounts anyway. The company also has a moderation team of some 10,000 people in charge of reviewing posts that potentially violate the terms of use.

However, the platform's rapid growth has greatly complicated content moderation: the number of users in the United States has quadrupled since 2019 alone. In addition, determining whether a video violates the rules requires contextual judgment that automated systems cannot reliably provide. On the other hand, there is a risk of overmoderating content that, in the eyes and judgment of an adult, poses no danger.

Following these findings, TikTok's parent company ByteDance removed 616 of these videos, about half of those sampled by the WSJ's bot accounts. Moderation will never be complete, and the presence of sexual and drug-related content on social networks is inevitable. Even so, it is important to pay attention to how this technology encourages certain habits, and for guardians to be more vigilant about minors' consumption habits on the Internet.