The operation of TikTok’s algorithm is probably the company’s best-kept secret. The Wall Street Journal (WSJ), however, believes it has managed to work out how the algorithm operates, using a large number of bot accounts. The results are both impressive and disturbing.
A former Google engineer who worked on YouTube’s algorithm says the U.S. video-sharing site uses essentially the same approach, only in a less extreme form.
Officially, TikTok says its recommendations draw on four signals: what you watch, what you share, what you like and what you follow. The analysis suggests, however, that one of the four is by far the most important. The WSJ published a 13-minute video presenting the results.
It turned out that TikTok needs only one of the four to learn everything it wants: how long a given user lingers over a piece of content.
The app tracks every second, every hesitation, every rewatch. Through this single powerful signal, TikTok learns its users’ most hidden interests and emotions, and draws them so deep into a rabbit hole of content that it is hard to climb back out.
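A recommender built around this idea might score each view by comparing watch time to the video’s length and rewarding rewatches. The sketch below is purely illustrative; the function name, the rewatch bonus, and the weighting are all assumptions, not TikTok’s actual code:

```python
# Illustrative sketch (not TikTok's real code): scoring a single view
# using only watch time and rewatches, the signal the WSJ highlights.

def engagement_score(watch_seconds: float, video_seconds: float,
                     rewatches: int) -> float:
    """Score one view: 1.0 means the video was watched to the end once;
    rewatches push the score higher (the 0.5 bonus is hypothetical)."""
    completion = min(watch_seconds / video_seconds, 1.0)
    return completion + 0.5 * rewatches

# A quick scroll-past barely registers; a lingering rewatch stands out.
print(engagement_score(2.0, 30.0, 0))   # scrolled past after 2 seconds
print(engagement_score(30.0, 30.0, 2))  # watched fully, then twice more
```

The asymmetry is the point: a user never has to like, share or follow anything for the system to read their state of mind.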
The TikTok experience begins the same way for everyone. When you open the app, you immediately see an endless stream of videos in the For You feed. At first the service offers a new account a selection of very popular videos vetted by the app’s moderators, and uses the reactions to these to identify the user’s interests.
The WSJ programmed its bots with an age, a location, and a set of specific interests. These interests were never entered into the app; they served only as the basis for choosing which videos the bot would watch. Each bot scanned every video in its feed for hashtags or AI-identified imagery related to its interests, stopped scrolling to watch the matching videos, and rewatched some of them.
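The bot logic described above can be sketched in a few lines. Everything here is hypothetical (the interest set, the function names, the decision labels); it only illustrates the match-then-linger behavior the WSJ describes:

```python
# Illustrative sketch of WSJ-style bot logic (all names hypothetical):
# scan a video's hashtags for the bot's hidden interests and decide
# whether to linger on it or keep scrolling.

BOT_INTERESTS = {"sad", "depression", "anxiety"}  # example interest set

def matches_interest(hashtags: list[str], interests: set[str]) -> bool:
    """True if any hashtag overlaps with the bot's assigned interests."""
    return any(tag.lstrip("#").lower() in interests for tag in hashtags)

def watch_decision(hashtags: list[str]) -> str:
    if matches_interest(hashtags, BOT_INTERESTS):
        return "watch"   # stop scrolling, possibly rewatch
    return "skip"        # scroll on without lingering

print(watch_decision(["#funny", "#dance"]))   # skip
print(watch_decision(["#sad", "#breakup"]))   # watch
```

The key design point mirrors the experiment: the interests live only in the bot’s code, never in any profile field, so anything the algorithm learns must come from watch behavior alone.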
The study found that the feed narrowed quickly: from popular videos about all kinds of topics, with huge view counts, toward lower-view videos focused on the interest the algorithm had identified. The results were analyzed by Guillaume Chaslot, a data scientist and algorithm expert who previously worked as a Google engineer on YouTube’s algorithm.
Chaslot is now an advocate for transparency in algorithms. According to him, TikTok is different from other social media platforms. “TikTok’s algorithm can become much stronger and can learn vulnerabilities much faster,” he said. Indeed, TikTok fully identified the interests of the bot accounts in under two hours; some were deciphered in less than 40 minutes.
One of the bots was programmed with the “interest” of sadness and depression. After less than three minutes of use, TikTok got its first indication that this new user might have been feeling depressed lately.
That single video gave the app plenty to work with: the video’s author, its audio track, its description, its hashtags. After the first sad video, TikTok served the next one 23 videos later, roughly four more minutes of viewing on: a breakup video tagged #sad. TikTok kept probing the new user with high-view-count videos, but at video 57 the bot again lingered over a video about heartache and resentment, and at video 60 over one about emotional pain.
Based on the videos watched so far, TikTok guessed that this user might want to see more about love, breakups, and dating. So, at around 80 videos and 15 minutes in, the app started serving more relationship content. But the user didn’t care; instead, it lingered over videos tagged #depression and videos about anxiety.
By video 224 of the bot’s journey, after about 36 minutes of total viewing time, TikTok had settled on its understanding of the user: videos about depression and struggles with mental health now outnumbered videos about relationships and breakups. From then on, the account faced a flood of depressive content, with 93 percent of the videos served to it about sadness or depression.
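This convergence can be mimicked with a toy simulation. The model below is entirely hypothetical and not TikTok’s algorithm: a feed that reweights topics by accumulated watch time, fed by a bot that lingers only on one topic, collapses onto that topic within a few hundred videos:

```python
# Toy simulation (not TikTok's algorithm): a feed that reweights topics
# by accumulated watch time quickly collapses onto one lingered-on topic.
import random

random.seed(0)
topics = ["dance", "comedy", "pets", "sadness"]
watch_time = {t: 1.0 for t in topics}  # uniform prior over topics

def next_video() -> str:
    """Pick the next video's topic in proportion to accumulated watch time."""
    weights = [watch_time[t] for t in topics]
    return random.choices(topics, weights)[0]

for _ in range(224):  # the 224-video journey from the article
    topic = next_video()
    # the bot lingers only on "sadness" videos, skipping everything else
    watch_time[topic] += 10.0 if topic == "sadness" else 0.1

share = watch_time["sadness"] / sum(watch_time.values())
print(f"share of feed weight on sadness: {share:.0%}")
```

The feedback loop is the mechanism: each lingered-on video raises the odds of serving more of the same, which produces more lingering, so a small early signal snowballs into a near-monoculture feed.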
TikTok says the bots are not representative of real users, but even bots programmed with diverse interests saw their feeds narrow very quickly.
Since TikTok is hugely popular, especially among teenagers, this is clearly cause for concern. Someone who is depressed can easily become even more depressed while watching a stream of such content, and for a person drawn to conspiracy theories, the feed can create the impression that such views are mainstream. An algorithm like this is likely to steer those who hold extreme views toward ever more extreme content.
TikTok does not appear to prioritize its users’ mental health over engagement, and Chaslot says YouTube does something similar, only in a less extreme form.