The Chinese Communist Party-controlled, video-based social influencing app TikTok learns a user’s desires and attachments in as little as 40 minutes before it begins bombarding them with curated content designed to keep viewers fixated, according to a new investigation by The Wall Street Journal.
In a July 21 report, WSJ created more than 100 automated TikTok bot accounts programmed with different likes and interests in an attempt to unravel what makes the influencing app’s algorithm tick. The bots watched hundreds of thousands of videos combined, providing TikTok with only an IP address, to randomize geographical location, and a date of birth as information about who each user was.
WSJ points out that, officially, TikTok claims shares, likes, follows, and videos watched all determine the content the app shows you. However, the experiment found only one of those variables mattered: how long, and how many times, a user lingers on a specific video.
“Every second you hesitate or re-watch, the app is tracking you,” says the video report. “Through this one powerful signal, TikTok learns your most hidden interests and emotions and drives you deep into rabbit holes of content that are hard to escape.”
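The mechanism WSJ describes, where lingering and rewatching feed an interest profile while likes and shares barely register, can be illustrated with a toy scoring function. This is a purely hypothetical sketch: the function names, weights, and hashtag-based profile are invented for illustration and are not TikTok’s actual system.

```python
# Toy illustration of dwell-time-based interest profiling, as described in
# the WSJ report. All names and weights here are hypothetical assumptions.

def engagement_score(watch_seconds, video_length, rewatches, paused):
    """Score a single viewing: full watches and rewatches dominate."""
    completion = min(watch_seconds / video_length, 1.0)  # fraction watched
    score = completion + 1.5 * rewatches                 # rewatching weighs heavily
    if paused:
        score += 0.5                                     # hesitation is also a signal
    return score

def update_interest_profile(profile, hashtags, score):
    """Accumulate per-hashtag affinity from one viewing's engagement signal."""
    for tag in hashtags:
        profile[tag] = profile.get(tag, 0.0) + score
    return profile

profile = {}
# The bot watches a 15-second #sad video twice all the way through:
s = engagement_score(watch_seconds=30, video_length=15, rewatches=1, paused=False)
update_interest_profile(profile, ["#sad", "#heartbroken"], s)
# It skips a home-repair video after 2 seconds:
s = engagement_score(watch_seconds=2, video_length=20, rewatches=0, paused=False)
update_interest_profile(profile, ["#homerepair"], s)
print(profile)  # "#sad" now far outweighs "#homerepair"
```

Under this sketch, no like or share is ever recorded; the profile is built entirely from how the user’s time was spent, which matches the Journal’s finding that watch time alone drove the recommendations.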
The journey of a depressed 24-year-old robot from Kentucky
The video report chronicles the journey of one bot account, identifying as a genderless 24-year-old from Kentucky, dubbed @kentucky_96 by WSJ for easy reference. The bot was given a different name in the app. While each bot account was assigned different interests, no tags were entered into the app. The only input the bots provided was through watching, rewatching, or pausing on videos with hashtags related to their assigned interests.
WSJ says for new users who provide little information about themselves, the app displays a wide array of highly popular, moderated videos in an attempt to discover what they’re truly interested in.
The initial videos displayed to each account had an average of 6.31 million views. As the algorithm determined the bots’ tastes based on their interaction activity, the average view count dropped to 0.78 million as the app served more and more niche content, much of which was no longer moderator-approved.
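The pattern WSJ observed, broadly popular moderated videos first, then progressively narrower content once a profile forms, resembles a classic explore-then-exploit loop. The sketch below is an invented illustration of that funnel; the catalog fields, threshold, and selection logic are assumptions, not TikTok’s real recommender.

```python
import random

# Hypothetical sketch of the broad-to-niche funnel WSJ observed: serve popular,
# vetted videos to a new user, then shift toward niche content matching the
# inferred profile. Catalog fields and thresholds are invented for illustration.

CATALOG = [
    {"tags": {"#sad"}, "views": 120_000, "moderated": False},
    {"tags": {"#epicfail"}, "views": 8_000_000, "moderated": True},
    {"tags": {"#homerepair"}, "views": 5_500_000, "moderated": True},
    {"tags": {"#pain", "#sad"}, "views": 90_000, "moderated": False},
]

def pick_next(profile, videos_served):
    """Early on, favor mass-popular moderated videos; once a profile has
    formed, favor whatever matches the user's strongest inferred interest."""
    if videos_served < 50 or not profile:          # cold start: explore broadly
        pool = [v for v in CATALOG if v["moderated"]]
        return max(pool, key=lambda v: v["views"])
    top_tag = max(profile, key=profile.get)        # exploit the profile
    pool = [v for v in CATALOG if top_tag in v["tags"]] or CATALOG
    return random.choice(pool)

print(pick_next({}, videos_served=0))               # a high-view moderated video
print(pick_next({"#sad": 9.0}, videos_served=224))  # a niche #sad video
```

Note how, once the exploit branch takes over, average view counts fall and moderation status stops being a filter, mirroring the drop from 6.31 million to 0.78 million average views in the experiment.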
WSJ says that while TikTok’s algorithm successfully determined most of the accounts’ interests in under two hours, some were sniffed out in as little as 40 minutes. The findings are especially significant in TikTok’s case, as WSJ noted that as much as 95 percent of the content TikTok users consume comes directly from its AI-powered recommendation engine.
For @kentucky_96, WSJ said its programmed interests were sadness and depression.
Only three minutes into the bot’s TikTok journey, the 15th video curated by the app comes from an account titled @shareyoursadness: a car driving down a snowy road at night, set to a sad song titled “Ofw mixed thoughts,” as an overlaid voice speaks in generalities about following the course of nature in human relationships. The video is captioned “Everything happens for a reason” and carries the hashtags #sad #heartbroken.
WSJ’s bot watches the video twice.
23 videos later, or about 4 minutes of usage for the Journal’s bot account, TikTok serves up a second video by the same @shareyoursadness on a very similar theme. This time, it’s a car driving down a rainy road at night with a very similar sad song titled “Eeeee – Briannah,” captioned “I care about your feelings more” with the hashtags #sad #f #viral. The voice over the music talks about breaking up, saying, “You know why I’ll leave you alone? Because I care more about your feelings than mine.”
After this algorithmic probing, the app once again curates a wide variety of high view count videos in different genres, such as friendship, epic fails, and even home repair. The app also showed the bot several Kentucky-specific videos related to mainstream SARS-CoV-2 pandemic narratives.
At video 57 of @kentucky_96’s journey, TikTok feeds the bot another piece of pro-depression bait from an account titled @boys._.only.08. The video is captioned “who agrees” and carries the hashtags #heartbreak #f #viral. A series of still photos, overlaid with the text “things girls do to make guys lose feelings (heartbreak emoji),” is backdropped by a basic R&B track.
Only three videos later, TikTok serves up a video by an account named @sadboycap captioned “Late night thoughts” with the hashtags #pain #sadboycap that is a black background overlaid with the text “You would be surprised how many times a guy sits in his car, his room, or the bathroom, holding back the tears because he’s confused, hurt, or lost.”
WSJ says the app then began serving up videos about relationships, but when it did not receive affirmative feedback from the user, it presented a video tagged with #mentalhealthmatters #pain #fyp #sadness by @staysimple.bry talking about men who cry.
The bot watched the video, yet skipped over others themed around missing a former lover or advice about moving on from a breakup.
The WSJ bot then spends time on another video, tagged #depression #foryou #quotezzz, featuring song lyrics that say “I think something’s wrong with me” over footage of a grey winter sky, which serves as the backdrop for a social media post reading “I’m f***** up you know?” from a user whose female-silhouette avatar wears devil horns. The bot also lingers on several other videos about anxiety and social anxiety.
WSJ found that through this process, TikTok’s algorithm had cemented its view of @kentucky_96 within 224 videos, amounting to a mere 36 minutes of total screen time. By this point, TikTok was serving an almost entirely pure mix of mental health and depression videos alongside relationship and breakup videos.
After the profiling was complete, the social influencing app completely rabbit holed @kentucky_96, burying it in 278 depression-related videos that amounted to 93 percent of all content curated.
WSJ says a TikTok spokesperson claimed the remaining 7 percent shown to the account consisted of “disruptive” videos allegedly intended to expose the user to other content it might be interested in. However, WSJ found @kentucky_96 was only shown advertisements outside of its depression-themed motif.
TikTok also claimed the experiment was irrelevant to real-world user experiences, positing that genuine humans have a wider base of interests. However, WSJ found that some of its remaining 99 bot accounts, though coded with a diverse set of interests, were likewise rabbit holed into curated content by the algorithm.
The only way for the experiment’s accounts to escape their respective niches was if programmers altered their interests. Once the accounts cut back on engagement with their already established themes, TikTok changed its game plan and began serving other content.
Notably, the Journal found that accounts with a general interest in politics were rabbit holed into videos about 2020 Presidential Election conspiracy theories and the QAnon disinformation campaign, while bots with interests of a sexual nature were often pushed into videos with unhealthy BDSM-themed narratives.
Concerningly, as WSJ’s bots naturally indulged their programmed interests and were pushed farther toward the extremes, researchers found the content curated by TikTok’s algorithm became increasingly dangerous, displaying videos encouraging bulimia, suicide, and prescription drug and alcohol abuse, as well as videos grooming children to respond to illicit sexual advances.