
Investigation Reveals TikTok Promoted Videos of Sex and Drugs to Minors

Arvind Datta
Arvind is a recluse who prefers staying as far away from the limelight as possible. Be that as it may, he keeps a close eye on what's happening and reports on it to keep people rightly informed.
Published: September 13, 2021
An Indian mobile user browses through the Chinese-owned video-sharing TikTok app on a smartphone in Amritsar on June 30, 2020. (Image: NARINDER NANU/AFP via Getty Images)

With over 60 million users in the United States, TikTok has become exceedingly popular among young people, offering an assortment of short videos spanning topics such as dance, comedy, sports, and education.

However, a recent investigation by the Wall Street Journal found that the app also serves young users videos that are likely to make parents uncomfortable. Accounts registered to minors as young as 13 were shown an endless stream of content about sex and drugs.

To find out what kind of content users between the ages of 13 and 15 are exposed to on the platform, the media outlet created 31 fake accounts and studied their “For You” feeds. The bot accounts were quickly directed to hundreds of videos featuring adult content.

One account registered to a 13-year-old was shown more than 500 videos about drug use, including references to cocaine and meth addiction and promotional videos for online drug sales. Hundreds of similar videos appeared in the feeds of the other fake accounts as well.

“TikTok also showed the Journal’s teenage users more than 100 videos from accounts recommending paid pornography sites and sex shops. Thousands of others were from creators who labeled their content as for adults only,” the Journal wrote.

Some of the content the Journal encountered is banned outright under TikTok’s own community guidelines. The publication shared 974 videos with TikTok, ranging from drug content to pornography, as well as accounts that touched on topics such as eating disorders and alcohol use.

A TikTok spokeswoman said that the majority of the videos did not violate its guidelines. She told the Journal that the company deleted some of the videos after the fake accounts viewed them, and that it restricted the distribution of others to prevent the app from recommending them to more users, according to the exposé.

However, the app does not distinguish between the videos it recommends to adults and those it recommends to minors, the spokeswoman said. The company is reportedly working on a tool that filters content for minors.

A call for protection

According to TikTok’s terms of service, users must be at least 13 years of age, and those under 18 need parental approval to use the app.

“Protecting minors is vitally important, and TikTok has taken industry-first steps to promote a safe and age-appropriate experience for teens,” the spokeswoman said in a statement. Minors’ accounts can also be managed and monitored by parents, she added.

A number of the accounts were shown videos with links inviting users to sign up for OnlyFans.com, a subscription-based site featuring pornography and sexual content. However, a TikTok spokesperson told the Journal that the app deletes links directing users to sexual services, including OnlyFans.

TikTok tracks user activity across the app, including how long a viewer lingers on each video, and feeds that data into its recommendation system to keep consumers engaged.
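This watch-time signal is what the Journal’s bot accounts exploited: the longer an account dwelt on a video, the more similar content the feed served. As a rough illustrative sketch only, assuming a simple topic-based interest profile weighted by completion rate rather than TikTok’s proprietary and undisclosed model, a watch-time-driven recommender might work roughly like this:

```python
from collections import defaultdict

def update_interest_profile(profile, video_topics, watch_seconds, video_length):
    """Boost the profile's interest in a video's topics in proportion to
    how much of the video was watched (completion rate).

    Illustrative sketch only; not TikTok's actual, proprietary algorithm.
    """
    completion = min(watch_seconds / video_length, 1.0)
    for topic in video_topics:
        profile[topic] += completion
    return profile

def rank_candidates(profile, candidates):
    """Order candidate videos by how strongly their topics match the profile."""
    return sorted(
        candidates,
        key=lambda video: sum(profile[t] for t in video["topics"]),
        reverse=True,
    )

# A viewer who skims a dance clip but watches an adult-themed clip to the end
# ends up with a feed dominated by the latter topic.
profile = defaultdict(float)
update_interest_profile(profile, ["dance"], watch_seconds=3, video_length=30)
update_interest_profile(profile, ["adult"], watch_seconds=30, video_length=30)

candidates = [
    {"id": 1, "topics": ["dance"]},
    {"id": 2, "topics": ["adult"]},
]
print([v["id"] for v in rank_candidates(profile, candidates)])  # [2, 1]
```

Even in this toy version, a single fully watched video on one topic quickly outweighs briefly skimmed videos on everything else, which mirrors how the Journal’s accounts were funneled toward adult content.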

The National Center on Sexual Exploitation (NCOSE) had previously placed TikTok on its 2020 Dirty Dozen List and called on authorities to institute more precautionary measures to protect children from online predators.