
Communist China-controlled TikTok Encourages Suicide, Helps Kids Buy Drugs

Neil Campbell
Neil lives in Canada and writes about society and politics.
Published: March 21, 2023
Research found that the Chinese Communist Party-backed TikTok is encouraging 13-year-olds to commit suicide and facilitating drug sales
A file photo of the TikTok logo on a building in California on March 16, 2023. New research using a series of accounts reporting their age as only 13 years old found that the Chinese Communist Party-backed short video app TikTok is strongly pushing suicide and drug sales content. (Image: PATRICK T. FALLON/AFP via Getty Images)

TikTok, the Chinese Communist Party-backed social media platform used by almost half of Americans, has come under increasing scrutiny. A recent Vice feature criticizes the app for pushing information about suicide methods to 13-year-olds and for allowing accounts that advertise Telegram channels selling drugs via international shipping.

Vice’s reporting is based on the content of a March 2023 report by Eko, described as a “corporate accountability group.”


The 23-page report states that a cascading epidemic of depression and suicidal behavior plaguing youth can be attributed to the “contributing role of social media platforms like TikTok, Instagram, and Snapchat” in what it describes as a “mental health crisis.”

Eko focuses on TikTok, noting that of its 1 billion monthly users, a third are estimated to be aged 13 or younger.

The group found that TikTok’s data collection and analysis methodology “is highly precise in tracking a user’s actions, swipes, and movements, including factors such as likes, comments, and time spent watching content to calibrate its algorithm and funnel content to users through its For You Page.”
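
To illustrate the mechanism Eko describes, here is a minimal, hypothetical sketch of an engagement-weighted feed, written in Python. It is not TikTok’s actual code: the `InterestProfile` class, the signal weights, and the scoring formula are assumptions made only to show how watch time, likes, and comments can compound into increasingly narrow recommendations.

```python
# Hypothetical sketch of an engagement-weighted recommender.
# The weights and data structures are illustrative assumptions,
# not a description of TikTok's actual algorithm.
from collections import defaultdict

# Assumed relative weights for each engagement signal.
SIGNAL_WEIGHTS = {"watch_seconds": 0.05, "like": 1.0, "comment": 1.5, "share": 2.0}


class InterestProfile:
    """Tracks per-topic affinity scores inferred from passive engagement."""

    def __init__(self):
        self.affinity = defaultdict(float)

    def record(self, topic, signal, amount=1.0):
        """Update a topic's score from a single engagement signal."""
        self.affinity[topic] += SIGNAL_WEIGHTS.get(signal, 0.0) * amount

    def rank_feed(self, candidate_videos):
        """Order candidate videos so the highest-affinity topics surface first."""
        return sorted(candidate_videos,
                      key=lambda v: self.affinity[v["topic"]],
                      reverse=True)


# Example: merely lingering on one topic (no likes or comments needed)
# is enough to push similar content to the top of the feed.
profile = InterestProfile()
profile.record("sad_content", "watch_seconds", amount=40)  # watched for 40 seconds
profile.record("sports", "watch_seconds", amount=3)         # scrolled past quickly

feed = profile.rank_feed([{"id": 1, "topic": "sports"},
                          {"id": 2, "topic": "sad_content"}])
print([v["topic"] for v in feed])  # ['sad_content', 'sports']
```

Because passive signals such as watch time feed straight back into ranking, even a viewer who never likes or comments can be steered deeper into a single category, which is the “rabbit hole” dynamic both Eko and the Wall Street Journal describe.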

The results mirrored a July 2021 investigation by the Wall Street Journal, which used a series of automated accounts with different ages and geographical locations to show that after just 40 minutes of use, TikTok began trapping users in a “rabbit hole.”

Downward spiral

Owned by Chinese company ByteDance, TikTok is the international version of Douyin. The two apps have very similar branding and functions, but TikTok is off-limits to internet users in mainland China, and Douyin promotes educational or uplifting content in line with the CCP’s “positive energy” social policies — while the company does the exact opposite overseas.

TikTok relied on no more information than which videos the bots scrolled past or watched in order to curate content, such as pornography and posts about sadness over romantic relationships, to match the accounts’ internally inferred interests.

Eko used a similar methodology, deploying bot accounts posing as 13-year-olds, and was able to uncover “a network of harmful suicide, incel, and drug content easily accessible to a 13-year-old account, some of which can be found in as little as three clicks.”

Findings were alarming. Eko discovered that the app’s For You page “automatically served up highly viral and dangerous suicide content including videos with guns being loaded and text suggesting suicide, alongside comments listing users’ exact dates for their own self-harm or suicide.”

The hashtag system was heavily employed to curate content, the group found, noting that the two most popular, #sh (self-harm) and #imdone, had 926,000 and 230,000 videos respectively, which had been viewed over 6 million and 2 million times.

Not only did TikTok directly encourage suicide and self-harm, but the app also took the inculcation a step further by serving up a litany of “videos promoting dismal and otherwise hopeless content and commentary around death, toxic relationships, sexual abuse, depression, and domestic violence,” perhaps to encourage the despondency that leads to suicide.

TikTok also serves up content that directly tells viewers how to end their lives. The group found that “multiple videos were found on TikTok offering advice regarding at-home suicide with videos suggesting, ‘5 ways to end it without feeling any pain’.”

One video, they said, encouraged kids to create a “potentially lethal dose of salt and water in order to induce organ failure.”

Digital opium

Eko researchers also found TikTok is being used to directly traffic drugs, stating that after just a few hours of digging with an account declaring its age as 13 and the app’s basic search feature, they uncovered 18 accounts selling anything from cocaine to MDMA to marijuana.

They found that the accounts were encouraging viewers to join a Telegram channel that sold M30s, described as “the street name for imitation Oxycodone pills which have routinely contained fentanyl.”

The distribution of fentanyl in the United States is well established to be connected to the CCP’s United Front Work Department in cooperation with Mexican drug cartels.

Researchers joined the Telegram channel and found “an entire menu of illicit drugs for sale, including Percocet, Xanax, LSD, Cocaine, hydrocodone, marijuana, and MDMA” with international shipping available.

Maen Hammad, co-author of the report for Eko, told Vice in an email that “ten minutes and a few clicks on TikTok is all that is needed to fall into the rabbit hole of some of the darkest and most harmful content online.”

“The algorithm forces you into a spiral of depression, hopelessness, and self harm, and it’s terribly difficult to get out of that spiral once the algorithm thinks it knows what you want to see. And it’s extremely alarming to see how easy it is for children to fall into this spiral,” Hammad added.

Hammad’s statements were also consistent with the Wall Street Journal’s findings: once TikTok’s algorithm characterized accounts as having certain interests, it was impossible to shake off the types of content curated in response.

Vice said that a TikTok spokesperson claimed in November 2022 that the company would “work aggressively to combat hateful behavior by proactively removing accounts and content that violate our policies” after an earlier investigation found the app was pushing the “incel” subculture, which revolves around the grievances of people who cannot, or feel they cannot, form intimate relationships. “Incels” often turn to outright misogyny in response to their frustration.