The social media app that took the world by storm in recent years was found to be serving videos featuring inappropriate content to underage users. These videos featured content about drugs, sex, eating disorders, and more, and the app's recommendation algorithms serve them to millions of minors every day.

The Wall Street Journal published a report on its investigation of the social media app, titled "How TikTok Serves Up Sex and Drug Videos to Minors." For the investigation, the WSJ created bot accounts disguised as users aged 13 to 15, assigning each a date of birth and a unique IP address, to find out what types of content were being served to this age group on the app. The bot accounts were exposed to videos featuring inappropriate content about porn, drugs, sexual roleplay, eating disorders, and more.

For this investigation, WSJ set up over 100 TikTok accounts, 31 of which were registered as users aged 13 to 15. Each account was assigned interests in the form of keywords and machine-learning image classifications, and the bot would dwell on any video that matched its assigned interests.
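The WSJ has not published its bot code, so the following is only a minimal sketch of how such a dwell-based test bot might work; the names (Video, Bot, watch) and the dwell durations are hypothetical.

```python
# A minimal sketch of a dwell-based test bot. All names and numbers here
# are hypothetical; the WSJ has not published its actual tooling.
from dataclasses import dataclass, field

@dataclass
class Video:
    video_id: str
    tags: set[str]       # keywords or image-classifier labels for the clip
    duration_s: float    # length of the clip in seconds

@dataclass
class Bot:
    birth_year: int            # account registered as a 13-to-15-year-old
    interests: set[str]        # assigned interest keywords
    watch_log: list = field(default_factory=list)

    def watch(self, video: Video) -> float:
        """Dwell on videos matching assigned interests; skip the rest."""
        if self.interests & video.tags:
            dwell = video.duration_s   # watch the whole clip
        else:
            dwell = 1.0                # swipe away almost immediately
        self.watch_log.append((video.video_id, dwell))
        return dwell

# Example: an account registered as a 13-year-old with one assigned interest.
bot = Bot(birth_year=2008, interests={"fitness"})
bot.watch(Video("v1", {"fitness", "gym"}, 30.0))   # dwells for the full 30 s
bot.watch(Video("v2", {"cooking"}, 25.0))          # swipes away after 1 s
print(bot.watch_log)
```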

"TikTok served one account registered as a 13-year-old at least 569 videos about drug use, references to cocaine and meth addiction, and promotional videos for online sales of drug products and paraphernalia," the report revealed. "Hundreds of similar videos appeared in the feeds of the Journal's other minor accounts."

The investigation also revealed that TikTok showed the WSJ's teen bot accounts "more than 100 videos from accounts recommending paid pornography sites and sex shops" and thousands more from creators who labeled their content as "for adults only."

The WSJ shared 974 of the videos with TikTok, informing the company that these videos with sensitive content were "served to the minor accounts - including hundreds shown to single accounts in quick succession."

Of the 974 videos, 169 were removed from the site before WSJ shared them with TikTok, though it was unclear whether they were removed by the social media app or by their creators. Another 255 videos were removed after being shared; dozens of these showed adults acting as "caregivers" entering relationships with adults pretending to be children.

Dozens of videos that their creators had labeled as for mature audiences only were nonetheless served to the bot accounts registered as minors, a major oversight on TikTok's part. An earlier video investigation, also by WSJ, showed that TikTok needs only one signal to determine what a user wants to see: how long the user lingers over a piece of content, in this case how long he or she watches a video. Every swipe away, and every lingering view through to the end of a video, is used to determine what videos are served next.

"Through that one powerful signal, TikTok can learn your most hidden interests and emotions, and drive users of any age deep into rabbit holes of content-in which feeds are heavily dominated by videos about a specific topic or theme," the earlier report said.

"It's an experience that other social-media companies like YouTube have struggled to stop."

According to the Christian Post, the National Center on Sexual Exploitation placed TikTok on its annual Dirty Dozen List, which counts down companies and entities believed to be profiting from sexually exploitative content. NCOSE Communications Director Jake Roberson cited the lack of safety controls and proper content moderation, which enabled the social media app to become a "space for sexual grooming by abusers and sex traffickers."