A new investigation from the Wall Street Journal this week revealed that TikTok’s algorithms routinely serve up sexual content, as well as content featuring drugs and alcohol, to children as young as 13.
Using the app under the guise of a 13-year-old user who searched for the term “onlyfan”, the investigators found that TikTok even presented the account with two instances of pornography being offered for sale.
Sexually oriented videos were also included in TikTok’s “For You” page, the investigation revealed. Content on this page, which is similar to Twitter’s timeline, is based on a user’s prior searches and the content they have spent the most time on, The Daily Mail added.
As the “13-year-old” user browsed more sexual content, more of it was recommended, despite the account’s age being clearly marked as 13 in its profile.
TikTok has responded by saying it is “working on a new filter tool for younger accounts,” the Daily Mail wrote.
The investigation used 31 bot accounts registered with ages between 13 and 15 to determine what type of content younger TikTok users were being shown. It revealed that the accounts’ feeds eventually became focused on increasingly inappropriate content.
The Journal wrote: “Even when the Journal’s accounts were programmed to express interest in multiple topics, TikTok sometimes zeroed in on single topics and served them hundreds of videos about one in close succession.
“TikTok served one account, which had been programmed with a variety of interests, hundreds of Japanese film and television cartoons. In one streak of 150 videos, all but four featured Japanese animation—many with sexual themes.”
One account that was set up to appear as though it was being used by a 13-year-old was shown 569 videos about drug use, including cocaine and meth addiction. It was also shown promotional videos for the online sale of drug products.
Other videos featured alcohol and eating disorders.
Reporters from the WSJ sent TikTok more than 1,000 videos showing the inappropriate content that was served to the teenage accounts. The platform removed 255 of them, including several videos about adults entering relationships with other adults who were pretending to be children.
TikTok responded: “Protecting minors is vitally important, and TikTok has taken industry-first steps to promote a safe and age-appropriate experience for teens.”
A company spokesperson told The Mail: “While the activity and resulting experience of these bots in no way represent the behaviour and viewing experience of a real person, we continually work to improve our systems and we’re reviewing how to help prevent even highly unusual viewing habits from creating negative cycles, particularly for our younger users.”
The company continued: “We care deeply about the safety of minors, which is why we build youth well-being into our policies, limit features by age, empower parents with tools and resources, and continue to invest in new ways to enjoy content based on age-appropriateness or family comfort.”