A study has revealed that TikTok is pushing 13-year-olds towards videos containing sexual acts, drugs and alcohol.
The social media platform uses an algorithm to decide which videos users will see, based on what they have watched in the past.
This means users will see content related to their previous searches, the videos they have watched the most, and the most-viewed clips.
But the algorithm does not change for young users, The Wall Street Journal (WSJ) found.
When a fake "13-year-old user" searched for "OnlyFans", they were able to watch a handful of videos – including two containing pornography.
The same teen account was also shown a series of sex-themed clips on the platform's "For You" page – where the content is tailored to each user.
In its study, the WSJ created 31 TikTok bot accounts with ages ranging from 13 to 15 to determine whether younger viewers received different feeds than older TikTok users.
But regardless of their age, the more sexual content the teen accounts watched, the more of it they were served.
Through constant curation by the AI-driven algorithm, the researchers found, the feeds became more narrowly focused and increasingly weighted toward inappropriate content.
One account, registered as a 13-year-old, was shown 569 videos about drug use, including references to cocaine and meth addiction.
They were also shown videos of online drug sales.
In total, the WSJ's accounts were shown more than 100 videos promoting adults-only accounts and sex shops.
They were even fed clips encouraging eating disorders, promoting alcohol use, and other adult content.
WSJ reporters flagged to TikTok about 1,000 videos showing drugs, pornography and other adult content that had been served to their 13- to 15-year-old bot accounts.
Of the clips they reported, 255 were removed.
A spokesman told the WSJ that the firm had removed some of the reported videos and restricted the distribution of others so they would not be recommended to young users in the future – but it was not clear how many.
TikTok said it does not currently differentiate between the videos offered to adult and child accounts, but is working on a new filtering tool for younger users' accounts.
Under its Terms of Service, users must be at least 13, and those under 18 need parental permission.
"Protecting minors is critical, and TikTok has taken industry-leading steps to promote a safe and age-appropriate experience for teens," TikTok said.
Each of the 31 bot accounts used in the WSJ investigation was given only a date of birth and an IP address, with no other details provided.
A dozen of the accounts' feeds quickly became dominated by a particular theme, as the algorithm curated their For You pages.
TikTok told the Journal that the bots' behavior "in no way represents real human behavior and viewing experience."
But that does not change the volume of adult videos served to accounts registered as 13 to 15 years old.
A TikTok spokesman said: "While the activity and resulting experience of these bots in no way represents the behavior and viewing experience of a real person, we continually work to improve our systems, and we are reviewing how to help prevent even highly unusual viewing habits from creating a negative cycle, particularly for our younger users.
"We care deeply about the safety of minors, which is why we build youth well-being into our policies, limit features by age, empower parents with tools and resources, and continue to invest in new ways to serve content based on age-appropriateness or family comfort."