TikTok has announced measures to tackle "dangerous" trends and hoax videos on the platform after conducting research into the impact such content has on users.
Although millions have grown to enjoy TikTok for its huge collection of viral videos across a variety of categories, there's no denying that the platform has seen its fair share of controversial and outright dangerous trends in recent years.
There have been several instances of children getting injured as a result of trends involving things like fire, leading to a wave of backlash against the platform.
In September, TikTok even had to ban the viral 'Devious Lick' trend outright, which saw teenagers stealing items from their schools and even attempting to hit their teachers, after it blew up online.
On November 17, they released a post addressing how they aim to tackle some of the more dangerous content on the platform after conducting research into the effect of some trends on young users.
They revealed that 'self-harm hoaxes' attempting to spread false but frightening information were having an impact on teens' wellbeing, explaining: "31% of teens exposed to these hoaxes had experienced a negative impact. Of those, 63% said the negative impact was on their mental health."
According to their research, 0.3% of teens say they have taken part in dangerous challenges, and they went on to explain how they are using "strong policies" coupled with "strong detection and enforcement measures" to protect the community.
“We created technology that alerts our safety teams to sudden increases in violating content linked to hashtags, and we have now expanded this to also capture potentially dangerous behavior,” they stated.
“For example, a hashtag such as #FoodChallenge is commonly used to share food recipes and cooking inspiration, so if we were to notice a spike in content tied to that hashtag that violated our policies, our team would be alerted to look for the causes of this and be better equipped to take steps to guard against potentially harmful trends or behavior.”
They added that they will now display additional resources when people search for content relating to self-harm or suicide.