A recent study has found that YouTube’s algorithm can push violent or graphic videos to young children without them actively searching for such content.
YouTube’s video algorithm has been a point of contention for quite some time. The mysterious inner workings of the platform’s recommended feed dictate what content is pushed to viewers and, in turn, what content gains popularity.
Like those of most other social media platforms, YouTube’s algorithm takes into account what content a user is consuming before offering them similar content in order to keep them engaged. At least, that’s the intention. Often, this means tuning a user’s suggested videos to cater to their particular hobbies and interests.
However, a new study suggests there may be a darker side to how YouTube’s algorithm functions. Research conducted by the Tech Transparency Project found that young children’s YouTube suggestions were being flooded with graphic videos about school shootings, gun training, and more.
Study finds YouTube algorithm may be pushing violent content to children
The study created two YouTube accounts that simulated the behavior of nine-year-olds who like video games. The accounts were identical, except that one clicked on videos recommended by YouTube whilst the other did not. The account that clicked on the platform’s suggestions was soon flooded with graphic content, the report claims.
The account that did not interact with the suggestions still received some gun-related videos, 34 in total. However, the account that did engage with the suggested videos received 382 different firearm-related videos in a month, or around 12 per day, according to the research. The study also created accounts that mimicked 14-year-old boys and found similar results.
The findings have drawn criticism of YouTube’s algorithm, which a spokeswoman for the platform defended in a statement to AP News. She noted that the platform requires users under the age of 17 to get their parent’s permission before using the site, and that accounts for users under 13 are linked to a parental account.
“We offer a number of options for younger viewers… which are designed to create a safer experience for tweens and teens.”
The conversation about algorithms has also spread to TikTok, which has similarly defended its site by stating that its policies prohibit users under the age of 13.