YouTube recommendations send violent gun videos to 9-year-olds, study finds

When researchers at a nonprofit studying social media wanted to understand the link between YouTube videos and gun violence, they created accounts on the platform that mimicked the behavior of typical boys living in the US.

They modeled two nine-year-olds who both liked video games, particularly first-person shooter games. The accounts were identical, except that one clicked on videos suggested by YouTube and the other ignored the platform’s suggestions.

The account that clicked on YouTube’s suggestions was soon flooded with graphic videos about school shootings, tactical gun training videos, and instructions on how to make firearms fully automatic.

One video showed an elementary school-age girl with a gun in her hand; another showed a shooter using a .50-caliber gun to fire on a dummy head filled with blood and brains. Many of the videos violate YouTube’s own policies against violent or gory content.

The findings suggest that despite YouTube’s rules and content moderation efforts, the platform is failing to stop the spread of horrific videos that could traumatize vulnerable children – or send them down dark paths of extremism and violence.

“Video games are one of the most popular activities for kids. You can play a game like ‘Call of Duty’ without ending up at the gun store — but YouTube is taking them there,” said Katie Paul, director of the Tech Transparency Project, the research group that published its findings about YouTube on Tuesday. “It’s not the video games, it’s not the kids. It’s the algorithms.”

Accounts that followed YouTube’s Suggested Videos received 382 different firearms-related videos in a single month, or about 12 per day. Accounts that ignored YouTube’s recommendations still received some gun-related videos, but only 34 in total.

The researchers also created accounts imitating 14-year-old boys who liked video games; those accounts also received a similar level of gun- and violence-related content.

One of the videos recommended to the accounts was titled “How a Switch Works on a Glock (Educational Purposes Only)”. YouTube later removed the video after determining that it violated its rules; an almost identical video popped up two weeks later under a slightly changed name, and that video remains available.

Messages seeking comment from YouTube were not immediately returned Tuesday. Officials at the Google-owned platform have said that identifying and removing harmful content is a priority, as is protecting its youngest users.

YouTube requires that users under the age of 17 obtain their parent’s permission before using its site; accounts for users under 13 are linked to a parental account.

Along with TikTok, the video sharing platform is one of the most popular sites for kids and teens. Both sites have been criticized in the past for hosting, and in some cases promoting, videos that encourage gun violence, eating disorders and self-harm. Critics of social media have also pointed to links between social media, radicalization and real-world violence.

Perpetrators of many recent mass shootings have used social media and video streaming platforms to glorify violence or even live stream their attacks. In posts on YouTube, the shooter behind the 2018 attack on a school in Parkland, Fla., that killed 17 people wrote “I want to kill people,” “I’m going to be a professional school shooter” and “I have no problem shooting a girl in the chest.”

The neo-Nazi gunman who killed eight people at a Dallas-area shopping center earlier this month also had a YouTube account that included videos of him assembling rifles, footage of the serial killer Jeffrey Dahmer, and a clip of a school shooting scene from a television show.

In some cases, YouTube has already removed some of the videos identified by researchers at the Tech Transparency Project, but in other cases the content remains available. Many big tech companies rely on automated systems to flag and remove content that violates their rules, but Paul said the report’s findings suggest that more investment in content moderation is needed.

Shelby Knox, campaign director for the advocacy group Parents Together, said that in the absence of federal regulation, social media companies can target young users with potentially harmful content designed to keep them coming back for more.

Knox’s group has called on platforms like YouTube, Instagram and TikTok to stop making it easy for children and teens to find content about suicide, guns, violence and drugs.

“Big tech platforms like TikTok have chosen their profits, their stockholders and their companies over children’s health, safety and even their lives, time and again,” Knox said in response to a report published earlier this year.

TikTok has defended its site and its policies, which restrict users under the age of 13. Its rules also prohibit videos that encourage harmful behavior; users who search for content about topics including eating disorders are automatically shown a prompt offering mental health resources.
