Self-harm clips hidden in kids' cartoons

"We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video", YouTube said.

Dr Free Hess, a paediatrician and mother, had learnt about the chilling videos over the summer when another mum spotted one on YouTube Kids.

She campaigned to have the clip removed but found this month that it had resurfaced on the site, apparently hidden in the middle of a cartoon. Four minutes and forty-five seconds into the video, the cartoon cut away to a clip of a man walking onto the screen and simulating cutting his wrist. Of another video, she said: "An injured and abused female character disguised as a pet wolf is bought for $400 from a masked man with a simple phone call".

"I think it's extremely risky for our children", Dr Hess said about the clips on Sunday in a phone interview with The Washington Post.

Hess also created compilation videos and uploaded them to YouTube.

Parents have since discovered that several other cartoons contain information about how to commit suicide, including the same spliced-in video clip.

YouTube has long struggled with how to keep the platform free from such material - removing hateful and violent videos, banning risky pranks and cracking down on child sexual exploitation.

Following complaints more than a year ago, YouTube said it had added more moderators and stepped up enforcement of its guidelines for videos aimed at children.

Another version of the video was reposted on February 12, receiving more than 1,000 views before it, too, was removed from the site.

"We have strict policies that prohibit videos which promote self-harm", said YouTube spokesperson Andrea Faville, explaining that the site relies on user flagging and smart detection technology to find inappropriate content. Kaslow, who teaches at Emory University's school of medicine, said that some children may ignore the grim video content but that others, particularly the more vulnerable, may be drawn to it. Children may be too young to understand the consequences, and parents may not realise what their kids have seen until it is too late.

Children's charity the NSPCC has accused YouTube of failing to tackle risky content on its youth channel. Still, Hess said, YouTube's efforts are not enough; she has urged parents to watch what their children view online, so they can report inappropriate content to social media platforms and have it pulled.

The mom wrote, "What did I just see?"

When asked about the videos in an email exchange with CBC News, the video site responded that YouTube is not intended for kids under 13. "There needs to be messaging: this is why it's not okay".

She added that there should be "serious consequences" for those who had a hand in the videos, noting that it was "very worrisome" that they were targeting children.

If you or anyone you know is experiencing suicidal thoughts or depression, call the National Suicide Prevention Lifeline at 800-273-8255.