The father of 14-year-old Molly Russell, who killed herself after viewing suicide images online, says tech giants must take urgent steps to protect young people.

Speaking the day before the 18-month anniversary of his daughter’s death, Ian Russell said the platforms are still hosting harmful content.

Mr Russell has always said the posts Molly had been looking at on Instagram before she died contributed to her death.

Now he has urged: “For the safety of young people, the platforms need to do something quickly to make the internet safer.”

Molly Russell, who took her own life in November 2017

Last month Health Secretary Matt Hancock met Facebook, Snapchat, Google and Instagram, which agreed to fund the Samaritans to help identify dangerous content and create a best-practice guide to tackling it.

Mr Russell, from Harrow, north London, who runs the Molly Rose Foundation in memory of his daughter, believes this work is not moving fast enough.

He warned: “It’s still all too easy to find such dangerous content.”

He added: “In the hours between us saying ‘sleep well’ and the terrible dawning of next day, Molly’s only other influence must’ve come from beyond our house, beyond our love and protection - from the internet.”

Molly's dad Ian says Instagram is partly responsible for his daughter's death

The Government will appoint an independent regulator to hand out fines and hold tech bosses personally liable for harmful content.

But legislation might take two years.

Speaking alongside Mr Russell, Andy Burrows, associate head of child safety online with the NSPCC, said: “Until we have legislation passed the Government should monitor whether platforms are playing ball with this interim code of practice [and] name and shame those that drag their heels.”

Tara Hopkins, from Instagram, said: “Our policies have never allowed content that encourages or promotes suicide, self-harm or eating disorders.

“We will remove it as soon as we are made aware of it.

“Many use Instagram to get support or support others, so we do allow content that discusses these topics.

“Our policies no longer allow graphic self-harm content.

"It will take time while we build technologies to find and remove it.”
