A MAN whose daughter took her own life after viewing self-harm content on social media is backing Labour’s consultation on tackling online harms that was launched today.
Ian Russell’s daughter Molly was 14 when she killed herself in 2017.
After her death, the family found that she had viewed graphic images on Instagram, owned by Facebook, that encouraged suicide.
Last year he said that Molly had entered a “dark rabbit hole of depressive suicidal content,” which he suspected was encouraged by the network’s algorithm, which pushes images and videos on users similar to those they have viewed previously.
Following her death, the government put forward the idea of online regulation in 2017 and followed it with a white paper in 2019.
However, Lord Puttnam said in June that an online harms Bill might not come into effect until 2024 after Digital, Culture, Media and Sport minister Caroline Dinenage said she could not commit to bringing it to Parliament this year.
He said that a potential 2024 date would be “seven years from conception. In the technology world, that’s two lifetimes.”
Today, Mr Russell criticised the government for “dragging their heels.”
He added that Labour’s consultation, which is seeking submissions from organisations and party members, will “help the UK become a world leader for effective regulation required to make the internet a safer place.”
Facebook has blamed the coronavirus pandemic for hampering efforts to remove harmful posts between April and June, as fewer moderators were working.
Facebook’s latest community standards report shows that it took action on 911,000 posts related to suicide and self-injury in that quarter, compared with 1.7 million in the previous quarter.
On Instagram, action was taken on 275,000 such posts, down from 1.3 million.
Action on posts featuring child nudity and sexual exploitation also fell on Instagram, from one million posts to 479,400.