The coroner in the Molly Russell inquest has told social media companies to consider overhauling algorithms to stop them bombarding children with harmful content.
Senior Coroner Andrew Walker is expected to conclude on Friday whether suicide and self-harm material on platforms including Instagram and Pinterest contributed to the death of 14-year-old Molly in November 2017.
After hearing closing submissions at North London Coroner’s Court on Thursday, he set out safety concerns to lawyers from the two social media companies. He acknowledged that he could not make any formal recommendations but instead urged them not to let the chance to make children safer online “slip away”.
Mr Walker’s concerns included the content that was available and recommended to children by the algorithms that underpin the social media sites.
Instagram uses a so-called “content ranking” algorithm, which means that if a user interacts with an image or video, they are shown more and more similar content. For Molly, this resulted in her “bingeing” on an escalating volume of graphic material, including suicide scenes and self-harm injuries, before she took her own life.
The inquest heard that after her death, her family found Pinterest had continued to send her emails with galleries of harrowing content under headlines such as: “New ideas for you in Depression.”
‘A source of risk’
Other concerns raised by Mr Walker included the absence of age verification processes, children and adults sharing a platform in the first place, children seeing the same content as adults and parents being given no oversight of content being browsed by their children.
He told counsel for Pinterest and Meta, which owns Instagram: “This is an opportunity to make this part of the internet safe – do not let it slip away. We must do it.”
He said it was once the case that children entered a “place of safety” as soon as they walked through the front door, adding: “With the availability of the internet, we brought into our homes a source of risk, and we did so without appreciating the extent of that risk.
“If there is one benefit that can come from this inquest, it must be to recognise that risk and take action to make sure that the risk that we have so embraced in our homes is kept away from children completely.”
He described the evidence heard during the inquest into Molly’s death as “a rare opportunity to see the extent to which that risk has invaded all aspects of young people’s lives”.
‘Deranged, incomplete understanding’
Earlier in the hearing, Oliver Sanders KC, representing the Russell family, gave an excoriating assessment of the evidence provided to the court by Meta in his closing submissions to the coroner.
He highlighted the “utterly staggering” suggestion made by the company that “Molly’s experience of Instagram was somehow equivalent to reading Sylvia Plath poems or reading Romeo and Juliet”. Meta and Pinterest, he claimed, continued to fail to understand the risks that “they were creating”.
Mr Sanders also condemned the “bizarre suggestion” that Molly had somehow found support from the content on the platform when in reality the posts were encouraging her not to talk to others about her problems while presenting suicide as “an inevitability”.
Instagram, he said, displayed a “deranged, incomplete understanding of the risks” posed to children and the way to manage them.
The inquest will conclude on Friday.