Ian Russell’s mind was still reeling when he sat down at the family computer back in 2017. His 14-year-old daughter, Molly, had just died from an act of self-harm, and he was looking for answers about how this could have happened. Scrolling through Molly’s email inbox, he believed he had found them. Two weeks before her death, the British teenager had received an email from Pinterest. “Depression Pins you may like,” it read. The email included an image of a bloody razor. Instagram was also helping Molly discover new depression content: In the six months before her death, she shared, liked, or saved more than 2,000 posts related to suicide, self-harm, and depression on the site.
Last week, the senior coroner for north London, Andrew Walker, concluded that it was not right to say Molly died by suicide and said that posts on Instagram and Pinterest contributed to her death. “She died from an act of self-harm while suffering from depression and the negative effects of online content,” Walker said.
Molly is far from the only child to be exposed to disturbing content online. Almost two-thirds of British children aged 3 to 15 use social media, and one-third of internet users aged 8 to 15 have seen worrying or upsetting content online in the past 12 months, according to a 2022 report by British media regulator Ofcom. Child protection campaigners say posts showing self-harm are still available, even if they are now harder to find than in 2017.
But Molly’s case is believed to be the first time social media companies have been required to take part in legal proceedings that linked their services to the death of a child. The platforms were found to have hosted content that glamorized self-harm and promoted keeping feelings about depression secret, says Merry Varney, a solicitor at Leigh Day, the law firm representing the Russell family. Those findings “captured all the elements of why this material is so harmful,” she adds.
The inquest sought only to establish the official reason Molly died. But unofficially, the two-week hearing put Instagram and Pinterest on trial. Both companies say they’ve changed in the five years since Molly’s death. But those changes have taken them in different directions, demonstrating two distinct models for how to run a social media platform. Meta, Instagram’s parent company, says it wants to be a place where young people struggling with depression can seek support or cry out for help. Pinterest has started to say some subjects simply don’t belong on its platform.
According to Pinterest, self-harm is one of those subjects. “If a user searches for content related to suicide or self-harm, no results are served, and instead they are shown an advisory that directs them to experts who can help if they are struggling,” says Jud Hoffman, global head of community operations at Pinterest. “There are currently more than 25,000 self-harm-related search terms on the blocked list.” Varney agrees the platform has improved but says it’s not perfect. “Research that we did with Molly’s family suggested that there is much less of this content on Pinterest [now],” she says.