Social media content ‘likely’ contributed to Molly Russell’s death
Content from social media sites including Instagram and Pinterest is ‘likely’ to have contributed to the death of British teenager Molly Russell, who took her own life after viewing thousands of posts about suicide, depression and self-harm, a coroner said.
The finding marks a blow for social media platforms as lawmakers around the world grapple with how to make the internet safe for children, and will put renewed pressure on apps used by young people.
Presenting his findings nearly five years after Russell died in November 2017, aged 14, Chief Coroner Andrew Walker said she died “of an act of self-harm while suffering from depression and the negative effects of online content”.
Russell had engaged with 2,100 posts about depression, suicide or self-harm on Meta-owned Instagram in the six months prior to her death, and interacted with this content on all but 12 days during that time, unbeknownst to her family.
Social media sites were not “safe” at the time of Russell’s death, Walker said, adding that it was “likely that materials viewed by Molly, who was already suffering from a depressive illness and vulnerable due to her age, affected her mental health in a negative way, and contributed to her death in a more than minimal way”.
Walker opted not to record a finding of suicide, citing the severity of Russell’s mental ill health and the fact that online content had “normalised her condition”. The messages Russell saw described suicide as “an inevitable consequence of a condition from which one could not recover”, Walker said.
The platforms’ algorithm design meant that Russell was exposed to certain content without seeking it, Walker said. Russell had been able to “gorge” herself on harmful videos, images and clips, “some of which were selected and provided without Molly asking for them”, he said. These “binge periods” were “likely to have a negative impact on Molly”, he added.
“At the time these sites were visited by Molly, some of them weren’t safe because they provided access to adult content that shouldn’t have been available to a 14-year-old,” Walker said.
The coroner said on Thursday: “In the past, when a child walked through the front door of their home, they did so into a place of safety . . . where the dangers were minimised, if present at all.
“What we did with the internet was bring into our homes a source of risk, and we did so without appreciating the extent of that risk.”
The coroner will prepare a report in the coming weeks aimed at preventing future deaths, which will be sent to Pinterest and Instagram. He said on Thursday that he believes children and adults need separate social media sites.
The inquest heard that Russell had received a ‘10 depression pins you might like’ email from Pinterest in the weeks following her death, and that Instagram had suggested accounts related to depression and self-harm to her.
The inquest heard clashes between the Russell family’s lawyer, Oliver Sanders KC, and Meta’s head of health and wellbeing, Elizabeth Lagone, who defended the company’s view that certain content related to self-harm was “safe”.
Meta banned all graphic self-harm and suicide content in 2019 and has always removed posts that encourage or promote it.
Pinterest’s head of community operations, Jud Hoffman, acknowledged during the inquest that even with technology upgrades the site was still “imperfect”, and admitted it was not safe at the time Russell was using it.
Sanders told the coroner on Thursday that 14-year-olds like Russell were in the “eye of the storm” for potential harm from the rapid growth of social media sites.
Both Instagram and Pinterest have tightened their policies since Russell’s death in 2017.
Meta said it was “committed to ensuring Instagram is a positive experience for everyone, especially teens” and would “carefully review the coroner’s full report”.
Pinterest said: “Combating self-harming content is a priority for us as we work to ensure Pinterest plays a positive role in people’s lives. . . Molly’s story has reinforced our commitment to creating a safe and positive space for our [users].”
Anyone in the UK affected by the issues raised in this article can contact the Samaritans free of charge on 116 123.