Molly Russell’s death sparks calls for change to social media platforms

A coroner has recommended separate platforms for adults and children after the death of schoolgirl Molly Russell

Schoolgirl Molly Russell’s father has urged social media companies not to ‘drag their feet while waiting for legislation’, after a coroner recommended separate platforms for adults and children.

Coroner Andrew Walker sent a Prevention of Future Deaths (PFD) report to companies including Meta, Pinterest, Twitter and Snapchat, as well as the UK government, on Thursday, October 13, in which he called for a review of the algorithms the platforms use to deliver content.

The 14-year-old, from Harrow, north-west London, ended her life in November 2017 after viewing suicide and self-harm content online, prompting her family to campaign for better online safety.

The coroner also raised concerns about age verification when signing up to platforms, content not being vetted as age-appropriate, and the algorithms used to deliver content alongside advertising.

Other issues raised in the report included the lack of access or control for parents and guardians, and the inability to link a child’s account to that of a parent or guardian.

What happened to Molly Russell?

At the inquest held at North London Coroner’s Court last month, the coroner concluded that Molly died while suffering from the ‘adverse effects of online content’.

The inquest heard the teenager had accessed material from the ‘ghetto of the online world’ before her death.

Molly’s family have argued that sites such as Pinterest and Instagram recommend accounts or posts that “promote” suicide and self-harm.

In her testimony, Meta executive Elizabeth Lagone said she believed the content seen by Molly, which her family said “encouraged” suicide, was safe.

Pinterest’s Judson Hoffman told the inquest the site was “unsafe” when the schoolgirl used it.

What recommendations were made?

In light of the concerns raised, Mr Walker recommended in his report that the government consider reviewing the provision of internet platforms to children.

Areas highlighted for review included separate platforms for adults and children, age verification before joining a platform, the provision of age-specific content, and the use of algorithms to deliver content.

The coroner also recommended that the government examine the use of advertising and parental, guardian or carer control, including access to material viewed by a child and retention of material viewed by a child.

Mr Walker’s report said: “I recommend that consideration be given to the creation of an independent regulatory body to monitor the content of online platforms, taking particular account of the above.

“I recommend that consideration be given to enacting the necessary legislation to ensure the protection of children from the effects of harmful online content and the effective regulation of harmful online content.

“While regulation is a matter for government, I see no reason why the platforms themselves would not wish to consider self-regulation taking into account the issues raised above.”

Mr Walker said he believed action needed to be taken to prevent future deaths.

Molly’s father, Ian Russell, responded to the recommendations, saying the report echoed the family’s concerns about the “online dangers” Molly was exposed to, which were “driven by the platform’s algorithms”.

He urged social media companies to heed the coroner’s advice and “not drag their feet waiting for legislation and regulation”, but rather take a “proactive approach to self-regulation” to make their platforms safer for young people.

Mr Russell added: “They should think long and hard about whether their platforms are youth friendly.

“The government must also act urgently to put in place strict regulation of social media platforms, to ensure that children are protected from the effects of harmful online content and that platforms and their senior management face heavy penalties if they fail to curb the algorithmic amplification of destructive and extremely dangerous content, or fail to remove it promptly.

“I hope this will be implemented quickly through the Online Safety Bill which needs to be passed as soon as possible.”

In response to the PFD report, Instagram’s parent company Meta said it agrees “regulation is needed” and is “reviewing” the coroner’s report, adding: “We don’t allow content that encourages suicide or self-harm, and we find 98% of content we take action on before it’s reported to us.

“We will continue to work hard, together with experts, teenagers and parents, so that we can continue to improve.”

Pinterest also released a statement in response to the report, which read, “Pinterest is committed to continuous improvements to help ensure the platform is safe for everyone and the coroner’s report will be carefully reviewed.”

Meta, Pinterest, Twitter and Snapchat all have 56 days to respond with a timeline of actions they propose to take or explain why no action is proposed.

For those struggling, there are a variety of places that offer help and support. Anyone can contact the Samaritans for free at any time, from any phone, on 116 123, even a mobile without credit. This number will not appear on your phone bill. Or you can email [email protected] or visit the Samaritans website.

