Molly saved 16,300 images on her Instagram account in the last six months of her life, 2,100 of which related to self-harm and suicide. Photograph: PA

Meta executive apologises over inappropriate content seen by Molly Russell


Inquest hears that some of the content viewed by the 14-year-old on Instagram in the months before her death violated guidelines in place at the time

A senior executive at Instagram’s owner has apologised after admitting that the platform had shown Molly Russell content that violated its policies before she died.

Elizabeth Lagone, head of health and wellbeing policy at Meta, acknowledged that some of the posts and videos had broken Instagram guidelines at the time, which prohibited the glorification, encouragement and promotion of suicide and self-harm.

“We are sorry that Molly saw content that violated our policies, and we don’t want that on the platform,” she said.

Molly, 14, from Harrow, north-west London, killed herself in November 2017 after viewing extensive amounts of content online related to suicide, self-harm, depression and anxiety.

North London coroner’s court also heard that a note started on Molly’s phone, and discovered after her death, used language that appeared in a video clip she had viewed.

Before the apology, the KC representing Molly’s family, Oliver Sanders, had berated the Meta executive over the company letting teenagers view content related to suicide, depression and self-harm.

“I suggest to you it is an inherently unsafe environment and it is dangerous and toxic to have 13- and 14-year-olds alone in their bedrooms scrolling through this rubbish on their phones,” said Sanders.

Lagone replied: “I respectfully disagree.”

Raising his voice, Sanders said: “Why on earth are you doing this?” He said Instagram was choosing to put content “in the bedrooms of depressed children”, adding: “You have no right to. You are not their parent. You are just a business in America.”

Meta’s legal representative, Caoilfhionn Gallagher KC, then interjected, asking the senior coroner, Andrew Walker, to remind Sanders of guidelines on how to question witnesses at an inquest. Turning to Sanders, Walker said: “You have put your point.”

Last week a senior executive at Pinterest, another platform that Molly interacted with heavily before her death, apologised for the content that the platform had shown her.

However, earlier on Monday Lagone told the court that the majority of a batch of posts seen by Molly before she died was “safe” for children to see.

Lagone was taken through Instagram posts that were saved, liked and shared by Molly in the final six months of her life. The first batch shown to Lagone included content that Molly’s family believe encouraged suicide and self-harm, which would have been against Instagram guidelines at the time.

Lagone said they were “by and large” permissible under the platform’s guidelines because they represented an attempt to raise awareness of a user’s mental state and share their feelings. However, she conceded that at least two of the posts shown would have violated Instagram’s policies.

Lagone was then taken through another batch of content comprising a series of graphic video montages viewed by Molly before she died, some of which Lagone said would have violated content policies in 2017.

The court heard that in the last six months of her life, Molly saved 16,300 images on her Instagram account, 2,100 of which related to depression, self-harm and suicide.

In another exchange with Lagone, Walker said Instagram created “risk and danger” for users. “You create the risk, you create the danger, and then you take steps to lessen the risk and danger,” he said. Lagone replied: “We take our responsibilities seriously to have the right policies and procedures in place.”

The inquest continues.

In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email pat@papyrus-uk.org, and in the UK and Ireland Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is at 800-273-8255 or chat for support. You can also text HOME to 741741 to connect with a crisis text line counsellor. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org
