Instagram content viewed by teenager Molly Russell before she took her own life was safe, the social media site’s head of health and wellbeing has told a court.
Elizabeth Lagone, a Meta executive, was taken through a number of posts the schoolgirl engaged with on the platform in the last six months of her life.
Meta is the parent company of Facebook, Instagram and WhatsApp.
Ms Lagone told the inquest at North London Coroner’s Court she thought it was “safe for people to be able to express themselves” – but conceded two of the posts shown to the court would have violated Instagram’s policies.
Molly, from Harrow in northwest London, was 14 when she died in November 2017, prompting her family to campaign for better internet safety.
The inquest was told that of the 16,300 posts Molly saved, shared or liked on Instagram in the six months before her death, 2,100 related to depression, self-harm or suicide.
The Russell family’s lawyer, Oliver Sanders KC, spent around an hour taking Ms Lagone through Instagram posts liked or saved by Molly and asked if she believed each post “promoted or encouraged” suicide or self-harm.
She said the content was “nuanced and complicated”, adding it was “important to give people that voice” if they were expressing suicidal thoughts.
‘It is safe for people to express themselves’
Addressing Ms Lagone as she sat in the witness box, Mr Sanders asked: “Do you agree with us that this type of material is not safe for children?”
Ms Lagone said policies were in place for all users and described the posts viewed by the court as a “cry for help”.
“Do you think this type of material is safe for children?” Mr Sanders continued.
Ms Lagone said: “I think it is safe for people to be able to express themselves.”
After Mr Sanders asked the same question again, Ms Lagone said: “Respectfully, I don’t find it a binary question.”
Coroner Andrew Walker interjected and asked: “So you are saying yes, it is safe or no, it isn’t safe?”
“Yes, it is safe,” Ms Lagone replied.
The coroner continued: “Surely it is important to know the effect of the material that children are viewing.”
Ms Lagone said: “Our understanding is that there is no clear research into that. We do know from research that people have reported a mixed experience.”
‘Who has given you the permission?’
Questioning why Instagram felt it could choose which material was safe for children to view, the coroner then asked: “So why are you given the entitlement to assist children in this way?
“Who has given you the permission to do this? You run a business.
“There are a great many people who are … trained medical professionals. What gives you the right to make the decisions about the material to put before children?”
Ms Lagone responded: “That’s why we work closely with experts. These aren’t decisions we make in a vacuum.”
Last week, Pinterest’s head of community operations, Judson Hoffman, apologised after admitting the platform was “not safe” when Molly used it, and said he “deeply regrets” the posts she viewed before her death.
The inquest, due to last up to two weeks, continues.
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email [email protected]. Alternatively, letters can be mailed to: Freepost SAMARITANS LETTERS.