In a London court this week, coroner Andrew Walker had the difficult task of assessing a question that child safety advocates have been asking for years: How responsible is social media for the content algorithms feed to minors? The case before Walker involved a 14-year-old named Molly Russell, who took her life in 2017 after she viewed thousands of posts on platforms like Instagram and Pinterest promoting self-harm. At one point during the inquest, Walker described the content that Russell liked or saved in the days before her death as so disturbing that he found it “almost impossible to watch.”
Today, Walker concluded that Russell’s death couldn’t be ruled a suicide, Bloomberg reports. Instead, he described her cause of death as “an act of self-harm whilst suffering from depression and the negative effects of online content.”
Bloomberg reported that Walker based this decision on Russell’s “prolific” use of Instagram—liking, sharing, or saving 16,300 posts in the six months before her death—and Pinterest—5,793 pins over the same period—combined with how the platforms tailored content in ways that contributed to her depressive state.