
In February 2019, Adam Mosseri, the head of Instagram, promised that the platform would ban all graphic content depicting self-harm.
“It won’t be in search, it won’t be in hashtags, it won’t be in recommendations,” he told a reporter for the BBC.
Beginning in the 2010s, Instagram had come under scrutiny for its perceived role in a growing teen mental health crisis. Then, in 2017, Molly Russell, a 14-year-old girl in the United Kingdom, died by suicide after being fed self-harm content on Instagram. Her death spurred an international reckoning over teenagers' use of social media.
But nearly a year after Mosseri assured the public that Instagram was taking action, executives at Meta, Instagram's parent company, appeared to admit that the algorithm was still pushing self-harm and eating disorder content.

