The Molly Rose Foundation (MRF) was established by Molly Russell's family and close friends in her memory after she died at the age of 14. Her death, linked to depression and the harmful effects of social media content, inspired the foundation's mission to prevent youth suicide. Through in-depth research and policy advocacy, MRF works to expose the harmful influence of social media and push for measures that safeguard young users.
The Challenge
To evaluate how prevalent and accessible harmful content is on social media platforms, MRF focused its research on Instagram, TikTok, and Pinterest. These platforms were scrutinized for promoting or enabling exposure to self-harm, suicidal themes, and content perpetuating hopelessness. With support from The Bright Initiative and Bright Data's Dataset Marketplace, MRF collected and analyzed 1,181 posts tied to hashtags associated with suicide and self-harm. The posts were then systematically categorized using both quantitative and qualitative measures to assess their nature, reach, and impact.
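As a rough illustration of what hashtag-based categorization of collected posts might look like, the sketch below buckets post records by theme keywords. The field names, keyword lists, and categories are hypothetical and do not reflect MRF's actual coding framework or the schema of Bright Data's Dataset Marketplace.

```python
# Illustrative sketch only: field names, keyword lists, and categories are
# hypothetical and do not represent MRF's methodology or Bright Data's schema.
from collections import Counter

# Hypothetical keyword groups used to bucket posts by hashtag theme.
THEME_KEYWORDS = {
    "self_harm": {"selfharm"},
    "suicide": {"suicide", "suicidal"},
    "hopelessness": {"hopeless"},
}

def categorise_post(hashtags):
    """Return the set of themes whose keywords appear in a post's hashtags."""
    tags = {tag.lower().lstrip("#") for tag in hashtags}
    return {theme for theme, words in THEME_KEYWORDS.items() if tags & words}

def summarise(posts):
    """Count how many posts fall into each theme (a post may match several)."""
    counts = Counter()
    for post in posts:
        for theme in categorise_post(post.get("hashtags", [])):
            counts[theme] += 1
    return counts

# Example with made-up records shaped like a generic social-media dataset.
sample = [
    {"platform": "instagram", "hashtags": ["#selfharm", "#sad"], "likes": 1200},
    {"platform": "tiktok", "hashtags": ["#hopeless"], "likes": 5400},
]
print(summarise(sample))  # Counter({'self_harm': 1, 'hopelessness': 1})
```

In practice this kind of keyword bucketing would only be a first, quantitative pass; the study also relied on qualitative review of the posts themselves.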
The Findings
The study revealed a troubling pattern: social media platforms often recommend and amplify harmful content through algorithms designed to maximize engagement. Among the findings:
- Instagram: 48% of the most interacted-with posts promoted or glorified suicide or self-harm.
- TikTok: Similarly, 49% of highly engaged posts contained themes of depression, self-harm, or suicidal ideation.
- Pinterest: Its “more to explore” recommendation feature actively suggested content related to suicide and self-harm.
These alarming results underscore a systemic issue where platform algorithms prioritize user engagement over safety, exposing vulnerable young users to harmful material.
The Impact and Call to Action
The research has catalyzed discussions on the urgent need for stricter regulations and improved digital safety measures. MRF and Bright Data are actively collaborating to advocate for policy changes that hold social media platforms accountable for user well-being. The findings have strengthened the case for transparency in algorithmic operations and the implementation of safeguards to protect users, particularly young individuals, from harmful content.
Key Statistics:
- 1,181 Posts: Collected via hashtags related to self-harm and suicide.
- 48% Harmful Content: Nearly half of Instagram and TikTok’s analyzed posts glorified or promoted self-harm or suicidal themes.
- 12% Viral Reach: 12% of the analyzed posts had over 1 million likes, amplifying their influence on young audiences.
Read More:
- Full Report: Preventable yet pervasive: The prevalence and characteristics of harmful content, including suicide and self-harm material, on Instagram, TikTok, and Pinterest.
- The Guardian Coverage: Insights into the report and its implications.
Conclusion
By shedding light on the harmful impact of social media, MRF and Bright Data aim to create a safer digital space for all users, ensuring platforms prioritize mental health and safety over engagement metrics.
Learn more about The Bright Initiative's activities here. Interested in exploring Bright Data's products? Sign up now and start your free trial.