
Tech Giants Unite to Combat Self-Harm Content Online

In a first-of-its-kind collaboration, Meta, Snap, and TikTok have launched “Thrive,” a program aimed at curbing the proliferation of graphic content related to self-harm and suicide across social media platforms.

Key Takeaways

  • Meta, Snap, and TikTok join forces to create Thrive program
  • Initiative designed to share “signals” of violating content across platforms
  • Collaboration with Mental Health Coalition to destigmatize mental health discussions

A United Front Against Harmful Content

The Thrive program represents a significant step forward in the tech industry’s efforts to address mental health concerns online. By enabling participating companies to share “signals” alerting each other to violating content, Thrive aims to create a more robust defense against the spread of harmful material.

“This collaborative approach allows us to respond more swiftly and effectively to potential threats,” said a spokesperson for Meta, which provides the technical infrastructure for Thrive.

Technological Innovation for Social Good

Thrive utilizes the same cross-platform signal-sharing technology employed in the Lantern program, which combats child abuse online. This technology allows participating companies to share hashes that match violating media, creating a more comprehensive detection system.
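The specifics of Thrive's implementation have not been published, so the sketch below is a simplified, hypothetical illustration of hash-based signal sharing in principle: one platform fingerprints flagged media and contributes that fingerprint to a shared database, and another platform checks new uploads against it. A plain cryptographic hash stands in here for the perceptual hashing and shared infrastructure a production system would use.

```python
import hashlib

# Hypothetical shared signal database: fingerprints of media already flagged
# as violating by a participating platform. Real systems would use perceptual
# hashes and secure shared infrastructure; SHA-256 is used here purely to
# illustrate the concept.
shared_signal_db: set[str] = set()

def fingerprint(media_bytes: bytes) -> str:
    """Compute a hash 'signal' for a piece of media."""
    return hashlib.sha256(media_bytes).hexdigest()

def report_violation(media_bytes: bytes) -> None:
    """Platform A flags content and contributes its signal to the shared set."""
    shared_signal_db.add(fingerprint(media_bytes))

def matches_known_violation(media_bytes: bytes) -> bool:
    """Platform B checks an upload against signals shared by its peers."""
    return fingerprint(media_bytes) in shared_signal_db

# Example: one platform reports a graphic image; another later catches a re-upload.
report_violation(b"<example flagged media bytes>")
print(matches_known_violation(b"<example flagged media bytes>"))  # True
```

In this model, no media itself is exchanged between companies, only the fingerprints, which is what allows each platform to apply its own policies and review processes when a match is found.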

Balancing Safety and Open Dialogue

Meta emphasizes its commitment to making harmful content less accessible while preserving space for important mental health discussions. “We’re striving to strike a balance between protecting our users and allowing for honest conversations about mental health,” the Meta spokesperson added.

The Scale of the Challenge

According to Meta’s internal data, the company takes action on millions of pieces of suicide and self-harm content quarterly. In the last quarter alone, an estimated 25,000 posts were restored, primarily following user appeals.

Industry Implications

This collaborative effort could set a new standard for how tech companies address sensitive content issues. As social media platforms continue to grapple with content moderation challenges, initiatives like Thrive may become increasingly crucial.

Looking Ahead

As Thrive rolls out, industry observers will be watching closely to see its impact on content moderation practices and mental health discussions online. The success of this program could pave the way for further collaborations in tackling other challenging online issues.


If you or someone you know is struggling with thoughts of self-harm or suicide, help is available. In the US, contact the Suicide & Crisis Lifeline by calling or texting 988. For international resources, visit the International Association for Suicide Prevention website.
