Meta launches Thrive program to share signals about violating suicide content
Meta logo | Image credit: Meta
Meta is taking another big step toward removing dangerous content from its platforms. The social network giant announced this week that it has teamed up with the Mental Health Coalition to launch Thrive, a program that lets participating tech companies share signals about violating suicide or self-harm content, so that other companies can investigate and take action if the same or similar content is being shared on their platforms.
Meta explains that “participating companies will start by sharing hashes – numerical codes that correspond to violating content – of images and videos showing graphic suicide and self-harm, and of content depicting or encouraging viral suicide or self-harm challenges.”
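In practice, hash-sharing means platforms exchange only fingerprints of files, never the media itself or any user data. The sketch below is a minimal illustration, assuming a plain SHA-256 digest for simplicity; real signal-sharing programs typically use perceptual hashes (Meta has open-sourced PDQ for images) so that visually similar copies also match, and the shared_signals set and check_upload helper here are hypothetical names used purely for illustration.

```python
import hashlib

def media_hash(path: str) -> str:
    """Return a hex digest standing in for a media file.

    Illustrative only: real hash-sharing systems usually rely on
    perceptual hashes, which also catch near-duplicates; a SHA-256
    digest keeps this sketch self-contained.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical database of hashes flagged by other participating platforms.
# Note it contains numerical codes only -- no account or user information.
shared_signals = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def check_upload(path: str) -> bool:
    """True if an uploaded file matches a shared violating-content hash."""
    return media_hash(path) in shared_signals
```

The key property, reflected in the code, is that a hash can confirm a match against known violating content but cannot be reversed to reveal the content or identify who posted it.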
According to Meta, these signals represent content only and will not include identifiable information about any accounts or individuals. The company also promises to support people sharing or searching for content related to suicide or self-harm by connecting them to local organizations, including the Suicide and Crisis Lifeline and Crisis Text Line in the US.
Meta revealed that between April and June it took action on over 12 million pieces of suicide and self-harm content on Facebook and Instagram alone. That is a staggering volume of harmful material, and the industry-wide total is likely even higher once the other participating companies, Snap and TikTok, are factored in.