    Meta, Snap, TikTok to remove self-harm content

    By Mark Moran


    Sept. 12 (UPI) -- Three of the biggest social media platforms are teaming up to address online content that features suicide and self-harm, Meta announced Thursday.

    A Meta workspace in San Francisco in 2023. The company is part of a social media coalition that says it will monitor and remove content featuring self-harm. Photo by Terry Schmitt/UPI

    Meta, the owner of Facebook, Instagram and WhatsApp, has teamed up with Snap and TikTok to form Thrive, an initiative designed to destigmatize mental health issues and work to slow the viral spread of online content featuring suicide or self-harm, Meta said in a blog post.

    "Suicide and self-harm are complex mental health issues that can have devastating consequences," Meta said in its release.

    "We're prioritizing this content because of its propensity to spread across different platforms quickly," Antigone Davis, Meta's global head of safety, wrote in the post. "These initial signals represent content only, and will not include identifiable information about any accounts or individuals."

    The initiative was formed in conjunction with The Mental Health Coalition, a group of mental health organizations working to destigmatize these issues.

    Meta, Snap and TikTok will share tips, or "signals," with each other, allowing them to compare notes, investigate and take action if similar content appears on another app. Thrive will serve as a database that all participating social media companies can access.

    Meta is using technology developed by Lantern, a company designed to make technology safe for minors. Amazon, Apple, Google, Discord, OpenAI and others are part of the coalition. Meta made clear in its release that it is targeting content, not users.

    "We're prioritizing this content because of its propensity to spread across different platforms quickly. These initial signals represent content only, and will not include identifiable information about any accounts or individuals," Davis wrote in the blog post.

    The social media companies will be responsible for reviewing and taking any necessary action through Thrive, and for writing a yearly report to measure the program's impact.

    Meta said that when content featuring self-harm or suicide is identified, it will be assigned a number, or "hash," which the other social media companies can then cross-check to find the same content on their own platforms and remove it.
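
    In broad terms, this is the hash-sharing approach used elsewhere in content moderation: one platform computes a fingerprint of violating media and shares only that fingerprint, and other platforms compare their own content against it. The sketch below is a hypothetical illustration of that idea, not Thrive's actual implementation; it assumes an exact SHA-256 hash of the media bytes and an in-memory shared store, and the names ThriveDatabase, submit_signal and matches are invented for this example.

    import hashlib

    # Hypothetical sketch of hash-based signal sharing (not Thrive's real code).
    # Only content hashes are stored -- no account or user information.
    class ThriveDatabase:
        def __init__(self):
            self._hashes = set()

        def submit_signal(self, content: bytes) -> str:
            # A platform that has identified violating content submits its hash.
            digest = hashlib.sha256(content).hexdigest()
            self._hashes.add(digest)
            return digest

        def matches(self, content: bytes) -> bool:
            # Another platform checks whether an upload matches a shared hash.
            return hashlib.sha256(content).hexdigest() in self._hashes

    shared_db = ThriveDatabase()
    flagged = b"<media bytes flagged on platform A>"
    shared_db.submit_signal(flagged)

    upload = b"<media bytes flagged on platform A>"  # identical copy uploaded on platform B
    if shared_db.matches(upload):
        print("Known self-harm content -- queue for review and removal")

    In practice, cross-platform systems of this kind typically rely on perceptual hashes rather than exact cryptographic hashes, so that resized or re-encoded copies still match; the announcement does not describe which hashing scheme Thrive uses.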

    Increased social media use by minors has caused a spike in depression and suicidal behavior, the Mental Health Coalition said. Research also suggests that young people who self-harm are more active on social media.

    Earlier this year, Meta announced it would begin removing and limiting sensitive content deemed to be "age-inappropriate" from teenagers' feeds on its apps. The company said it had plans to hide search results and terms relating to suicide, self-harm and eating disorders for all users.

    Meta, TikTok, Snapchat and other social media platforms have long been criticized for failing to remove content deemed harmful to teens, including videos and images of self-harm.
