
    Social media firms failing to remove suicide content, research shows

By DPA

    Some of the biggest social media platforms are failing to detect and remove dangerous suicide and self-harm content, according to a new study.

Research from the Molly Rose Foundation found that, of more than 12 million content moderation decisions made by six of the biggest platforms, over 95% were made by just two sites – Pinterest and TikTok.

The charity said Meta’s Instagram and Facebook were each responsible for just 1% of all suicide and self-harm content detected by the major sites studied, while X, formerly known as Twitter, was responsible for just one in every 700 content decisions.

The study analysed publicly available records of over 12 million content moderation decisions taken by six sites: Facebook, Instagram, Pinterest, Snapchat, TikTok and X. It found the response of most platforms to suicide and self-harm content was “inconsistent, uneven and unfit for purpose”.

The charity’s chairman, Ian Russell, and his family set up the Molly Rose Foundation in memory of his daughter, Molly, who ended her life in November 2017 at the age of 14, after viewing harmful content on social media.

    “Almost seven years after Molly died, it’s shocking to see most major tech companies continue to sit on their hands and choose inaction over saving young lives,” Russell said. “No ifs, no buts, it’s clear that assertive action is required.”

In its report, the foundation said it had found that social media sites were routinely failing to detect harmful content on the highest-risk parts of their services.

    For example, it said only one in 50 suicide and self-harm posts detected by Instagram were videos, despite the short-form video feature Reels now accounting for half of all time spent on the app.

    The study also accused sites of failing to enforce their own rules, noting that while TikTok detected almost three million items of suicide and self-harm content, it suspended only two accounts.

The research was based on content moderation decisions made in the EU, which platforms are required to make publicly accessible under EU law.

    In response to the study, a Meta spokesperson said: “Content that encourages suicide and self-injury breaks our rules.

    “We don’t believe the statistics in this report reflect our efforts. In the last year alone, we removed 50.6m pieces of this kind of content on Facebook and Instagram globally, and 99% was actioned before it was reported to us.

    “However, in the EU we aren’t currently able to deploy all of our measures that run in the UK and the rest of the world.”

    A spokesperson for Snapchat said: “The safety and wellbeing of our community is a top priority. Snapchat was designed to be different to other platforms, with no open newsfeed of unvetted content, and content moderation prior to public distribution.

“We strictly prohibit content that promotes or encourages self-harm or suicide, and if we identify this, or it is reported to us, we remove it swiftly and take appropriate action.

    “We also share self-harm prevention and support resources when we become aware of a member of our community in distress, and can notify emergency services when appropriate.

    “We also continue to work closely with Ofcom on implementing the Online Safety Act, including the protections for children against these types of harm.”

    TikTok did not provide a statement but said its rules were clear that it did not allow showing, promoting or sharing plans for suicide or self-harm.

    Pinterest and X have not responded to a request for comment.
