
    Riots in the UK spurred by racist posts on X. Experts warn of a repeat in the US

    By Will Carless and Jessica Guynn, USA TODAY


    On July 29, in a coastal town in northwest England, a 17-year-old burst into a Taylor Swift-themed dance party and began stabbing children, killing three and injuring at least eight others.

    The 17-year-old’s name was not initially released, but people across the UK soon learned that he was a newly arrived Muslim immigrant. Within days, unrest had boiled over across the country, with attacks on immigrants, immigration facilities and mosques.

    The story, of course, wasn’t true. The suspect wasn’t an immigrant; he had been born in Wales.

    But the false narratives and anti-immigrant sentiment that sparked the political violence had been rocket-fueled, just as other recent unrest in Europe had been, via social media platforms that experts warn are too friendly both to disinformation and to influence campaigns of all sorts, especially from the far right.

    Now, with a presidential election less than three months away, the United States could face the same spread of lies and manipulative posts, experts told USA TODAY. In a race already saturated by a steady drip of false claims about immigrants and election fraud, the country is primed for a comparable flashpoint.

    “There's no reason why we couldn't see exactly the same pattern here in the U.S.,” said Heidi Beirich, co-founder of the Global Project Against Hate and Extremism. “As we get closer to the election here in the United States, and tempers flare, any incident like the incident that happened in England could spark the same fury here.”

    Demonstrators clash with police officers during an anti-immigration protest in Rotherham, Britain, Aug. 4, 2024. Hollie Adams, REUTERS

    A crucial player in the swirl of misinformation, experts say, is X, the platform long known as Twitter, and its owner Elon Musk, who commands a personal following of almost 200 million users.

    X did not respond to an email seeking comment. Musk did not respond to a request for comment made on X.

    But Musk has shown public support for the racist backlash abroad (at one point tweeting that “Civil war is inevitable” in the U.K.), and for election disinformation at home. And efforts to curb misinformation and shut down hate speech have been largely abandoned on X.

    Beyond Musk’s platform, similar efforts across all of social media have also largely fallen apart. Increasingly Facebook owner Meta and other platforms have cut staff tasked with policing potentially harmful content and loosened strict content moderation policies. And some platforms, notably Telegram, have become hotbeds of rumor, conspiracy and discussions of political violence.

    “What we have seen in the UK is really much bigger than X or Musk,” said Katherine Keneally, head of threat analysis and prevention at the Institute for Strategic Dialogue. “This has been percolating for years.”

    The effects come not just from the social platforms, Keneally said, but from their most-followed users. Musk’s influence mirrors that of various far-right influencers who hold sway on subjects as varied as anti-abortion politics and white supremacy.

    And the events in Europe, Keneally said, reveal how quickly that influence can turn to violent action.

    Floral tributes lie near the scene where a teenage suspect was arrested after children were stabbed in Southport, Britain, Aug. 5, 2024. Manon Cruz, REUTERS

    UK political violence driven by lies

    In the hours after the horrific knife attack in the coastal UK town of Southport, social media sites were rife with conspiracy theories about the attacker.

    Tensions were already high in the country after an incident a few days earlier at Manchester Airport, where two men were involved in a violent altercation with police. One of the men was kicked in the head by a police officer, leading to the suspension of the officer and sparking demonstrations in Manchester and nearby Rochdale against police violence.

    Following these events, far-right, anti-Islamic influencers inside and outside the UK posted repeatedly about Muslim immigrants and asylum seekers, making claims that were often spurious.

    “The initial wave was based on the prejudiced presumption that if something awful has happened, it will be a Muslim that's done it,” said Joe Mulhall, director of research for the London-based activist group HOPE Not Hate.

    One of the most prominent voices pushing anti-Muslim sentiment was Tommy Robinson (real name Steven Yaxley-Lennon), a British commentator who posts Islamophobic material, and who fled the country last month on the eve of a hearing in a contempt-of-court case.

    Even as the Southport attacker’s real identity was revealed, Robinson doubled down with more disinformation as political violence spread across the U.K.

    In a weeklong string of posts that eventually accumulated more than 430 million total views, according to an analysis from the Center for Countering Digital Hate, Robinson falsely linked the Southport stabbing to Islam, claimed investigators were lying about the identity of the attacker, and spread an inaccurate story that far-right protesters had been stabbed by Muslims.

    As Robinson and others propagated these lies, white supremacists and others took to the streets in several cities in the UK, surrounding immigration centers and mosques, burning and overturning cars and clashing with police.

    “These seemingly organic riots, in fact, have a lot of inorganic elements to them,” said Joel Finkelstein, co-founder of the Network Contagion Research Institute, which studies the spread of online disinformation.

    The resulting clashes were just the latest example of real-life violence stemming from online disinformation campaigns, Finkelstein said. A similar pattern has played out in other cities in the U.K. and in Ireland and Portugal in recent months, he said.

    Bad actors — not just far-right influencers but also foreign governments and even bots — have been able to take advantage of an online ecosystem that is completely unprepared, and in some cases unwilling, to tackle misinformation campaigns, Finkelstein said.

    “The disease has evolved much faster than the immune system has,” Finkelstein said. “I think we're approaching a place of maximum information entropy, where these things are going to be harder and harder to see, especially as the situation becomes more unpredictable.”

    But while public officials in the UK condemned the riots and the disinformation that fed them, one notable influencer stepped up to spread the falsehoods: X owner Elon Musk.

    Elon Musk arrives before the start of the Senate’s inaugural Artificial Intelligence Insight Forum, hosted by Senate Majority Leader Chuck Schumer and Sens. Rounds, Heinrich and Young. Jack Gruber / USA TODAY NETWORK

    Musk’s role in spreading UK disinformation

    Throughout last week’s violence, Musk helped amplify the anti-Muslim rhetoric and false claims spreading across his platform and others.

    According to an analysis by the British news outlet Tortoise, Musk responded to a Robinson tweet criticizing British Prime Minister Keir Starmer’s handling of the riots, questioned Robinson’s recent arrest and allowed Robinson’s documentary — which includes false claims about Syrian refugees and has been banned from being shown in the UK — to rack up 33 million views on X.

    “People on this side of the Atlantic have been open-mouthed, watching him interact with extremely well known, very extreme far-right figures in the last two weeks,” Mulhall said. “At a time when most of the country was reeling from this horrifying wave of far-right violence, Elon Musk decided to engage with people from extraordinarily extreme and fringe groups.”

    Divisive and misleading content has flooded X since Musk – a self-described free speech absolutist – bought Twitter in 2022, pledging to remake it as an anything-goes platform.

    Musk gutted the policy and trust and safety staff who were tasked with keeping misinformation and other harmful content off the platform, and he overhauled the blue-check verification system, giving anyone willing to pay for a subscription an algorithmic boost.

    He also welcomed back white supremacists, conspiracists and election deniers who had been deplatformed. One beneficiary was Robinson, who was banned from Twitter in 2018. Since being reinstated under Musk, Robinson’s audience has more than doubled.

    Musk preaches a philosophy of “freedom of speech, not freedom of reach,” meaning X may leave up misleading or abusive content but will limit the number of people who see it.

    But nothing seems to curtail Musk’s own reach.

    In recent weeks, he has fed conspiracy theories, such as claiming that Biden’s withdrawal from the presidential race was a coup, or insinuating that the Secret Service intentionally failed to protect former President Trump from an assassination attempt.

    Musk posted a deepfake video of Vice President and Democratic presidential candidate Kamala Harris that makes it appear as if she is calling herself the “ultimate DEI hire.”

    Some accounts supporting Harris say they have faced unexplained restrictions in recent weeks.

    Just this week, Musk opened up the platform to Trump for a friendly conversation he billed as an interview, in which the former president talked for two hours unchecked, repeating disproven claims on the environment, the economy and more.

    When designing guardrails to protect users, company leaders never anticipated that a future owner would be the one to flout the rules, four former Twitter employees told USA TODAY. They spoke on the condition of anonymity out of fear of retaliation.

    “We had good policies. We had good enforcement. But we had very few safeguards in place to protect against our own leaders,” one former employee said.

    “So much of what we were working on at that time was to create guidelines or rules so that no one would be able to put their thumb on the scale,” another former employee said. “The breathtaking part of this is: What do you do when the most important user has constantly got his finger on the scale?”

    A counter-protest against racism gathers outside the headquarters of political party Reform UK in London, Aug. 10, 2024. Belinda Jiao, REUTERS

    A pervasive challenge

    Across social media, the challenge remains difficult to solve without a shared, industry-wide effort to remove disinformation, experts say. Instead, the momentum has been in the opposite direction.

    “Over the last four to five years, the pendulum has swung back more to the center and companies are more hands-off about moderating content,” said Nu Wexler, who was Twitter’s first communications staffer in Washington, D.C., and its head of global policy communications.

    None more than X. Musk embraces content that other platforms “have tried very hard to avoid,” said Wexler, now a partner with Four Corners Public Affairs.

    “I don’t think anyone really ever considered the possibility that the owner of a platform would participate and amplify some of this stuff,” he said.

    That combination troubles Emerson Brooking, a director of strategy and resident senior fellow at the Atlantic Council's Digital Forensic Research Lab.

    Musk "is sole owner and operator of the most politically consequential social media platform in the United States,” Brooking said. “With the troubles in Britain, Musk has seemed almost eager to use his platform to elevate the forces of racism, hate, and violence.”

    While U.S. law offers little in the way of limits on disinformation, trouble may be brewing for X in Europe, fueled by the recent clashes with British officials furious over his tweets.

    Starmer's office said there was “no justification” for Musk’s comments.

    New UK Prime Minister Keir Starmer celebrates his historic election win. Agency Pool / UK Pool / Reuters

    X is facing an investigation into charges it breached the EU’s Digital Services Act – or DSA – which requires social media platforms to prevent the spread of harmful content.

    If the company is found in violation, the European Commission has the power to levy fines of as much as 6% of X’s global annual revenue.

    “In our view, @X doesn’t comply with the DSA in key transparency areas,” Margrethe Vestager, European Commissioner for Competition, wrote in July.

    Musk replied:  "The DSA IS misinformation!"

    This article originally appeared on USA TODAY: Riots in the UK spurred by racist posts on X. Experts warn of a repeat in the US
