    US disinformation researcher laments 'incredible witch hunt'

By Anuj CHOPRA with Alex PIGMAN and Bastien INZAURRALDE

    Renee DiResta, author of "Invisible Rulers" and formerly with the Stanford Internet Observatory, a non-partisan disinformation research project /AFP

    Understanding disinformation has emerged as a lightning rod in the United States ahead of the November election, with academics and think-tanks facing lawsuits by right-wing groups and subpoenas from a Republican-led congressional committee.

    The researchers are accused of colluding with the government to censor conservative speech online under the guise of fighting disinformation. They deny the claims and denounce the sweeping offensive as an intimidation campaign.

    AFP spoke with Renee DiResta, author of "Invisible Rulers: The people who turn lies into reality."

    She was formerly with the Stanford Internet Observatory (SIO), a non-partisan disinformation research project.

    Following the Republican-led investigation, her contract, along with those of many other staffers, was not renewed, leading to reports that the group was being dismantled under political pressure.

    The interview was edited for length and clarity.

    QUESTION:

    What pressure did the Stanford Observatory face?

    ANSWER:

    We received a letter and then a subpoena from Jim Jordan, who heads a (Republican-led) committee that asked us for our emails with the executive branch of the United States and with tech platforms.

    It was a very broad request ostensibly to investigate whether there had been some sort of cabal by which the government was telling us to tell tech platforms to take information or content down. That never happened.

We turned over copious amounts of material, and several colleagues who had worked on this project sat for multi-hour private interviews with the committee.

Nothing was found to bolster their theory, but it imposed extensive costs in time and legal fees. Students were targeted, doxxed and harassed.

    Ultimately, Stanford made the determination to not continue to pursue rapid response election research and many of our contracts weren't renewed for funding reasons.

    QUESTION:

    What impact has this had on election disinformation research?

    ANSWER:

There has been a chilling effect. This idea that inquiries from congressional committees are shutting down research, or making students afraid to pursue it for fear of being harassed, is remarkable.

We are one institution among many. I saw a statistic that something like 91 subpoenas had gone out from this committee. It's just an incredible witch hunt, and the cost of that is that less-resourced institutions may choose not to fight, to just comply as quickly as possible.

There has been a sense that doing work on certain topics is going to attract unwanted attention, and so you shouldn't do work on those topics. That's terrible. Academia is supposed to be about asking hard questions, doing complicated research, doing things that perhaps industry might not want to take on, or that government is not positioned to take on.

    QUESTION:

    How do you deal with personal attacks? You have been branded “CIA Renee” by trolls insinuating that you have secret ties with the US intelligence agency.

    ANSWER:

    I've dealt with idiots on the internet for a decade now. People have their opinions.

I am not troubled by the harassment of the trolls online. I'm troubled by the fact that the United States government (through the congressional committee) is facilitating it at this point with misleading investigations, misleading reports, cherry-picked sentences, leaked documents and badly framed stories that bear no relation to the truth.

    That I think is the problem that we need to be focusing on. That is an affront to free speech.

    QUESTION:

    Many tech platforms have scaled back content moderation. Are they equipped to tackle the flood of election disinformation?

    ANSWER:

    There's this belief that if you just label something or take it down, you solve the problem. You don't.

    We can debate the areas where the platforms are not doing enough, because there certainly are some. But you also can't solve a human problem with technology. People are going to share rumors.

To address disinformation, platforms have traditionally appended a label, perhaps a fact check, but it's not clear how well the labels work.

    One of the things that we've seen is that we need more proactive content participation from the institutions. We need election officials out there proactively countering rumors.
