Washington Examiner

    Elon Musk faces off with Gavin Newsom after sharing fake Kamala Harris video

By Ailin Vilches Arguello

    10 hours ago


Elon Musk, tech mogul and X owner, is facing backlash after reposting a fake video on X that uses artificial intelligence to mimic Vice President Kamala Harris’s voice.

Musk is being accused of breaching his own social media platform’s policies by posting the video, which insults Harris’s campaign and President Joe Biden.

The controversy stems from Musk’s failure to label the video as a parody, which appears to violate X’s policy against “misleading media.”

“I, Kamala Harris, am your Democrat candidate for president because Joe Biden finally exposed his senility at the debate,” the fake Harris voice says in the video.

“I was selected because I am the ultimate diversity hire. I’m both a woman and a person of color, so if you criticize anything, I say you’re both sexist and racist,” the video continues.

Although the original post from @MrReaganUSA's account labeled the video as a parody, the ad itself does not include a disclaimer that it uses AI to mimic Harris’s voice.

“We believe the American people want the real freedom, opportunity, and security Vice President Harris is offering, not the fake, manipulated lies of Elon Musk and Donald Trump,” Harris campaign spokeswoman Mia Ehrenberg told the Hill.

In addition, Gov. Gavin Newsom (D-CA) condemned the post, arguing that such videos should be against the law.

    “Manipulating a voice in an ‘ad’ like this one should be illegal,” he wrote on X. “I’ll be signing a bill in a matter of weeks to make sure it is.”

Musk fired back at Newsom with a crude joke defending the sharing of parody content online.

    “I checked with renowned world authority, Professor Suggon Deeznutz, and he said parody is legal in America,” Musk wrote in response to Newsom’s post.

    Federal regulators are increasing their efforts to address deepfake technology used to impersonate politicians.

Other recent incidents have involved AI being used to impersonate politicians, such as one in New Hampshire in which a fake Biden voice was used in a robocall to discourage voters from participating in the state’s primary earlier this year.

    Last week, the Federal Communications Commission advanced a proposal requiring advertisers to disclose the use of AI in TV and radio commercials.

    The use of mimicked voices is already prohibited in robocalls.

    “Bad actors are already using AI technology in robocalls to mislead consumers and misinform the public,” FCC Chairwoman Jessica Rosenworcel said earlier this month.


    “That’s why we want to put in place rules that empower consumers to avoid this junk and make informed decisions.”

    However, the FCC proposal would not extend to ads and videos online or on streaming services.
