The Guardian

    Smudgy chins, weird hands, dodgy numbers: seven signs you’re watching a deepfake

    By Ana Lucía González Paz, Andrew Witherspoon, Elena Morresi, Bryony Moore and Dan Milmo

    Composite: a selection of false, AI-generated images.

    In a crucial election year for the world, with the UK, US and France among the countries going to the polls, disinformation is swirling around social media.

    There is much concern about deepfakes, or artificial intelligence-generated images or audio of leading political figures designed to mislead voters, and whether they will affect results.

    They have not been a huge feature of the UK election so far, but there has been a steady supply of examples from around the world, including in the US, where a presidential election looms.

    Here are the visual elements to look out for.

    Oddness around the mouth or chin

    In deepfake videos, the area around the mouth can be the biggest giveaway. There may be fewer wrinkles in the skin, less detail around the mouth, or a chin that looks blurry or smudged. Poor synchronisation between a person’s voice and mouth can be another sign.

    This deepfake video posted on 17 June shows a simulation of Nigel Farage destroying Rishi Sunak’s house in Minecraft. It is part of a trend of deepfake satire videos featuring politicians playing the online game.

    A couple of days later, another simulated video appeared of Keir Starmer playing Minecraft and setting a trap in “Nigel’s pub”.

    Dr Mhairi Aitken, an ethics fellow at the Alan Turing Institute, the UK’s national institute for AI, says the first giveaway for the Minecraft deepfakes is, of course, “the ridiculousness of the situation”. But another sign of AI-generated media or manipulation is imperfect sync between voice and mouth.

    “This is particularly clear in the segment with Farage talking,” says Aitken.

    Another tell, says Aitken, is whether shadows fall in the right place or whether lines and wrinkles on a face move when you would expect them to move.

    Ardi Janjeva, a research associate at the institute, adds that the low resolution throughout the video is another obvious sign that people should notice because it “immediately resembles something that is patched together”. He says people are familiar with this amateur approach because of the prevalence of “rudimentary, low-res scam email attempts”.

    This lo-fi approach then manifests itself in obvious areas such as the mouth and jawline, he says. “It shows in facial features like the mouth, which viewers tend to focus their attention on, where there is excessive blurring and smudging.”

    Strange elements of speech

    Another deepfake video, made by editing new audio over Keir Starmer’s 2023 New Year address, shows him appearing to promote an investment scheme.

    If you listen carefully, you will spot that the sentence structure is odd, with Starmer saying “pounds” before the number multiple times, for example “pounds 35,000 a month”.

    Aitken says that, again, the voice and mouth are out of sync and the lower facial area is blurred. The use of “pounds” before a number indicates that a text-to-audio tool has probably been used to recreate Starmer’s voice, she adds.

    “This is likely an indication that a tool has been used to convert written words into speech, without checking this reflects typical spoken word patterns,” she says. “There are also some clues in the intonation. This maintains a fairly monotone rhythm and pattern throughout. To check the veracity of a video it’s a good idea to compare the voice, mannerisms and expressions with real recordings of the individual to see whether they are consistent.”

    Consistency between the face and body

    This deepfake video of the Ukrainian president, Volodymyr Zelenskiy, asking civilians to lay down their arms to the Russian military was circulated in March 2022. The head is disproportionately large compared with the rest of the body, and the skin tone of the neck differs from that of the face.

    Hany Farid, a professor at the University of California, Berkeley, and a specialist in deepfake detection, says this is an “old-school deepfake”. The immobile body is a giveaway, he says. “The telltale sign in this so-called puppet-master deepfake is that the body below the neck doesn’t move.”

    Discontinuity across the video clip

    This video, circulated in May 2024, falsely shows the US state department spokesperson, Matthew Miller, justifying Ukrainian military strikes on the Russian city of Belgorod by telling a reporter “there are virtually no civilians left in Belgorod”. The video was tweeted by the Russian embassy in South Africa and then removed, according to a BBC journalist.

    The fake video shows the spokesperson’s tie and shirt changing colour partway through the clip.

    While this is a relatively conspicuous change, Farid notes that the generative AI landscape is changing rapidly and therefore so are the deepfake pointers. “We also have to always practise good information consumption habits that include a combination of good old common sense and a healthy amount of scepticism when presented with particularly outrageous or unlikely claims,” he says.

    Extra fingers, hands, limbs

    Look out for a surplus of fingers, legs, arms and odd-looking hands in still images that are AI-generated.

    A picture purportedly showing the US president, Joe Biden, and the vice-president, Kamala Harris, celebrating Donald Trump’s indictment was circulated on Twitter in April 2023.


    Signs that it could have been generated by AI include Kamala Harris’s right hand having six fingers. The top of the flag is distorted and the pattern on the floor is also awry.

    The AI team at Reality Defender, a deepfake detection firm, says prompts typed into image-generating tools tend to focus on people – often the names of well-known individuals – resulting in outputs that emphasise faces. As a result, the artifice is often revealed in other details, such as hands or physical backgrounds, as with the Biden-Harris image.

    “Usually the prompts to create such images place higher emphasis on the people featured in them – particularly the faces,” the Reality Defender team explains. “Thus, the outputs often create credible human faces with higher-frequency details, while deprioritising the physical consistency of the background (or, in many cases, other parts of bodies, like hands).”

    However, Reality Defender, which uses deepfake detection tools, says the increasing complexity of generative AI programs means that manual scrutiny of deepfakes is becoming “decidedly less reliable”.

    Mangled letters and numbers

    AI image generators have difficulty reproducing numbers and text. This fake Trump mugshot, published in April 2023, was made with such a tool. In the background, instead of a height chart, there is a combination of nonsensical numbers and letters.

    “The numbers and text in the background are a giveaway,” says Aitken. “AI image generators really struggle with producing text or numbers. They don’t have an understanding of the meaning of the symbols they produce so they typically produce garbled or illegible text and numbers. If there are any text or numbers in an image, zooming into these can be a really good way of identifying if it might be AI-generated.”


    Jerky video edits

    Some manipulated images are put together so amateurishly they are easy to spot. Known as “cheapfakes”, these often use simple video-editing software and other lo-fi techniques.

    Just before the Mexican elections, a video of the then presidential candidate Claudia Sheinbaum was edited to show her saying she would close churches if elected. The clip was patched together in a deliberately misleading manner from a video where she was in fact stating: “They are saying, imagine the lie, that we’re going to close down churches.” An alternative background showing satanist symbols was also added in an attempt to make the clip even more damaging.
