
    Nonconsensual sexual images posted online made worse by deepfakes and AI technology

    By Stephanie Sy, Veronica Vela, and Andrew Corkery



    Deepfake and AI technology’s ability to manipulate photos and even videos has made the problem of sexual images being posted online without consent even worse. Google recently announced new steps to combat sexually explicit deepfakes in its search results, but there is no one-size-fits-all solution. WIRED senior writer Paresh Dave joins Stephanie Sy to discuss this complex problem.

    Read the Full Transcript

    Stephanie Sy: Sexual images posted online without one’s consent have long been a problem, starting with revenge porn, but deepfake and AI technology’s ability to manipulate photos and even videos has made things worse. Even Taylor Swift has been a victim of AI-generated sexual images.

    Now, Google has announced new steps to combat sexually explicit deepfakes in its search results, but there is no one-size-fits-all solution for this complex problem.

    Paresh Dave is a senior writer at WIRED who’s been reporting on all of this and joins us now. Paresh, thank you so much for joining us. How quickly have we seen the rise of nonconsensual sexual images online, and what is driving this?

    Paresh Dave, Senior Writer, WIRED: I mean, it’s a huge problem. Reports of this have doubled over the last couple of years in places like the U.K. and the U.S., and it’s in part driven by those deepfakes you were talking about.

    Stephanie Sy: It is also driven in part by the porn industry, right? And this is all part of the porn economy now. People are paying to download images and videos of this nature.

    Paresh Dave: The nonconsensual nature, yes, but I would draw a distinction between porn and this nonconsensual image-based sexual abuse, which is not considered porn. You know, porn is generally viewed as more consensual.

    Stephanie Sy: According to the company Sensity, 96 percent of deepfakes are sexually explicit and feature women who didn’t consent to the creation of the content. Who is most vulnerable to being victimized in this way?

    Paresh Dave: It’s everyone. There have been recent surveys showing it’s men, it’s young boys. There’s a huge trend of sextortion that’s been happening. It’s anyone who has their photo on the internet, which at this point is pretty much all of us.

    Stephanie Sy: Google has said, Paresh, that it will reduce the prominence and relevance of deepfake images in its search results. That addresses deepfakes, but what about actual images that are used without consent? Does Google go far enough to address those victims?

    Paresh Dave: So Google already takes those measures you described, which it has just recently extended to deepfakes. But what I’ve reported recently is that the key issue here is that there are numerous ideas on the table that Google could adopt to reduce the burden on victims of image-based sexual abuse.

    These are ideas proposed by survivors, their advocates, even employees within Google, and Google has just refused to adopt them, in part because it’s worried about over-censorship, and in part because it’s worried about coming off as too much of a regulator of the internet. But there are some basic ideas here, and it is just unfathomable why Google hasn’t pursued them.

    Stephanie Sy: You use this term “image-based sexual abuse.” Can you describe some of what happens to victims who find themselves in this situation?

    Paresh Dave: Yeah. I mean, it’s people who might send an image consensually. They’ve taken a selfie and shared it with a friend, and then something goes wrong in that relationship and that friend, seeking revenge, posts it online. It could also be surreptitious recordings of people. There are a number of ways someone could end up suffering from this.

    And then the big issue is that they have to go back to the place where they were traumatized, go to the internet, find all these images, and try to get them taken down. It puts a lot of burden on the actual victims or survivors to get rid of all this material from the internet, and it sometimes takes years and years.

    Stephanie Sy: Well, that’s something all of us parents deal with when we talk to our children. If your photo shows up online, it could be there forever. How difficult is the process for victims to take this stuff down, and how willing are the technology companies to meet victims halfway and help them?

    Paresh Dave: One of the things that’s remarkable here is that a whole cottage industry of companies has popped up, charging sometimes thousands of dollars per project to help victims get their images taken down.

    So this problem is so bad that an economy has been created around it, and these companies help the victims find the images. They even use AI to automatically find the images as they pop up and immediately send takedown notices. But it is a pain. Sometimes companies will ask for identity verification, so you have to send them your ID, or they might question whether it was actually nonconsensual.

    Maybe you’ve been an OnlyFans creator, and then you have images of yourself leaked online. Google might question, well, are you still commercializing those images? It’s a whole mess that these survivors have to go through.

    Stephanie Sy: Besides tech companies, who else could be addressing this problem? Is this a law enforcement and regulatory policy problem as well?

    Paresh Dave: Absolutely. One of the things I heard consistently from employees at Google while reporting my story was: why isn’t law enforcement doing more to go after the criminals? Why haven’t we criminalized more of this behavior across the country?

    Most states have criminalized aspects of this, but there’s certainly not enough effort from law enforcement to go after the underlying criminals, in part because they may be overseas, and even when they are brought to justice, you can’t always get them jailed or get them to pay the damages they’re supposed to.

    Stephanie Sy: That is Paresh Dave, a senior writer at WIRED, joining us. Thanks so much, Paresh.

    Paresh Dave: Glad to be here.
