    They’re crimes — so why do we keep calling them ‘porn’?

    By Jasmine Mithani

    July 11, 2024


    In January, sexually explicit fake images of Taylor Swift went viral on X. One post had over 45 million views before it was removed, according to The Verge. Teens from New Jersey to California are using artificial intelligence to create and distribute fake nudes of their classmates. In Texas, a jury awarded a woman $1.2 billion in damages after her ex-boyfriend circulated intimate images of her online after their breakup.

    News outlets — and the general public — often use the word “porn” when describing these types of crimes.

    But it’s not porn. Pornography is a form of consensual adult entertainment, not abuse, and its legality has been upheld by the Supreme Court numerous times.

    Researchers, advocates and survivors have long preferred the term “image-based sexual abuse” to describe the variety of ways sexually explicit imagery can be weaponized. Often laws use more precise language like “intimate visual depiction.”

    It’s a crime with real impact, no matter what we call it. Survivors of image-based sexual abuse — overwhelmingly women and girls — can experience the same kinds of mental health issues as those who have been physically sexually assaulted.

    With the rise of generative AI, more people are at risk, as it becomes easier and easier to create fake images of anyone. It can feel like nowhere is safe.

    The language we use about this abuse matters — and it can be changed. The fights to change the terms “child pornography” and “revenge porn” show how wording affects survivors, policy and public perceptions of crime.

    Naming abuse

    Child safety advocates celebrated a win in June, when the congressional reauthorization for the National Center for Missing and Exploited Children used the term “child sexual abuse material” (CSAM) instead of “child pornography” to describe the scourge the organization seeks to eradicate. The National Center runs the CyberTipline, a national reporting service for online exploitation of children.

    The importance of language “really comes from how people perceive the crime, and how survivors feel like their victimization is being explained,” said Yiota Souras, chief legal officer at the National Center for Missing and Exploited Children. Advocates in the United States have been pushing to use “child sexual abuse material” for the last decade, and her organization has been urging Congress to make the change official for five years. The Center supports the bipartisan EARN IT Act, which would redefine child pornography as child sexual abuse material in all federal statutory codes.

    Since federal legislation can be slow-moving, the National Center has advocated for changes to state laws as well. Ten states in the past two years have passed laws changing the terminology in their legal codes.


    The tendency to label any sexually explicit image as pornographic affects how law enforcement, juries and doctors view the crimes, Souras said. It causes confusion and implies that any children pictured were willing participants, which is not possible because children cannot consent.

    “We want people to understand what this is. This isn’t sex scenes with somebody that looks young; this is rape and sexual abuse of children. We often say these are crime scene photos, and videos of the worst possible abuse that is being inflicted on a child by an adult,” Souras said.

    Using the right terminology for crimes, whether against children or adults, is important.

    “All of these issues are bringing to the forefront how we perceive these crimes, where we associate blame and where we associate victimization,” Souras said.

    The case against ‘revenge porn’

    It’s hard to pin down where the term “revenge porn” originated. Sharing intimate photographs without permission predates the internet, but technology has ramped up this behavior and worsened its consequences for victims. Popular websites popped up in the early 2000s encouraging men to share intimate photographs of their ex-girlfriends as “revenge” for breakups.

    Today, many advocates have a simple tagline: It’s not revenge, and it’s not porn.

    Sophie Maddocks said that many researchers like her hate the term and that about 10 years ago, they started coalescing around “image-based sexual abuse” instead. Maddocks is the incoming research director of the Center for Media at Risk at the University of Pennsylvania, which seeks to foster free and critical media practice and scholarship.

    “Revenge porn” blames the victim — both for the existence of the images in the first place and in somehow “provoking” the perpetrator — and trivializes what happened, Maddocks said. The word pornography has a titillating connotation too.

    “We don’t want this to seem like a sexual product that someone can enjoy,” she said.

    Maddocks said she sees this history repeating itself when people use phrases like “deep nudes” or “deepfakes” to describe sexually explicit digital forgeries.

    Deepfakes are images or videos where one person’s face or voice is mapped onto another person’s body. The term comes from a Reddit user who popularized the technique in 2017, using it to digitally paste the faces of celebrity women over videos of adult film actors. Originally a nod to the deep learning technology involved, the word obscures the exploitation at both ends; images of the actors’ bodies are also being modified and distributed without their consent. (The term has expanded to become a catch-all for any sort of realistic manipulated media.)

    Why this matters now

    Maddocks recently completed her doctoral thesis on the effects of image-based abuse on adolescent girls. She found young girls are often filled with dread about potentially being targeted, especially as advancements in generative AI render them defenseless. To them, it feels unavoidable.

    “There’s this new feeling of presumptive exposure, like you’re about to be exposed all the time,” Maddocks said. “That’s obviously a really damaging way to have to go through the world as a teenager.”

    “For every celebrity this happens to, it also happens to hundreds of regular people,” she continued.

    The illegal distribution of real or fake sexual images can have massive effects on a survivor’s socialization and mental health. In their 2014 article “Criminalizing Revenge Porn,” law professors Danielle Citron and Mary Anne Franks discuss reports that survivors completely withdraw from online life. If the images are tied to a survivor’s name, it can make job applications or background checks impossible.

    Nowadays, the nonconsensual sharing of intimate images is a crime in 49 states and the District of Columbia. The most recent state law was signed by Massachusetts Gov. Maura Healey on June 20, leaving South Carolina as the only remaining holdout.

    But state laws vary, and it’s unclear whether they can be useful to victims of sexually explicit AI abuse. Many laws focus on criminalizing distribution of intimate photos that were meant to be shared privately, although platforms generally can’t be held liable for the content they host, due to Section 230 of the Communications Decency Act. And the laws don’t necessarily mention AI-generated imagery, which is increasingly used to make deepfakes.

    That’s another reason for policymakers to embrace image-based sexual abuse as a more holistic term: It allows laws to stay flexible as technology evolves.

    For example, the federal law banning child sexual abuse material includes references to computer-generated imagery, said Souras from the National Center for Missing and Exploited Children. That has come in handy for prosecuting illegal AI-generated imagery, even though the legislators were originally referring to the potential of simpler tools like Photoshop.

    Abuse isn’t pornography

    Sex workers don’t want you to use the word “pornography” to describe these crimes, either.

    In 2015, when Norma Buster’s ex-boyfriend posted naked photos online that she had originally shared with him privately, the only term people were using to describe what happened to her was “revenge porn.” She said it was an escalation of the obsessive, intimidating behavior that had already been terrifying her for months.

    Buster reached out to attorney Carrie Goldberg, who connected her with a local prosecutor who took action against her ex for circulating those images under New Jersey law. She was eventually able to get a lifetime restraining order against him. Now, Buster works for Goldberg as chief of staff at C. A. Goldberg PLLC, a victims’ rights firm that specializes in representing people experiencing extreme harms from tech-facilitated abuse.

    As she became more involved in victims’ rights advocacy, Buster noticed how sex workers were excluded from movements against sexual violence. That exclusion became even clearer when she began engaging in online sex work herself. She also thought more about the connotations of calling image-based abuse “porn.”

    “If we are going to be inclusive, we need to be intentional about the language we use and making sure that it doesn’t accidentally stigmatize people who work in the porn industry or people who do sex work, because they also can become victims of the same thing we’re fighting against,” Buster said.

    Mislabeling abusive media as porn makes it seem like “this is something that either platforms or performers are complicit with,” said Mike Stabile, director of public policy at the Free Speech Coalition, the adult industry trade group. And that’s absolutely not the case.

    Sex workers are more likely to be harassed and have their images stolen and distributed without their consent, Stabile said.

    Many people in the industry prefer “adult content” instead of “pornography,” Stabile said, because it emphasizes that the media is made for adults, by adults.

    How to talk or write about this abuse

    How journalists write about these issues can shape how the wider world sees them. The most important thing journalists can do is center harm in their language and reporting, Maddocks said.

    She said some advocates, survivors and researchers prefer “image-based sexual abuse” because it gives specificity to the type of harm. It’s important to note that it is sexual abuse, Maddocks said, because “when I speak to practitioners who are working with survivors of this harm, they have very similar symptoms as sexual abuse survivors.”

    Image-based sexual abuse is also an umbrella term. It can cover anything from digital forgeries made with generative AI to a video of a physical sexual assault to the nonconsensual sharing of photos that were first shared privately.

    “Tech-facilitated abuse” is an even broader term, which can cover all sorts of exploitations, including abusers using GPS to track their partners.


    The Free Speech Coalition’s Stabile also suggests substituting “sexually explicit” for “pornographic” when describing images, as it is more precise.

    Language encourages cultural shift, Buster said, and it’s also important to meet people where they are. Many survivors search for “revenge porn” instead of “image-based sexual abuse” when they are seeking help, which is why C. A. Goldberg PLLC uses both terms on its website.

    While shifting culture can take a long time, there is hope. Some problematic terminology has already fallen out of fashion.

    “It’s worth thinking about the fact that 10 to 15 years ago the language for ‘revenge porn’ was ‘sex tape,’” Stabile said. “Being able to call the ‘Paris Hilton sex tape’ the ‘non-consensual intimate imagery of Paris Hilton’ makes a difference.”
