
    Local filmmaker grapples with AI in the industry

By Sammie Purcell


    Laila Harrison (right), the filmmaker behind the exhibition “Where the Mind Goes” that played at Eyedrum over the month of June (Photo by Ben Garden).

    A couple of years ago, Laila Harrison wasn’t even thinking about AI. But in May, “Where the Mind Goes” – billed as an immersive AI video installation, created by Harrison – debuted at Eyedrum.

The use of artificial intelligence in art and filmmaking is an ongoing conversation, especially since last year’s writers and actors strikes, and one that still evolves day by day. The strikes centered on the very real fear that studios could use tools like ChatGPT, Midjourney, or the upcoming text-to-video generator Sora to replace human workers, as well as concerns about actors’ control over their likeness and when it’s used. It was around this time that Harrison also started to consider how AI might affect the film industry and her role within it moving forward.

    “I did not see myself being someone that uses AI,” Harrison said. “I think a lot of people that got involved with AI are very similar.”

    Harrison decided to attend a tech conference to learn more about how companies intended to use AI. But she walked away feeling like she had more questions than answers.

    “If no one has intentions for people like me, or no one has intentions for anyone, it’s up to me to figure out what the framework for utilizing these tools looks like,” she said.

The strikes eventually ended, with both the writers and actors guilds reaching landmark deals concerning AI. But some guild members still had concerns over whether those protections would be enough, and the looming question of how AI would affect the film industry did not go away. Every day, filmmakers and creatives like Harrison are wrestling with how to approach AI in a world where the use of the technology in filmmaking seems increasingly inevitable.

Audio_Video_Club, a local nonprofit that aims to support independent filmmakers, helped Harrison put on her exhibition at Eyedrum, which included a roughly 15-minute short film with voiceovers from different actors. Harrison asked the voiceover performers to answer a series of prompts and then used generative AI to create images based on those voiceovers.

    Like most AI-generated images and videos, there is something a little uncanny and unreal about the images in the short film. Harrison considers that a good thing.

    “You can still sense this fakeness,” she said. “Which I’m glad that you can. When you look at it, I want you to see that. I want to be … transparent. This is AI.”

Harrison said she sees usefulness in AI as a potential visualization tool, a way for filmmakers to more easily show producers and potential funding sources what their film might look like before they even get started. But even though she sees this as a potential benefit, she also has concerns.

    “I think that’s why I’m trying to dive deeper into it,” she said. “So I can learn how to mitigate it.”

Harrison said she pays close attention to the companies making this technology – such as OpenAI, which makes ChatGPT and the upcoming tool Sora – and how transparent they are about what their models are trained on (OpenAI has been tight-lipped about how Sora was trained, and the New York Times sued OpenAI and Microsoft for copyright infringement last year). She also has questions about how everyone – not just actors and filmmakers – can protect their likeness and what a legal framework for that will look like moving forward.

    “The main thing I’m talking to people about is protecting your likeness,” Harrison said. “I don’t think this is just for filmmaking. I think this is for everyone. How can we start creative regulations to protect our likeness? What is privacy about to look like for everyone?”

    The New York Times is not the only company suing an AI platform. According to Matthew Sag – a professor at Emory University who specializes in law and artificial intelligence, machine learning, and data science – as of June 25 there were around 24 different court cases against various AI companies alleging that the copying necessary at the beginning of the AI’s training process is copyright infringement.

    Sag said that the training process for an AI platform starts with copying data. In almost every case, that data is scraped from the open internet without express approval. Sag said that while that might sound like a straightforward case of copyright infringement, it’s a little more complicated than it seems.

    “Although massive quantities of copyrighted works have been copied, they’re almost like invisible copies. They’re copies that no human is ever going to look at or experience or enjoy,” Sag said. “Those copies are just used to train what is more or less just a very large fancy algorithm, so that that algorithm can create new things in response to new inputs.”

    Sag said when it comes to copyright, the most important question in these cases is how similar the outputs are to the inputs.

“The question is, are they just learning abstract and uncopyrightable features, or are they actually copying significant amounts of training data and repackaging them for a new audience?” Sag said. “That’s just a difficult fact question, at the moment.”

    Harrison does not think it’s ethical for AI platforms to train their models on current artists’ work. For her, it always comes back to the question of transparency and how that issue will be addressed moving forward.

She brought up the European Union’s AI Act, recently established as the world’s first comprehensive regulation on artificial intelligence. The act establishes a common set of rules for companies in the EU on their use of artificial intelligence. It bans specific uses that “threaten citizens’ rights,” sets out transparency requirements, and requires that “general purpose AI systems” comply with copyright law and share summaries of their training data.

She also talked about Cara, an anti-AI social media platform for artists that holds that AI in its current form is unethical. Harrison sees things like Cara and the AI Act as a way forward.

    “Pandora’s Box has already been opened,” Harrison said. “How do we find hope at the bottom of the box?”

