
    New Brunswick Law Firm Develops Team to Address AI-related Legal Issues

By Chuck O'Donnell

    21 days ago


    The New Brunswick-based law firm of Hoagland, Longo, Moran, Dunst & Doukas has assembled a team of attorneys helmed by managing partner Chad Moore to educate, counsel and advise clients in the developing area of artificial intelligence law.

    Credits: shutterstock/metamorworks

    NEW BRUNSWICK – The advent of artificial intelligence promises to revolutionize aspects of everyday life in fields ranging from the arts to agriculture, entertainment to education, health care to hospitality.

Nascent chatbot technologies such as ChatGPT, Microsoft Copilot and Google Gemini, and other large language models, however, are prompt-driven. Because they rely on the accuracy of the information they receive, their output can be prone to errors, omissions, misrepresentations, biases and more. And what are the ramifications if they’re fed information that is legally protected, such as confidential medical records or sealed court documents? Or protected by copyright laws?


    While the legal system races to catch up to trademark compliance, data privacy and other issues that are already beginning to arise because of the expanded use of AI, one New Brunswick-based firm has launched new services for clients with legal questions surrounding its use and its users.

    The firm of Hoagland, Longo, Moran, Dunst & Doukas has assembled a team of attorneys helmed by managing partner Chad Moore to educate, counsel and advise clients in this developing area of law.

Moore, who was part of a state Supreme Court committee tasked with developing AI-use guidelines for lawyers, said the firm’s team consists of attorneys with expertise in areas such as litigation, transactional services, compliance, and copyright and intellectual property.

    “I think at the point you feel you’ve been harmed or damaged, you want to just come in and have a consult with us or somebody else just to make sure,” Moore said. “A really good point is when you have the idea maybe you’re an entrepreneur and you want to create this app or something that relies on AI and you just want to have your ducks in a row, forming a business based on that idea, you want to have a consult.


    “If you’re a business and certainly you have all this data and want to use AI to manage it, you want to reach out. If you’re a firm or a business and you don’t even have an AI policy because you’re not sure what it should contain or where your perils might be, you want to reach out.”

    The Biden Administration in March issued new guidelines on how federal agencies can and cannot use AI – an important step as society grapples with how to use it.

Among other directives, the guidelines call for each agency to appoint a chief AI officer, a senior role that will oversee the implementation of AI. They also outline how the government is trying to grow the workforce focused on AI, including a plan to hire at least 100 professionals in the field by the end of summer, according to an NPR report.

Those guidelines were silent, however, on what Moore cites as one area that surely will come into question with the increased use of AI: intellectual property. What happens if someone feeds several news articles about, say, Rutgers, produced by various media outlets into ChatGPT and publishes the resulting article?

    “There are many issues with that,” he said. “Where does your work live? Is it behind a paywall? How is AI using it? Who’s telling the AI to use it? Are they benefiting from that?”

The front line in one of the most high-profile battles over permitted uses of AI wasn’t formed in some mahogany and leather courtroom, but on Hollywood picket lines. The 148-day strike in 2023 by screenwriters, which actors later joined, sought to ensure AI couldn’t be used on their work without their approval. Similarly, actors sought guidelines to ensure the technology couldn’t recreate their performances, the Associated Press reported.

    “The easiest example would be, just because AI could process your voice after hearing a certain number of words and then (could) create a whole dialogue with your voice and your inflections in your tone,” Moore said. “The actors were worried about that because the studios and everyone has their voices and they could then, for example, create a whole commercial with your voice without you ever having to be there to perform or get compensated for it.”

As the practical implications of AI grow each day and the legal system tries to keep up, Moore said his AI team is uniquely qualified to help its clients because it has not only the legal acumen, but also a firm grasp of the technology involved.

    “We have some newer attorneys who know the ins and outs of everything you can imagine on a computer and sort of keep up with the times and understand how these models work, using them every day,” Moore said. “(It goes) all the way up to someone like myself who is very, very interested in the tech and what it can do for the efficiency of the firm, but also connected enough to the committees and other law firms and people in the area to make sure we’re up on the expertise as a department.”

    For more local news, visit TAPinto.net
