    The U.S. Regulates Cars, Radio and TV. When Will It Regulate AI?

    By Ian Prasad Philbrick
    2023-08-24
    Samuel Altman, CEO of OpenAI, at a congressional hearing on artificial intelligence in Washington, May 16, 2023. (Haiyun Jiang/The New York Times)

    As increasingly sophisticated artificial intelligence systems with the potential to reshape society come online, many experts, lawmakers and even executives of top AI companies want the U.S. government to regulate the technology, and fast.

    “We should move quickly,” Brad Smith, the president of Microsoft, which launched an AI-powered version of its search engine this year, said in May. “There’s no time for waste or delay,” Chuck Schumer, the Senate majority leader, has said. “Let’s get ahead of this,” said Sen. Mike Rounds, R-S.D.

    Yet history suggests that comprehensive federal regulation of advanced AI systems probably won’t happen soon. Congress and federal agencies have often taken decades to enact rules governing revolutionary technologies, from electricity to cars. “The general pattern is, it takes a while,” said Matthew Mittelsteadt, a technologist who studies AI at George Mason University’s Mercatus Center.

    In the 1800s, it took Congress more than half a century after the introduction of the first public, steam-powered train to give the government the power to set price rules for railroads, the first U.S. industry subject to federal regulation. In the 20th century, the bureaucracy slowly expanded to regulate radio, television and other technologies. And in the 21st century, lawmakers have struggled to safeguard digital data privacy.

    It’s possible that policymakers will defy history. Members of Congress have worked furiously in recent months to understand and imagine ways to regulate AI, holding hearings and meeting privately with industry leaders and experts. Last month, President Joe Biden announced voluntary safeguards agreed to by seven leading AI companies.

    But AI also presents challenges that could make it even harder — and slower — to regulate than past technologies.

    <b>The Hurdles</b>

    To regulate a new technology, Washington first has to try to understand it. “We need to get up to speed very quickly,” Sen. Martin Heinrich, D-N.M., who is part of a bipartisan working group on AI, said in a statement.

    That typically happens faster when new technologies resemble older ones. Congress created the Federal Communications Commission in 1934, when television was still a nascent industry, and the FCC regulated it based on earlier rules for radio and telephones.

    But AI, some advocates for regulation argue, combines the potential for privacy invasion, misinformation, hiring discrimination, labor disruptions, copyright infringement, electoral manipulation and weaponization by unfriendly governments in ways that have little precedent. That’s on top of some AI experts’ fears that a superintelligent machine might one day end humanity.

    While many want fast action, it’s hard to regulate technology that’s evolving as quickly as AI. “I have no idea where we’ll be in two years,” said Dewey Murdick, who leads Georgetown University’s Center for Security and Emerging Technology.

    Regulation also means minimizing potential risks while harnessing potential benefits, which for AI can range from drafting emails to advancing medicine. That’s a tricky balance to strike with a new technology. “Often, the benefits are just unanticipated,” said Susan Dudley, who directs George Washington University’s regulatory studies center. “And, of course, risks also can be unanticipated.”

    Overregulation can quash innovation, Dudley added, driving industries overseas. It can also become a means for larger companies with the resources to lobby Congress to squeeze out less-established competitors.

    Historically, regulation often happens gradually as a technology improves or an industry grows, as with cars and television. Sometimes it happens only after tragedy. When Congress passed, in 1906, the law that led to the creation of the Food and Drug Administration, it didn’t require safety studies before companies marketed new drugs. In 1937, an untested and poisonous liquid version of sulfanilamide, meant to treat bacterial infections, killed more than 100 people across 15 states. Congress strengthened the FDA’s regulatory powers the following year.

    “Generally speaking, Congress is a more reactive institution,” said Jonathan Lewallen, a University of Tampa political scientist. The counterexamples tend to involve technologies that the government effectively built itself, like nuclear power development, which Congress regulated in 1946, one year after the first atomic bombs were detonated.

    “Before we seek to regulate, we have to understand why we are regulating,” said Rep. Jay Obernolte, R-Calif., who has a master’s degree in AI. “Only when you understand that purpose can you craft a regulatory framework that achieves that purpose.”

    <b>Brain Drain</b>

    Even so, lawmakers say they’re making strides. “I actually have been very impressed with my colleagues’ efforts to educate themselves,” Obernolte said. “Things are moving, by congressional standards, extremely quickly.”

    Regulation advocates broadly agree. “Congress is taking the issue really seriously,” said Camille Carlton of the Center for Humane Technology, a nonprofit that regularly meets with lawmakers.

    But in recent decades, Congress has changed in ways that could impede translating studiousness into legislation. For much of the 20th century, the leadership and staff of congressional committees dedicated to specific policy areas — from agriculture to veterans’ affairs — served as a kind of institutional brain trust, shepherding legislation and often becoming policy experts in their own right. That started to change in 1995, when Republicans led by Newt Gingrich took control of the House and slashed government budgets. Committee staffs stagnated, and some of the committees’ power to shape policy devolved to party leaders.

    “Congress doesn’t have the kind of analytic tools that it used to,” said Daniel Carpenter, a Harvard professor who studies regulation.

    For now, AI policy remains notably bipartisan. “These regulatory issues we’re grappling with are not partisan issues, by and large,” said Obernolte, who helped draft a bipartisan bill that would give researchers tools to experiment with AI technologies.

    But partisan infighting has already helped snarl regulation of social media, an effort that also began with bipartisan support. And even if lawmakers agreed on a comprehensive AI bill tomorrow, next year’s elections and competing legislative priorities — like funding the government and, perhaps, impeaching Biden — could consume their time and attention.

    <b>A Department of Information?</b>

    If federal regulation of AI did emerge, what might it look like?

    Some experts say a range of federal agencies already have regulatory powers that cover aspects of AI. The Federal Trade Commission could use its existing antitrust powers to prevent larger AI companies from dominating smaller ones. The FDA has already authorized hundreds of AI-enabled medical devices. And piecemeal, AI-specific regulations could trickle out from such agencies within a year or two, experts said.

    Still, drawing up rules agency by agency has downsides. Mittelsteadt called it “the too-many-cooks-in-the-kitchen problem, where every regulator is trying to regulate the same thing.” Similarly, state and local governments sometimes regulate technologies before the federal government, such as with cars and digital privacy. The result can be contradictions for companies and headaches for courts.

    But some aspects of AI may not fall under any existing federal agency’s jurisdiction — so some advocates want Congress to create a new one. One possibility is an FDA-like agency: Outside experts would test AI models under development, and companies would need federal approval before releasing them. Call it a “Department of Information,” Murdick said.

    But creating a new agency would take time — perhaps a decade or more, experts guessed. And there’s no guarantee it would work. Miserly funding could render it toothless. AI companies could claim its powers were unconstitutionally overbroad, or consumer advocates could deem them insufficient. The result could be a prolonged court fight or even a push to deregulate the industry.

    Rather than a one-agency-fits-all approach, Obernolte envisions rules that accrete as Congress enacts successive laws in coming years. “It would be naive to believe that Congress is going to be able to pass one bill — the AI Act, or whatever you want to call it — and have the problem be completely solved,” he said.

    Heinrich said in his statement, “This will need to be a continuous process as these technologies evolve.” Last month, the House and Senate separately passed several provisions about how the Defense Department should approach AI technology. But it is not yet clear which provisions will become law, and none would regulate the industry itself.

    Some experts aren’t opposed to regulating AI one bill at a time. But they’re anxious about any delays in passing them. “There is, I think, a greater hurdle the longer that we wait,” Carlton said. “We’re concerned that the momentum might fizzle.”

    This article originally appeared in <a href="https://www.nytimes.com/2023/08/24/upshot/artificial-intelligence-regulation.html">The New York Times</a>.
