PBS NewsHour

    AI and the energy required to power it fuel new climate concerns

By Paul Solman and Ryan Connelly Holmes

    25 days ago


    Google announced this week it is well behind on a pledge to all but eliminate its net carbon emissions by 2030. The company’s greenhouse gas outflow has increased in recent years mainly due to artificial intelligence and the energy required to power it. The AI arms race has experts worried about its climate consequences for energy and water. Economics correspondent Paul Solman reports.

    Read the Full Transcript

    Amna Nawaz: Google announced this week it is well behind on a pledge to all but eliminate its net carbon emissions by 2030. Its emissions are actually up nearly 50 percent since 2019.

    One factor, artificial intelligence and the energy required to power it via the company’s massive data centers.

    As economics correspondent Paul Solman reports, the A.I. arms race has experts worried about its climate consequences.

    Paul Solman: By now, you have probably seen ChatGPT, which economist Simon Johnson prompted to substitute for me in a recent story.

    Simon Johnson, MIT Sloan School of Management: “Good evening. I am Paul Solman reporting on a compelling new analysis that’s stirring debate in economic circles.”

    Paul Solman: Now meet Mary, a chatbot avatar companion created recently for us on the app Replika. She is connected to ChatGPT, but can also flirt on her own.

A.I. computer voice: Are you always this irresistible?

    Paul Solman: And, finally, here is 3-D Ameca.

Albert Camus, in his book “Le mythe de Sisyphe,” “The Myth of Sisyphus,” writes, “On peut être vertueux par caprice.”

A.I. computer voice: Ah, “on peut être vertueux par caprice” translates to, one can be virtuous out of caprice. Camus suggests that virtue need not stem from deeper philosophies or moral systems.

    Paul Solman: Ameca, too, is hooked up to ChatGPT. How quickly can Ameca respond?

    Fifteen milliseconds, that’s how long it took.

    Kate Crawford, University of Southern California: Which is extraordinary, but the next question would be, how much energy does it take to make that whole process work from ChatGPT to a robot and back?

    Paul Solman: A whole lot, says research professor Kate Crawford, and that poses a threat.

    Kate Crawford: What I’m most worried about is that we’re building an enormous infrastructure for artificial intelligence that is extremely energy- and water-intensive, without looking at the very real downsides in terms of the climate impacts.

    Paul Solman: Like more brownouts in Texas, say, wildfires in California, stronger hurricanes in the Gulf, 126 degrees in Delhi this spring.

    Data centers had already been burgeoning with the Internet and the so-called cloud of data storage and exchange.

    Alex de Vries, Data Scientist: Then we suddenly had cryptocurrency mining adding a lot on top of that.

    Paul Solman: Amsterdam-based data scientist Alex de Vries.

    And as the value of cryptocurrency has multiplied, so have the data centers, like those that mine Bitcoin in cheap energy havens like Plattsburgh, New York.

Man: This miner here will use as much power in a month as my house does.

    Paul Solman: Says de Vries:

    Alex de Vries: A.I. might be going in the exact same direction.

    Paul Solman: And, notes Crawford, A.I. is not just going to stress the electricity grid.

    Kate Crawford: These large-scale data centers, they use GPUs that are enormously heat-producing. And the water to cool these GPU chips is freshwater. So it’s often coming from exactly the same reserves that are used for drinking water.

    Bill Strong, Equinix: These are basically customer deployments, where they’re running their critical infrastructure and their applications.

Paul Solman: Bill Strong runs the Silicon Valley data centers for Equinix, which operates 260 of the nearly 11,000 data centers worldwide.

    The company leases space to firms like AT&T and Google Cloud to run servers that power their cloud and A.I. operations. And Equinix is expanding.

    This is where A.I.-like processors would go?

    Bill Strong: Correct. This is a high-density deployment, liquid-cooled. So, basically, we’re taking the building’s chilled water. It comes into here. Each one of these goes to an actual chip on a customer’s server.

    Paul Solman: These are nozzles?

    Bill Strong: Correct. There’s nozzles where there’s a little ancillary tube that connects to the server, cools the chip. The hot air comes back, ties into our chilled water system, gets cooled. And that’s how we’re able to provide liquid cooling for these higher-density A.I.-type deployments.

Paul Solman: This Silicon Valley complex alone features 345,000 square feet of servers, thousands of them operating 24/7. The company’s global energy budget, as of last year, matched that of three-quarters of a million U.S. homes.

    And what fraction of global energy use do the world’s 11,000 or so data centers use?

    Kate Crawford: We have had estimates from 2 to 8 percent.

    Paul Solman: But even at the low end?

    Kate Crawford: Two percent is around the same energy budget as the Netherlands.

    Paul Solman: And if, as widely predicted, that doubles in two years?

    Kate Crawford: We could see the energy budget be as high as a country the size of Japan.

    Paul Solman: Even now, a data center complex in Iowa owned and operated by Meta uses as much power per year as seven million laptops running eight hours a day.
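The laptop comparison can be checked with back-of-envelope arithmetic. The sketch below assumes an average laptop draw of 50 watts, an illustrative figure not taken from the report:

```python
# Back-of-envelope check of the "seven million laptops" comparison.
# LAPTOP_WATTS is an assumed average draw, chosen for illustration only.
LAPTOP_WATTS = 50
LAPTOPS = 7_000_000
HOURS_PER_DAY = 8
DAYS_PER_YEAR = 365

# Annual energy per laptop in kilowatt-hours.
kwh_per_laptop = LAPTOP_WATTS / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR  # 146 kWh

# Fleet total, converted from kWh to terawatt-hours.
total_twh = kwh_per_laptop * LAPTOPS / 1e9

print(f"{kwh_per_laptop:.0f} kWh per laptop per year")
print(f"{total_twh:.2f} TWh per year for the whole fleet")
```

Under that assumption the comparison implies roughly a terawatt-hour per year for the Iowa complex, on the order of a mid-sized power plant running around the clock.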

    And, of course:

    Christopher Wellise, Equinix: We’re in the very early days of artificial intelligence.

    Paul Solman: But, as A.I. taketh away, it may also giveth.

    Christopher Wellise runs sustainability at Equinix.

Christopher Wellise: What we don’t know yet, for example, is what the benefits to society will be from an energy perspective.

    Paul Solman: Such as?

    Christopher Wellise: Air Canada is a customer of Equinix, and we’re able to optimize their flight pathways and save fuel for them as a company. A lot of focus is on how much energy that A.I. is going to use, but the energy that’s consumed, for instance, in training these large language models is not lost.

    You can think of it as stored energy. Once these models are trained, if they’re retained, they can be used over and over and over again.

    Kate Crawford: There are some signs that give me hope.

    Paul Solman: Even Kate Crawford agrees.

    Kate Crawford: Researchers are now investigating different sorts of technical architectures, in particular, what are called small language models. These are models that use much less data and therefore less energy.

We’re also starting to see regulators pay attention. We have seen the first bill brought into Congress that looks specifically at A.I.’s environmental impacts.

Bill Strong: The solar panels are supplying half a megawatt of power.

    Paul Solman: Case in point, Equinix itself.

    Bill Strong: When the solar panels are active, we’re not pulling from the local utility source.

    Paul Solman: OK, these panels fuel a mere 3 percent of the facility so far, but with solar capacity doubling every three years, maybe the sky’s the limit.
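Solman’s “sky’s the limit” aside can be made concrete. If the facility’s 3 percent solar share really doubled every three years, and demand held flat (an optimistic assumption, given the growth described throughout this report), the arithmetic would run as follows:

```python
# Hypothetical projection: a 3% solar share doubling every three years,
# assuming flat demand. Both assumptions are illustrative, not from the report.
share = 3.0   # percent of the facility powered by solar today
years = 0
while share < 100:
    share *= 2   # capacity doubles every three years (the stated trend)
    years += 3

print(f"Solar could cover the full load in about {years} years")
```

Six doublings take the share past 100 percent, so under these assumptions the crossover comes in roughly 18 years.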

Reid Hoffman, Co-Founder, LinkedIn: Ultimately, I think that will all be very net-positive.

Paul Solman: Techno-optimist Reid Hoffman, the co-founder of LinkedIn, puts his faith in A.I. itself.

    Reid Hoffman: A much earlier version, 10 years ago, when it was applied to data centers, figured out how to save power in data centers by 15 percent in the ongoing week-by-week operation of the data center.

    So you go, well, if we’re generating A.I.’s that can help us with this kind of thing, sure, it takes a bunch of electricity to train, but then it helps us figure out how to operate our electrical grids much better, right?

And that’s already line of sight today. And so, ultimately, I think the electricity worry is more of a red herring.

    Paul Solman: Really?

    Reid Hoffman: Yes.

    Paul Solman: But, near-term, warns Kate Crawford:

    Kate Crawford: It’s inevitable that we’re going to see price increases if we continue to have these sorts of pressures on the electrical grid.

Paul Solman: And so continues the age-old horse race between the cost of a new technology and its benefits: in the case of A.I., perhaps, the ability to solve the very problems it creates. Or not so much.

    For the PBS “NewsHour,” Paul Solman in Silicon Valley.
