    Nvidia CEO Jensen Huang tells David Solomon companies get things done 20 times as fast with generative AI

    By Emma Cosgrove

    7 hours ago

    Nvidia CEO Jensen Huang.
    • Nvidia CEO Jensen Huang discussed AI-infrastructure ROI with Goldman Sachs CEO David Solomon.
    • Huang said providers of AI cloud computing make $5 for every $1 they spend on GPUs.
    • In the next decade, he predicts smaller, liquid-cooled data centers that are energy- and cost-efficient.

    "What Nvidia's really good at is creating new markets," Nvidia CEO Jensen Huang told Goldman Sachs CEO David Solomon at a technology conference hosted by the investment bank Wednesday. Nvidia investors and AI stakeholders continue to hang on to Huang's every word since Nvidia's less-than-blockbuster (though only when compared with previous blowouts) earnings report in August.

    While Nvidia's chips may seem ubiquitous today, Huang said the company had to spread the gospel of GPU-driven computing "one industry at a time." That part is clear to investors, with ample evidence from Nvidia's eye-watering revenues and estimates of total Big Tech AI-infrastructure spending surpassing $1 trillion.

    Solomon asked perhaps the most important question in tech today: Where is the return on all this investment in AI infrastructure?

    Huang has addressed similar questions before, but on Wednesday, he offered more math in a twofold response.

    First, in the age of generative artificial intelligence, Huang said, cloud providers who buy Nvidia (and other) graphics processing units and rent them to tech companies make $5 for every $1 they spend.

    "Everything is all sold out," he said. "And so the demand for this is just incredible."

    Second, he pointed to those cloud providers' customers, who essentially rent computing time on GPUs. If companies convert traditional data-processing work to accelerated computing methods, Huang said, the incremental cost may double, but the job would be done 20 times as fast. "So you get a 10x savings," Huang said, adding: "It's not unusual to see this ROI."
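    To make that arithmetic explicit (a back-of-the-envelope reading of Huang's figures, not a calculation Nvidia has published): if the accelerated version of a job costs about twice as much to run but finishes 20 times as fast, the cost of completing the same work falls to 2 ÷ 20 = 1/10 of the original, which is roughly the 10x savings Huang described.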

    Data center of the future

    In his overarching answer to where AI's ROI will show up, Huang urged companies to "accelerate everything."

    "Any large-scale processing of any large amounts of data — you've got to accelerate that," he said.

    Huang said that upgrading existing data centers to "accelerated computing," or the parallel computing that Nvidia GPUs and other AI chips enable, was inevitable.

    With price tags for some models reaching into the millions, Nvidia server racks sound expensive, Huang said, but they replace "thousands of nodes" of traditional computing. Smaller, more densely packed data centers with liquid cooling are the future, he added.

    "Densified" data centers will be more energy- and cost-efficient, Huang said, adding: "That's the next 10 years."

    Nvidia's stock price rallied less than an hour after Huang's conversation with Solomon began.

    Even if Nvidia's claims that AI computing is more energy-efficient hold up, the AI boom is expected to put massive pressure on electricity grids.

    Huang acknowledged that the stakes are high for all involved in the boom.

    "Demand is so great that delivery of our components and our technology and our infrastructure and software is really emotional for people because it directly affects their revenues," Huang said.

    Read the original article on Business Insider