Dean argued that although AI demands on data centers are increasing rapidly, they are growing from a very small base. "There’s been a lot of focus on the increasing energy usage of AI, and from a very small base that usage is definitely increasing," Dean said at a conference, according to Fortune.
"But I think people often conflate that with overall data center usage — of which AI is a very small portion right now but growing fast — and then attribute the growth rate of AI based computing to the overall data center usage," he added.
Dean's employer is one of those sounding a note of concern. “As we further integrate AI into our products, reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI compute, and the emissions associated with the expected increases in our technical infrastructure investment," Google said in its report.
Microsoft co-founder Bill Gates has also recently suggested we need not worry about AI's energy demands, as efficiencies created by the technology would solve the problem. Like Dean, he believes it's not that big of a problem to start with.
"Let’s not go overboard on this," Gates said at a conference in London, per The Guardian. "Data centers are, in the most extreme case, a 6% addition [in energy demand] but probably only 2% to 2.5%. The question is, will AI accelerate a more than 6% reduction? And the answer is: certainly."
More efficient systems through AI
Dean said Google remained on track to source all of its energy from renewable sources by the end of 2030, noting that the company is waiting for more clean energy providers to come online. When that happens, Google's proportion of energy that can be considered "clean" will immediately improve.
That said, Dean admitted that efficiency should be improved, too, saying: "we also want to focus on making our systems as efficient as possible."
Back in 2020, Google researchers warned in a paper about the environmental impact of AI development. They were told to remove their names from the paper if it was to be published, and two of them, including Timnit Gebru, then head of ethics, departed Google amid conflicting accounts. At the time, Dean said in an internal email that the paper "didn't meet our bar for publication" and failed to include recent findings on how models could be made more efficient.
Still awaiting solutions for AI
Despite those findings from four years ago, AI continues to have high energy demands. A recent study from researchers at machine learning company Hugging Face showed that a generative AI system can use as much as 33 times more energy than task-specific software performing the same job.
Part of that is down to the design of these massive systems. "Every time you query the model, the whole thing gets activated, so it’s wildly inefficient from a computational perspective," Hugging Face researcher Sasha Luccioni told the BBC.
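Luccioni's point can be illustrated with a rough back-of-envelope calculation. A common rule of thumb is that a dense transformer performs roughly two floating-point operations per parameter for each token it processes, because every parameter is activated on every query. The model sizes below are hypothetical examples, not figures from the study:

```python
def flops_per_token(num_params: int) -> int:
    """Approximate FLOPs for one forward pass of a dense model,
    using the ~2 * N rule of thumb (every parameter is used per token)."""
    return 2 * num_params

# Hypothetical sizes for illustration only:
generative_model = 7_000_000_000  # a 7B-parameter general-purpose model
task_model = 100_000_000          # a 100M-parameter task-specific model

ratio = flops_per_token(generative_model) / flops_per_token(task_model)
print(f"Compute per query: roughly {ratio:.0f}x more for the general model")
```

Under these assumed sizes, the general-purpose model does about 70 times the arithmetic per token, which is the kind of gap that drives the energy differences the Hugging Face study measured.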