    'Crazy idea' memory device could slash AI energy consumption by up to 2,500 times

By Owen Hughes

    1 day ago


Researchers have developed a new type of memory device that they say could reduce the energy consumption of artificial intelligence (AI) by a factor of at least 1,000.

    Called computational random-access memory (CRAM), the new device performs computations directly within its memory cells, eliminating the need to transfer data across different parts of a computer.

    In traditional computing, data constantly moves between the processor (where data is processed) and the memory (where data is stored) — in most computers this is the RAM module. This process is particularly energy-intensive in AI applications, which typically involve complex computations and massive amounts of data.
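As a rough, purely illustrative sketch (the per-operation energy figures below are order-of-magnitude assumptions, not numbers from the study), the following Python snippet shows why fetching operands from off-chip memory can dominate the energy budget of a large matrix multiplication, the kind of workload AI relies on.

```python
# Back-of-the-envelope model of the compute vs. data-movement energy split.
# The per-operation figures are illustrative assumptions (order of magnitude
# only), not measurements from the CRAM study.

ENERGY_PER_MAC_PJ = 1.0          # assumed energy of one multiply-accumulate, picojoules
ENERGY_PER_DRAM_WORD_PJ = 500.0  # assumed energy to fetch one word from off-chip DRAM

def matmul_energy(n: int) -> tuple[float, float]:
    """Estimate compute vs. data-movement energy for an n x n matrix multiply,
    assuming every operand comes from DRAM (no cache reuse)."""
    macs = n ** 3                       # multiply-accumulates in a naive matmul
    words_moved = 2 * n ** 3 + n ** 2   # two operand reads per MAC plus result writes
    compute_pj = macs * ENERGY_PER_MAC_PJ
    movement_pj = words_moved * ENERGY_PER_DRAM_WORD_PJ
    return compute_pj, movement_pj

compute, movement = matmul_energy(1024)
print(f"compute:       {compute / 1e6:.1f} microjoules")
print(f"data movement: {movement / 1e6:.1f} microjoules")
print(f"movement is ~{movement / compute:.0f}x the compute energy under these assumptions")
```

Compute-in-memory approaches such as CRAM aim to shrink that data-movement term by performing the arithmetic where the data already resides.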

According to figures from the International Energy Agency, global energy consumption for AI could more than double from 460 terawatt-hours (TWh) in 2022 to 1,000 TWh in 2026, roughly equivalent to Japan's total annual electricity consumption.
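For scale, simple arithmetic on the two quoted figures gives the implied growth factor (this is just a check of the numbers above, not an additional projection):

```python
# Growth factor implied by the IEA figures quoted above.
consumption_2022_twh = 460
projected_2026_twh = 1_000
print(f"{projected_2026_twh / consumption_2022_twh:.2f}x")  # ~2.17, i.e. more than double
```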

    Related: Intel unveils largest-ever AI 'neuromorphic computer' that mimics the human brain

In a peer-reviewed study published July 25 in the journal npj Unconventional Computing, researchers demonstrated that CRAM could perform key AI tasks like scalar addition and matrix multiplication in 434 nanoseconds, using just 0.47 microjoules of energy. That is roughly 2,500 times less energy than conventional memory systems with separate logic and memory components, the researchers said.
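Putting the study's headline numbers side by side, the sketch below multiplies the reported CRAM energy by the claimed 2,500-fold advantage to recover the implied energy of the same task on a conventional, separate-logic-and-memory system. This is arithmetic on the reported figures only, not an independent measurement.

```python
# Implied comparison with a conventional (separate logic + memory) system,
# using only the figures reported in the study summary.
cram_energy_uj = 0.47       # reported energy for the benchmarked task on CRAM, microjoules
improvement_factor = 2_500  # reported best-case energy advantage
cram_time_ns = 434          # reported task latency

conventional_energy_uj = cram_energy_uj * improvement_factor
print(f"conventional estimate: {conventional_energy_uj / 1_000:.2f} millijoules "
      f"vs. {cram_energy_uj} microjoules on CRAM, in {cram_time_ns} ns")
```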

    The research, which has been 20 years in the making, received financial backing from the U.S. Defense Advanced Research Projects Agency (DARPA), as well as the National Institute of Standards and Technology, the National Science Foundation and the tech company Cisco.

Jian-Ping Wang, a senior author of the paper and a professor in the University of Minnesota’s department of electrical and computer engineering, said the researchers' proposal to use memory cells for computing was initially deemed "crazy."

    "With an evolving group of students since 2003 and a true interdisciplinary faculty team built at the University of Minnesota — from physics, materials science and engineering, computer science and engineering, to modeling and benchmarking, and hardware creation — [we] now have demonstrated that this kind of technology is feasible and is ready to be incorporated into technology," Wang said in a statement .

    The most efficient RAM devices typically use four or five transistors to store a single bit of data (either 1 or 0).


CRAM gets its efficiency from devices called "magnetic tunnel junctions" (MTJs). An MTJ is a tiny device that stores data using the spin of electrons rather than the electrical charge that traditional memory relies on. This makes it faster, more energy-efficient and more resistant to wear and tear than conventional memory chips such as RAM.
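As a highly simplified conceptual model (not the device physics or circuit design from the paper), an MTJ can be pictured as a two-terminal element whose electrical resistance depends on whether the magnetizations of its two layers are parallel or antiparallel. The toy Python class below, with made-up resistance values, encodes a bit in that resistance and reads it back with a threshold comparison.

```python
# Toy model of a magnetic tunnel junction (MTJ) as a two-state resistor.
# The resistance values and threshold are illustrative assumptions.

R_PARALLEL = 1_000.0      # low-resistance state (layer magnetizations aligned)  -> logical 0
R_ANTIPARALLEL = 2_000.0  # high-resistance state (magnetizations opposed)       -> logical 1
READ_THRESHOLD = 1_500.0  # midpoint used to decide which state was stored

class ToyMTJ:
    def __init__(self) -> None:
        self.resistance = R_PARALLEL

    def write(self, bit: int) -> None:
        """Store a bit by switching the free layer's magnetization (modeled as resistance)."""
        self.resistance = R_ANTIPARALLEL if bit else R_PARALLEL

    def read(self) -> int:
        """Recover the bit by comparing the measured resistance to a threshold."""
        return 1 if self.resistance > READ_THRESHOLD else 0

cell = ToyMTJ()
cell.write(1)
print(cell.read())  # -> 1
```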

    CRAM is also adaptable to different AI algorithms, the researchers said, making it a flexible and energy-efficient solution for AI computing.

The focus now turns to industry, where the research team hopes to run larger-scale demonstrations of CRAM and work with semiconductor companies to scale up the technology.
