University of Minnesota device slashes AI energy consumption

Researchers at the University of Minnesota Twin Cities have developed a new hardware device that could reduce AI energy consumption by a factor of at least 1,000.

This breakthrough represents a significant leap forward in the quest for more energy-efficient AI applications.

Addressing the energy demands of AI

With AI applications increasingly prevalent, there is a pressing need to enhance energy efficiency without compromising performance or escalating costs.

Traditional AI processes consume vast amounts of power by constantly transferring data between logic (processing) and memory (storage).

The University of Minnesota’s new design, called computational random-access memory (CRAM), addresses this issue by keeping data within the memory array for processing.

“This work is the first experimental demonstration of CRAM, where data can be processed entirely within the memory array without needing to leave the grid where a computer stores information,” explained Yang Lv, a postdoctoral researcher in the Department of Electrical and Computer Engineering and lead author of the study.
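
To illustrate the difference in rough terms, the toy Python sketch below counts word transfers between a separate processor and memory, and compares that with an operation performed where the data already sits. This is a conceptual sketch only; the class names, the transfer counter, and the simple addition are illustrative assumptions and are not taken from the CRAM paper.

```python
# Toy comparison of data movement in a conventional (von Neumann) flow
# versus an in-memory flow. Illustrative only; names and "transfer"
# accounting are hypothetical, not the actual CRAM design.

class VonNeumannMachine:
    """Operands are copied from memory to the processor and back."""
    def __init__(self, memory):
        self.memory = list(memory)
        self.transfers = 0  # words moved across the memory/processor boundary

    def add(self, i, j, dest):
        a = self.memory[i]
        b = self.memory[j]
        self.transfers += 2          # two loads across the bus
        result = a + b               # compute in the separate processor
        self.memory[dest] = result
        self.transfers += 1          # one store back to memory


class InMemoryMachine:
    """Operands never leave the memory array; the result is written in place."""
    def __init__(self, memory):
        self.memory = list(memory)
        self.transfers = 0  # nothing crosses a memory/processor boundary

    def add(self, i, j, dest):
        # The operation happens where the data already lives.
        self.memory[dest] = self.memory[i] + self.memory[j]


if __name__ == "__main__":
    data = [3, 5, 0]
    vn = VonNeumannMachine(data)
    im = InMemoryMachine(data)
    vn.add(0, 1, 2)
    im.add(0, 1, 2)
    print("von Neumann transfers:", vn.transfers)  # 3 per addition
    print("in-memory transfers:  ", im.transfers)  # 0
```

In this toy model, each conventional addition costs three transfers across the memory/processor boundary, while the in-memory version costs none; the energy gains claimed for CRAM come from eliminating exactly this kind of traffic in hardware.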

CRAM: A game-changer in AI energy efficiency

The International Energy Agency (IEA) predicts that AI energy consumption will double from 460 terawatt-hours (TWh) in 2022 to 1,000 TWh in 2026, comparable to Japan’s total electricity consumption.

CRAM-based machine learning inference accelerators could achieve energy improvements on the order of 1,000 times, with some applications showing energy savings of 2,500 and 1,700 times compared to traditional methods.

“Our initial concept to use memory cells directly for computing 20 years ago was considered crazy,” said Jian-Ping Wang, senior author of the paper and a Distinguished McKnight Professor at the University of Minnesota.

The interdisciplinary team, comprising experts from physics, materials science, computer science, and engineering, has been developing this technology since 2003.

The research builds on the team’s patented work on Magnetic Tunnel Junctions (MTJs), nanostructured devices used in hard drives, sensors, and other microelectronics systems, including Magnetic Random Access Memory (MRAM).

CRAM leverages these advancements to perform computations directly within memory cells, eliminating slow and energy-intensive data transfers typical of traditional architectures.

Breaking the von Neumann bottleneck

CRAM architecture overcomes the bottleneck of the traditional von Neumann architecture, where computation and memory are separate entities.

“CRAM is very flexible; computation can be performed in any location in the memory array,” said Ulya Karpuzcu, an Associate Professor and expert on computing architecture.

This flexibility allows CRAM to match the performance needs of various AI algorithms more efficiently than traditional systems.
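
As a rough illustration of that flexibility, the sketch below applies a bitwise majority operation, a common primitive in in-memory logic, to arbitrarily chosen rows of a small bit array and writes the result in place. The array size, the helper function, and the row choices are hypothetical and do not represent the actual MTJ-based operations of the device.

```python
# Minimal sketch of the "compute anywhere in the array" idea using NumPy.
# A real CRAM array performs logic with magnetic tunnel junctions; the
# array, the majority-gate helper, and the row choices here are purely
# illustrative assumptions.
import numpy as np

memory = np.random.randint(0, 2, size=(8, 16), dtype=np.uint8)  # 8 rows of 16 bits

def majority_in_place(array, row_a, row_b, row_c, row_dest):
    """Write the bitwise majority of three rows into a fourth row, in place.

    Any rows of the array can serve as operands or destination, mimicking
    the flexibility of choosing where computation happens.
    """
    total = array[row_a].astype(np.uint8) + array[row_b] + array[row_c]
    array[row_dest] = (total >= 2).astype(np.uint8)

# The same operation can target any location in the array:
majority_in_place(memory, 0, 1, 2, 3)
majority_in_place(memory, 4, 5, 6, 7)
print(memory[3], memory[7])
```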

CRAM uses significantly less energy than current random access memory (RAM) devices, which rely on multiple transistors to store each bit of data.

By employing MTJs—a type of spintronic device that uses electron spin instead of electrical charge—CRAM provides a more efficient alternative to traditional transistor-based chips.

The University of Minnesota team is now collaborating with semiconductor industry leaders to scale up demonstrations and produce the hardware necessary to reduce AI energy consumption on a larger scale.

The development of CRAM technology represents a monumental step towards sustainable AI computing.

By dramatically reducing AI energy consumption while maintaining high performance, this innovation promises to meet the growing demands of AI applications and pave the way for a more efficient and environmentally friendly future.
