This breakthrough memory technology could make AI 1,000 times more efficient

We all know AI has a power problem: taken as a whole, global AI usage already draws as much energy as the entire nation of Cyprus did in 2021.

But engineering researchers at the University of Minnesota Twin Cities have developed and demonstrated a new computer memory design that could drastically reduce the amount of energy AI systems consume, helping to temper this problem. Their research was recently published in the Nature journal Unconventional Computing.


Most modern computing systems are built on what is known as the Von Neumann architecture, where the logic and memory subsystems are separated. During normal operations, data is shuttled back and forth between the memory modules and processors. This is the basic foundation on which modern computers operate.

However, as processing speeds rapidly outpace I/O technology, this data transfer becomes a bottleneck both in terms of processing speed (a limitation known as the memory wall problem) and power consumption. As the researchers pointed out, just shuffling the data back and forth consumes as much as 200 times the power that the computations themselves do.
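A quick back-of-envelope model makes the scale of that claim concrete. The 200x figure comes from the article; the unit energies and operation count below are purely illustrative, a minimal sketch rather than a measurement:

```python
# Illustrative model of the "memory wall" energy cost described above.
# Only the 200x ratio is from the article; all other numbers are made up.

COMPUTE_ENERGY = 1.0     # arbitrary energy unit per arithmetic operation
TRANSFER_ENERGY = 200.0  # moving the operands costs ~200x the compute itself

def von_neumann_energy(ops: int) -> float:
    """Energy when every operand is shuttled between memory and processor."""
    return ops * (COMPUTE_ENERGY + TRANSFER_ENERGY)

def in_memory_energy(ops: int) -> float:
    """Energy when logic runs inside the memory array, so nothing is shuttled."""
    return ops * COMPUTE_ENERGY

ops = 1_000_000
movement_share = TRANSFER_ENERGY / (COMPUTE_ENERGY + TRANSFER_ENERGY)
print(f"data movement share of total energy: {movement_share:.1%}")
print(f"conventional vs. in-memory energy: {von_neumann_energy(ops) / in_memory_energy(ops):.0f}x")
```

Under these toy numbers, data movement accounts for well over 99% of the energy budget, which is why eliminating the shuttling, rather than speeding up the math, is where the big savings lie.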

Developers have sought to work around this issue by bringing the logic and memory physically closer together with “near-memory” and “in-memory” computing designs. Near-memory systems stack the logic and memory on top of one another in a 3D array (layered PB&J-style), while in-memory systems intersperse clusters of logic throughout the memory on a single chip, more like a peanut butter and banana sandwich.

The Twin Cities research team’s solution is a novel, fully digital, in-memory design, dubbed computational random-access memory (CRAM), wherein “logic is performed natively by the memory cells; the data for logic operations never has to leave the memory,” per the researchers. The team achieved this by integrating a reconfigurable spintronic compute substrate directly into the memory cell, an advance that the researchers found could reduce an AI operation’s energy consumption by an “order of 1,000x over a state-of-the-art solution.”

And that 1,000x improvement could just be the baseline. The research team tested CRAM on an MNIST handwritten digit classifier task and found it to be “2,500× and 1,700× less in energy and time, respectively, compared to a near-memory processing system at the 16 nm technology node.”
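Those two figures can be folded into a single energy-delay product (EDP), a common combined figure of merit for comparing compute designs. A quick check using only the numbers quoted above:

```python
# Combining the article's quoted MNIST results into an energy-delay product.
# Both input figures are from the article; the EDP framing is a standard
# metric, not a claim the researchers are quoted as making here.

ENERGY_GAIN = 2_500  # CRAM used 2,500x less energy than the baseline
TIME_GAIN = 1_700    # CRAM was 1,700x faster than the baseline

edp_improvement = ENERGY_GAIN * TIME_GAIN
print(f"energy-delay product improvement: {edp_improvement:,}x")
```

Multiplied together, the quoted gains amount to a greater-than-four-million-fold improvement in energy-delay product over the 16 nm near-memory baseline.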

The emerging AI industry is already facing significant resource issues. The ever faster, ever more powerful and capable GPUs that underpin AI software are immensely energy hungry. NVIDIA’s newest top-of-the-line Blackwell B200 consumes up to 1,200W, for example, and generates so much waste heat that it requires liquid cooling, another resource-intensive operation.

With hyperscalers like Google, Amazon, and Microsoft all scrambling to build out the physical infrastructure necessary to power the oncoming AI revolution — i.e., gigawatt-sized data centers, some with their own attached nuclear power plants — creating more energy-efficient compute and memory resources will become increasingly critical to the long-term viability of AI technology.
