New Chip Uses AI to Shrink Large Language Models’ Energy Footprint by 50%

Oregon State University College of Engineering researchers have developed a more efficient chip as an antidote to the vast amounts of electricity consumed by large-language-model artificial intelligence applications like Gemini and GPT-4.

“We have designed and fabricated a new chip that consumes half the energy compared to traditional designs,” said doctoral student Ramin Javadi, who along with Tejasvi Anand, associate professor of electrical engineering, presented the technology at the recent IEEE Custom Integrated Circuits Conference in Boston.

“The problem is that the energy required to transmit a single bit is not being reduced at the same rate as the data rate demand is increasing,” said Anand, who directs the Mixed Signal Circuits and Systems Lab at OSU. “That’s what is causing data centers to use so much power.”

Read More: Oregon State University

Ramin Javadi in lab (Photo Credit: Karl Maasdam)