Researchers Tackle AI’s Energy Problem with a Greener Fix
By Grace Stanley
Artificial intelligence is getting more powerful, but it is also racking up a massive energy bill. By some estimates, a single maximum-length ChatGPT query can consume about twice as much energy as an average U.S. home uses in one minute. Multiply that by billions of queries and the enormous training AI models require, and the energy impact is staggering.
As researchers race to find greener ways to power AI, a new study led by Tianyi Chen, associate professor of electrical and computer engineering at Cornell Tech, with collaborators from IBM and Rensselaer Polytechnic Institute, explores a promising solution: analog in-memory computing (AIMC) using analog chips.
Unlike traditional architectures, which constantly move data back and forth between memory and processors, AIMC stores and processes data in a single location. “This leverages physics to perform the math calculation instantly without moving the data, potentially slashing power consumption by 1,000 times and making the next generation of AI sustainable,” said Chen, who is also an associate professor at Cornell Engineering.
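The core idea above can be sketched in a few lines of code. This is an illustrative toy model, not the study's method: it assumes an AIMC device computes a matrix-vector product in place on stored conductances (the physics does the multiply-accumulate), at the cost of small analog noise, while a conventional digital pipeline fetches the weights and computes exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

def digital_mvm(weights, x):
    # Conventional compute: weights are moved from memory to the
    # processor, which then performs an exact multiply-accumulate.
    return weights @ x

def analog_mvm(weights, x, noise_std=0.01):
    # AIMC-style compute (toy model): the product is produced directly
    # where the weights are stored, so no data movement is needed, but
    # the analog result carries a small amount of noise.
    exact = weights @ x
    noise = rng.normal(0.0, noise_std * np.abs(exact).max(), exact.shape)
    return exact + noise

weights = rng.normal(size=(4, 8))  # hypothetical model weights
x = rng.normal(size=8)             # hypothetical input vector

exact = digital_mvm(weights, x)
approx = analog_mvm(weights, x)
print(np.max(np.abs(exact - approx)))  # small, noise-bounded error
```

The energy saving in real AIMC hardware comes from eliminating the memory-to-processor traffic that the `digital_mvm` path represents; the trade-off the toy model captures is that the analog result is approximate rather than bit-exact.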
Read more in the Cornell Chronicle.
Grace Stanley is a staff writer-editor for Cornell Tech.