Published on December 06, 2023
Brains & Bytes: UTSA's AI Mavericks Bag $4M to Spark Eco-Smart AI Revolution

Researchers at the University of Texas at San Antonio are pushing the envelope on artificial intelligence, armed with a hefty $4 million in federal grants aimed at creating AI that learns more like the human brain yet uses a fraction of the energy of current systems. The funds were awarded to two teams from UTSA's MATRIX AI Consortium for Human Well-Being by the National Science Foundation's Emerging Frontiers in Research and Innovation program. This crucial financial boost is set to advance their goal of developing energy-efficient, continuous-learning AI technology.

The undertaking, led by Dhireesha Kudithipudi, the McDermott endowed chair in UTSA’s electrical and computer engineering department and founding director of the MATRIX AI Consortium, involves collaboration with co-investigators including Itamar Lerner, an assistant professor in the university’s psychology department, among others spanning institutions nationwide. "The challenge really is translating these principles of biological intelligence into engineered learning systems," Kudithipudi told the Express-News, pinpointing the crux of their four-year project aimed at harnessing brain-like AI efficiency.

Tackling the astonishing energy demands of modern AI platforms, these UTSA projects set out to dramatically reduce the computational thirst of powerful AI systems. For perspective, training GPT-3, the model behind ChatGPT, reportedly consumed an estimated 1,287 megawatt-hours of electricity, a stark contrast to the human brain's modest 20 watts. Delving into what's known as the temporal scaffolding hypothesis, which suggests that the brain's replay of memories during sleep helps it recognize patterns, UTSA's researchers are determining whether similar principles can be applied to future AI systems to help them process patterns in both active and restful states.
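For a rough sense of that gap, the two figures quoted above can be turned into a back-of-the-envelope comparison. The short Python sketch below uses only the article's numbers; the variable names and the framing of the calculation are our own:

```python
# Back-of-the-envelope comparison using the figures cited in the article.
TRAINING_ENERGY_MWH = 1_287   # reported estimate for training GPT-3
BRAIN_POWER_W = 20            # typical human brain power draw

training_energy_wh = TRAINING_ENERGY_MWH * 1_000_000   # MWh -> Wh
hours = training_energy_wh / BRAIN_POWER_W             # runtime at brain power
years = hours / (24 * 365)

print(f"A 20 W brain could run for roughly {years:,.0f} years "
      f"on GPT-3's training energy budget")
```

By this crude arithmetic, the energy reportedly used to train the model once could power a human brain for several millennia, which is the scale of inefficiency the UTSA teams are targeting.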

This research pivot could redefine how AI interacts with and learns from real-world scenarios, a realm where it often falls short due to unfamiliarity or gaps in data. As Lerner has observed, the human brain's pattern recognition over time is key to its predictive prowess and minimal energy footprint. "This project is based on trying to take this basic mechanism and see if we can implement it in large neural networks in an energy-efficient way," Lerner highlighted during UTSA's research announcement. The team's practical approach will also include human sleep studies conducted at the university’s sleep lab.

Parallel to Kudithipudi's initiatives, Professor Fidel Santamaria of UTSA's neuroscience, developmental, and regenerative biology department is helming another $2 million project. Santamaria's group is putting a mathematical theory into practice, one that explains how neurons adapt based on their previous activity, with the aim of fashioning energy-saving electronic circuits. Humans, as Santamaria articulated, "are history dependent," with past experiences shaping how we process new information, a trait the team seeks to emulate in electronic form. "The objective of the project is to develop the theory, implement circuits that discover new capacitors with these history-dependent properties, and then test if we can have optimal computation and optimal energy consumption at the same time," he said, describing work that could reshape the future of computational hardware architecture.
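As a loose illustration of what "history dependent" processing means, the toy Python sketch below damps a unit's response according to a power-law-weighted memory of its own past activity. This is purely illustrative, assuming a made-up adaptation rule; it is not the team's actual theory or circuit design:

```python
def history_dependent_response(inputs, alpha=0.8):
    """Toy unit whose output adapts based on its own activity history.

    Older activity is weighted by a power-law kernel, so it still
    matters, but less than recent activity (illustrative only).
    """
    outputs, history = [], []
    for x in inputs:
        # weight k-th past output by (age)^(-alpha); most recent weight = 1
        weights = [(len(history) - k) ** (-alpha) for k in range(len(history))]
        memory = sum(w * h for w, h in zip(weights, history))
        y = x / (1.0 + memory)   # stronger past activity -> weaker response
        outputs.append(y)
        history.append(y)
    return outputs

resp = history_dependent_response([1.0] * 5)
# a repeated identical input draws a steadily weakening response: adaptation
```

Feeding the unit the same input five times yields a monotonically shrinking output, a simple analogue of the activity-dependent adaptation the quote describes.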

The MATRIX AI Consortium, founded in 2020, has been a cross-disciplinary beacon, unifying approximately 65 researchers across various fields from UTSA and partner institutions such as UT Health, Southwest Research Institute, and Texas Biomedical Research Institute. In a world increasingly reliant on AI, projects like these are essential for keeping the U.S. at the leading edge of technological development without sacrificing sustainable practices. The stakes are clear: as Santamaria warns, "There is not enough energy on the planet to train all the AI" that humans are developing, a reality that sets the stage for the team's ambitious quest to meld neuroscience insights with AI advancements.