
Power-Efficient AI Training
Optimizing Analog Computing for Large Language Models
This research explores how analog in-memory computing (AIMC) can dramatically reduce the environmental and economic costs of training large AI models.
- Investigates how the response functions of non-ideal resistive elements shape training dynamics (a minimal sketch follows this list)
- Analyzes the role of electrical conductance as the weight representation in hardware
- Demonstrates the potential for significant energy-efficiency gains in large-model training
- Provides engineering insights for optimizing hardware implementation
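To make the first two bullets concrete, here is a minimal sketch, not the paper's implementation, of an analog SGD loop on a scalar least-squares problem: the weight is stored as a conductance bounded by ±w_max, and each update pulse is scaled by a hypothetical saturating response function, so updates weaken as the device approaches its conductance limits. The function `response`, the bound `w_max`, and the scale `tau` are illustrative assumptions, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
w_star = 0.8   # target weight for the toy least-squares problem
w_max = 1.0    # conductance saturation bound (assumed)
tau = 0.3      # saturation scale of the response function (assumed)
lr = 0.1       # learning rate

def response(w, direction):
    """Hypothetical asymmetric response function: increment efficiency
    decays as w approaches +w_max, decrement efficiency as w approaches
    -w_max, modeling a non-ideal resistive element."""
    if direction > 0:
        return 1.0 - np.exp(-(w_max - w) / tau)
    return 1.0 - np.exp(-(w_max + w) / tau)

w_ideal, w_analog = 0.0, 0.0
for step in range(200):
    x = rng.normal()                          # random scalar input
    grad_ideal = (w_ideal - w_star) * x**2    # gradient of 0.5*(w*x - w_star*x)**2
    grad_analog = (w_analog - w_star) * x**2
    w_ideal -= lr * grad_ideal                # ideal digital SGD update
    delta = -lr * grad_analog
    # analog update: the pulse is attenuated by the device's response
    w_analog += delta * response(w_analog, np.sign(delta))

print(f"ideal SGD:  w = {w_ideal:.4f}")
print(f"analog SGD: w = {w_analog:.4f}")
```

Even in this toy setting, the saturating response attenuates updates near the conductance limits and biases the analog iterate relative to ideal digital SGD, illustrating the kind of training-dynamics effect the research investigates.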
As AI models continue to grow in size and complexity, hardware solutions like AIMC become increasingly critical for sustainable AI development and deployment in resource-constrained environments.
Analog In-memory Training on General Non-ideal Resistive Elements: The Impact of Response Functions