
Time-MoE: Scaling Up Time Series Forecasting
Leveraging Mixture-of-Experts for Billion-Scale Time Series Foundation Models
Time-MoE introduces a sparsely activated architecture for time series forecasting that scales to billion-parameter models while keeping inference cost close to that of a much smaller dense model.
- Combines a transformer backbone with sparse mixture-of-experts layers, so only a subset of parameters is activated per input token (see the sketch after this list)
- Enables unified pre-training across diverse time series domains
- Achieves stronger forecasting accuracy at lower computational cost than dense models of comparable size
- Designed for real-world engineering applications where accurate prediction is critical
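To make the core mechanism concrete, here is a minimal PyTorch sketch of a sparse mixture-of-experts feed-forward layer with top-k routing, the kind of block a model like Time-MoE uses in place of the dense feed-forward network inside each transformer layer. The class name and all hyperparameters (`TopKMoE`, `d_model`, `d_ff`, `num_experts`, `top_k`) are illustrative assumptions, not taken from the paper or its released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMoE(nn.Module):
    """Sparse mixture-of-experts feed-forward layer with top-k token routing.

    Illustrative sketch of the MoE idea; not the paper's implementation.
    """

    def __init__(self, d_model: int, d_ff: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # The router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts, bias=False)
        # Each expert is an independent position-wise feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        batch, seq_len, d_model = x.shape
        tokens = x.reshape(-1, d_model)                    # (N, d_model)
        # Select the top-k experts per token and renormalize their gate weights.
        gate_logits = self.router(tokens)                  # (N, num_experts)
        weights, indices = torch.topk(gate_logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)               # (N, top_k)
        out = torch.zeros_like(tokens)
        # Only the selected experts run, so per-token compute scales with
        # top_k rather than with the total parameter count.
        for e, expert in enumerate(self.experts):
            token_ids, slot = (indices == e).nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue
            gated = weights[token_ids, slot].unsqueeze(-1) * expert(tokens[token_ids])
            out.index_add_(0, token_ids, gated)
        return out.reshape(batch, seq_len, d_model)


# Example: route a batch of 64-step series embeddings through the MoE block.
layer = TopKMoE(d_model=256, d_ff=1024)
y = layer(torch.randn(4, 64, 256))  # -> (4, 64, 256)
```

Because each token activates only `top_k` of `num_experts` experts, the total parameter count can grow with the number of experts while per-token FLOPs stay roughly flat, which is what lets this style of model reach billions of parameters without a proportional increase in inference cost.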
This research represents a significant advancement for engineering applications requiring time series forecasting, from predictive maintenance to demand planning, by making large-scale foundation models practical for industrial deployment.