Toto (Time Series Optimized Transformer for Observability) is a state-of-the-art time-series foundation model designed for multi-variate time series forecasting, with an emphasis on observability metrics. Toto efficiently handles the high-dimensional, sparse, and non-stationary data commonly encountered in observability scenarios.
*Figure: Average rank of Toto versus the runner-up models on the GIFT-Eval and BOOM benchmarks (as of May 19, 2025).*
- **Zero-Shot Forecasting**: Forecast without fine-tuning on your specific time series.
- **High-Dimensional Multi-Variate Support**: Efficiently process many variables using Proportional Factorized Space-Time Attention.
- **Decoder-Only Transformer Architecture**: Supports variable prediction horizons and context lengths.
- **Probabilistic Predictions**: Generate both point forecasts and uncertainty estimates using a Student-T mixture model.
- **Extensive Pretraining on Large-Scale Data**: Trained on over 2 trillion time series data points, the largest pretraining dataset for any open-weights time series foundation model to date.
- **Tailored for Observability Metrics**: State-of-the-art performance on the GIFT-Eval and BOOM benchmarks.
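To make the probabilistic-predictions feature concrete: a Student-T mixture head emits, per future value, mixture weights plus per-component degrees of freedom, locations, and scales. The sketch below shows how such a mixture yields both a point forecast and an uncertainty band via sampling. It is a minimal illustration only; the parameter values are made up, not outputs of Toto.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative Student-T mixture parameters for ONE future time step.
# In Toto these are produced by the model head; values here are invented.
weights = np.array([0.6, 0.4])   # mixture weights (sum to 1)
dfs     = np.array([5.0, 20.0])  # degrees of freedom per component
locs    = np.array([1.0, 1.5])   # location per component
scales  = np.array([0.3, 0.8])   # scale per component

num_samples = 256

# Draw a component index per sample, then a Student-T variate from that
# component via a location-scale transform of a standard t variable.
comp = rng.choice(len(weights), size=num_samples, p=weights)
samples = locs[comp] + scales[comp] * rng.standard_t(dfs[comp])

point_forecast = np.median(samples)          # robust point estimate
lo, hi = np.quantile(samples, [0.05, 0.95])  # 90% uncertainty band
print(point_forecast, lo, hi)
```

The same sample-then-reduce pattern extends to full forecast trajectories by sampling each future step from its own mixture.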
- **Observability Metrics**: ~1 trillion points from Datadog internal systems (no customer data)
- **Public Datasets**:
  - GIFT-Eval Pretrain
  - Chronos datasets
- **Synthetic Data**: ~1/3 of the training data
For optimal speed and reduced memory usage, you should also install xFormers and flash-attention.
Here's how to quickly generate forecasts using Toto:
⚠️ In our study, we take the median across 256 samples to produce a point forecast. This tutorial previously used the mean but has now been updated.
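As a schematic illustration of that reduction step (not the Toto API; the array shapes and the synthetic samples below are invented for the example), taking the median across the sample axis of 256 sampled trajectories yields the point forecast, and quantiles along the same axis yield uncertainty bands:

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in forecast samples: 256 sampled trajectories over a 96-step
# horizon for 3 variates. In practice these would come from the model;
# here they are synthetic noise purely to show the reduction.
num_samples, horizon, n_variates = 256, 96, 3
samples = rng.normal(size=(num_samples, horizon, n_variates))

# Median across the sample axis -> point forecast (per step, per variate).
point_forecast = np.median(samples, axis=0)

# Quantiles across the same axis -> lower/upper uncertainty bands.
q10, q90 = np.quantile(samples, [0.1, 0.9], axis=0)

print(point_forecast.shape)  # one value per horizon step and variate
```

Using the median rather than the mean makes the point forecast robust to the heavy tails of the Student-T mixture samples.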
For detailed inference instructions, refer to the inference tutorial notebook.
| Checkpoint | Parameters | Config | Size | Notes |
|------------|------------|--------|------|-------|
| Toto-Open-Base-1.0 | 151M | Config | 605 MB | Initial release with SOTA performance |
- Research Paper
- GitHub Repository
- Blog Post
- BOOM Dataset
📝 **Citation**

If you use Toto in your research or applications, please cite us using the following: