Introduction

In the last few years, the field of time-series forecasting has seen a fundamental shift. Where we once depended solely on classic statistical methods such as ARIMA, SARIMA, and Prophet, new "foundation" models have emerged, promising to bring the power and flexibility of large language models (LLMs) to time-series data. The allure is obvious: can we build a single, reusable forecasting model that works across a variety of datasets and domains, instead of painstakingly training a new model for every scenario?

Parseable is built to handle our users' observability data at any scale: a nonstop stream of raw ingest counts, infrastructure vitals, and fine-grained application signals. Running a separate, hand-tuned forecasting model for every slice quickly turns into a treadmill: each new stream or workload tweak demands fresh hyperparameters, retraining, and ever-growing config sprawl. All that manual churn slows forecasting down and breeds drift, so the results never feel fully trustworthy.

Then came the rise of foundation models, which revolutionised natural language processing with strong zero-shot and transfer-learning capabilities. Researchers began asking a natural question: if LLMs can generalise to new tasks with minimal retraining, could similar techniques apply to time-series data? What if you could hand any telemetry stream to a pre-trained foundation model and immediately get a high-quality forecast, regardless of whether the model had seen data from that source before?

Motivated by this possibility, we set out to benchmark a new generation of time-series foundation models: Amazon Chronos, Google TimesFM, IBM Tiny Time-Mixers, and Datadog Toto. Our goal was to assess how well these models perform on two representative tasks: a relatively straightforward forecasting problem (predicting ingestion volumes) and a more complex multivariate problem (forecasting multiple pod-level metrics).
Along the way, we compared them to classical...