Timer (Large Time Series Model)

This repo provides the official code and checkpoints for the paper Timer: Generative Pre-trained Transformers Are Large Time Series Models.

Updates

:triangular_flag_on_post: News (2024.6) The pre-training dataset (UTSD) is available on HuggingFace!

:triangular_flag_on_post: News (2024.5) Accepted by ICML 2024; the 31-page camera-ready version is available.

:triangular_flag_on_post: News (2024.4) The pre-training scale has been extended, enabling zero-shot forecasting.

:triangular_flag_on_post: News (2024.2) Model checkpoints and code for adaptation are released.

Introduction

Time Series Transformer (Timer) is a Generative Pre-trained Transformer for general time series analysis. You can visit our Homepage for a more detailed introduction.

Datasets

We curate the Unified Time Series Dataset (UTSD), comprising 1 billion time points across 4 volumes, to facilitate research on large time series models and pre-training.

The dataset is released on HuggingFace to support research on large models and pre-training in the time series field.
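
As a minimal sketch, UTSD can be pulled with the HuggingFace `datasets` library; the repo id `thuml/UTSD` and the split name below are assumptions, so consult the dataset card for the exact identifier and configurations.

```python
# Minimal sketch of loading UTSD via the HuggingFace `datasets` library.
# The repo id "thuml/UTSD" and the split name are assumptions -- consult
# the dataset card on HuggingFace for the exact identifier and configs.
from datasets import load_dataset

utsd = load_dataset("thuml/UTSD", split="train")  # hypothetical repo id
print(utsd)      # inspect the available fields
print(utsd[0])   # one record, e.g. a single univariate series
```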

Tasks

Forecasting: We provide all scripts as well as datasets for few-shot forecasting in this repo.

Imputation: We propose segment-level imputation, which is more challenging than point-level imputation (see the sketch after this list).

Anomaly Detection: We provide new benchmarks for predictive anomaly detection on the UCR Anomaly Archive.
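
To make the imputation distinction concrete, here is a minimal PyTorch sketch of the two masking schemes; the function names, segment length, and masking ratio are illustrative, not the repository's API.

```python
# Minimal sketch: point-level vs. segment-level masking for imputation.
# Names, the segment length, and the ratio are illustrative assumptions.
import torch

def point_level_mask(x: torch.Tensor, ratio: float = 0.25) -> torch.Tensor:
    # Each time point is masked independently (True = missing), so gaps
    # can often be filled by interpolating immediate neighbors.
    return torch.rand(x.shape[-1]) < ratio

def segment_level_mask(x: torch.Tensor, seg_len: int = 24,
                       ratio: float = 0.25) -> torch.Tensor:
    # Whole contiguous segments are masked, so the model must generate
    # longer-range structure rather than interpolate between neighbors.
    mask = torch.zeros(x.shape[-1], dtype=torch.bool)
    for start in range(0, x.shape[-1], seg_len):
        if torch.rand(1).item() < ratio:
            mask[start:start + seg_len] = True
    return mask

x = torch.randn(96)                               # toy univariate series
print(point_level_mask(x).nonzero().flatten())    # scattered indices
print(segment_level_mask(x).nonzero().flatten())  # contiguous runs
```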

Code for Fine-tuning

  1. Install PyTorch and the necessary dependencies.

pip install -r requirements.txt

  2. Put the datasets from Google Drive and Tsinghua Cloud under the folder ./dataset/.

  3. Put the checkpoint from Google Drive and Tsinghua Cloud under the folder ./checkpoints/.

  4. Train and evaluate the model. We provide scripts for the above tasks under the folder ./scripts/.

# forecasting
bash ./scripts/forecast/ECL.sh

# segment-level imputation
bash ./scripts/imputation/ECL.sh

# anomaly detection
bash ./scripts/anomaly_detection/UCR.sh

We also provide a detailed README for each task in the folder ./scripts/.

Approach

Pre-training and Adaptation

To pre-train on heterogeneous time series, we propose the single-series sequence (S3) format, which preserves the variations of each series within a unified context length. Further, we convert forecasting, imputation, and anomaly detection into a unified generative task.
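
A minimal sketch of the S3 idea follows; the function name and the context length of 672 are illustrative assumptions. Each series is normalized and sliced into windows of one unified context length, so heterogeneous datasets become a single pool of uniform training sequences.

```python
# Minimal sketch of building single-series sequences (S3). The function
# name and the context length 672 are illustrative assumptions.
import torch

def to_s3(series: torch.Tensor, context_len: int = 672) -> torch.Tensor:
    # Normalize per series, then slice into non-overlapping windows of a
    # unified context length; each window keeps the series' variations.
    series = (series - series.mean()) / (series.std() + 1e-8)
    n = series.shape[0] // context_len
    return series[: n * context_len].reshape(n, context_len)

pool = [torch.randn(5000), torch.randn(1300)]   # heterogeneous lengths
windows = torch.cat([to_s3(s) for s in pool])   # one uniform training pool
print(windows.shape)                            # torch.Size([8, 672])
```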

Model Architecture

Given the limited exploration of backbones for large time series models, we extensively evaluate candidate backbones and adopt a decoder-only Transformer with autoregressive generation for large time series models (LTSMs).
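
The sketch below illustrates the shape of such a backbone; the patch length, model width, and layer count are illustrative, not the released configuration. Each patch of consecutive time points is embedded as one token, and causal self-attention lets position t predict patch t+1.

```python
# Minimal sketch of a decoder-only Transformer over time-series patches.
# All hyperparameters here are illustrative, not the released config.
import torch
import torch.nn as nn

class DecoderOnlyTS(nn.Module):
    def __init__(self, patch_len=96, d_model=256, n_heads=8, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(patch_len, d_model)   # patch -> token
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)    # token -> next patch

    def forward(self, x):            # x: (batch, n_patches, patch_len)
        causal = nn.Transformer.generate_square_subsequent_mask(x.shape[1])
        h = self.blocks(self.embed(x), mask=causal)  # causal self-attention
        return self.head(h)          # position t predicts patch t+1

model = DecoderOnlyTS()
tokens = torch.randn(2, 7, 96)       # 2 series, 7 patches of 96 points
pred = model(tokens)                 # (2, 7, 96)
# Training would regress pred[:, :-1] onto tokens[:, 1:] (next-patch loss).
```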

Performance

Timer achieves state-of-the-art performance on each task, and we present the benefit of pre-training in few-shot scenarios.

Scalability

By increasing the parameter count and pre-training scale, Timer achieves notable performance improvements: 0.231 → 0.138 (a 40.3% reduction), surpassing previous state-of-the-art deep forecasters.

Flexible Sequence Length

The decoder-only architecture provides the flexibility to accommodate time series of different lookback and forecast lengths.
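
Concretely, a forecast of any length can be produced by autoregressive rollout over patches, regardless of the lookback length supplied; the helper below is hypothetical, written against the toy model sketched above.

```python
# Hypothetical rollout helper: extend a patch context by any horizon.
import torch

def rollout(model, context: torch.Tensor, horizon_patches: int) -> torch.Tensor:
    # context: (batch, n_patches, patch_len) -- any lookback length works.
    seq = context
    with torch.no_grad():
        for _ in range(horizon_patches):
            next_patch = model(seq)[:, -1:, :]  # predicted next patch
            seq = torch.cat([seq, next_patch], dim=1)
    return seq[:, context.shape[1]:, :]         # only the generated horizon

# With the toy DecoderOnlyTS sketch above:
# forecast = rollout(model, tokens, horizon_patches=4)   # -> (2, 4, 96)
```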

Showcases

Forecasting under data scarcity

Imputation with few-shot samples

Anomaly detection on UCR Anomaly Archive

Future Work

We are preparing an online service for zero-shot forecasting. Please stay tuned for updates!

Citation

If you find this repo helpful, please cite our paper.

@article{liu2024timer,
 title={Timer: Transformers for Time Series Analysis at Scale},
 author={Liu, Yong and Zhang, Haoran and Li, Chenyu and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
 journal={arXiv preprint arXiv:2402.02368},
 year={2024} 
}

Acknowledgement

We appreciate the following GitHub repo for its valuable code and efforts.

  • Time-Series-Library (https://github.com/thuml/Time-Series-Library)

Contact

If you have any questions or want to use the code, feel free to contact:

  • Yong Liu (liuyong21@mails.tsinghua.edu.cn)
  • Haoran Zhang (z-hr20@mails.tsinghua.edu.cn)