Anonymous user · 2024-07-31

Technical Information

Open-source address:
https://modelscope.cn/models/better464/Timer

Project Details

Timer (Large Time Series Model)

This repo provides official code and checkpoints for Timer: Generative Pre-trained Transformers Are Large Time Series Models.

Updates

:triangular_flag_on_post: News (2024.6) Pre-training dataset (UTSD) is available in HuggingFace!

:triangular_flag_on_post: News (2024.5) Accepted by ICML 2024, a camera-ready version of 31 pages.

:triangular_flag_on_post: News (2024.4) The pre-training scale has been extended, enabling zero-shot forecasting.

:triangular_flag_on_post: News (2024.2) Releasing model checkpoints and code for adaptation.

Introduction

Time Series Transformer (Timer) is a Generative Pre-trained Transformer for general time series analysis. You can visit our Homepage for a more detailed introduction.

Datasets

We curate Unified Time Series Datasets (UTSD), comprised of 1B time points and 4 volumes, to facilitate research on large time series models and pre-training.

Our dataset is released on HuggingFace to facilitate research on large models and pre-training in the field of time series.

Tasks

Forecasting: We provide all scripts as well as datasets for few-shot forecasting in this repo.

Imputation: We propose segment-level imputation, which is more challenging than point-level imputation.

Anomaly Detection: We provide new benchmarks of predictive anomaly detection on the UCR Anomaly Archive.
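To make the segment-level vs. point-level distinction concrete, here is an illustrative sketch (not the repo's actual masking code) of the two masking schemes. With the same number of hidden points, a segment mask removes them contiguously, so the model cannot simply interpolate from immediate neighbours:

```python
import numpy as np

def segment_mask(length, segment_len, num_segments, rng):
    """Segment-level masking: hide whole contiguous segments."""
    mask = np.ones(length, dtype=bool)  # True = observed
    starts = rng.choice(length - segment_len, size=num_segments, replace=False)
    for s in starts:
        mask[s:s + segment_len] = False  # hide an entire segment
    return mask

def point_mask(length, num_points, rng):
    """Point-level masking: hide isolated time points."""
    mask = np.ones(length, dtype=bool)
    idx = rng.choice(length, size=num_points, replace=False)
    mask[idx] = False
    return mask

rng = np.random.default_rng(0)
seg = segment_mask(96, segment_len=24, num_segments=1, rng=rng)
pts = point_mask(96, num_points=24, rng=rng)
# Both hide 24 of 96 points, but only `seg` removes them as one block.
```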

Code for Fine-tuning

  1. Install PyTorch and necessary dependencies.

pip install -r requirements.txt

  2. Put the datasets from Google Drive and Tsinghua Cloud under the folder ./dataset/.

  3. Put the checkpoints from Google Drive and Tsinghua Cloud under the folder ./checkpoints/.

  4. Train and evaluate the model. We provide the above tasks under the folder ./scripts/.

# forecasting
bash ./scripts/forecast/ECL.sh

# segment-level imputation
bash ./scripts/imputation/ECL.sh

# anomaly detection
bash ./scripts/anomaly_detection/UCR.sh

We also provide detailed README files for each task in the folder ./scripts/.

Approach

Pre-training and Adaptation

To pre-train on heterogeneous time series, we propose single-series sequence (S3), reserving series variations with the unified context length. Further, we convert forecasting, imputation, and anomaly detection into a unified generative task.
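A minimal sketch of the S3 idea, under assumptions (per-series normalization, non-overlapping windows — the paper's exact recipe may differ): each variate of each heterogeneous dataset is treated as an independent univariate series and cut into samples of one unified context length.

```python
import numpy as np

def to_s3(series_list, context_len):
    """Flatten heterogeneous (multi)variate series into fixed-length
    single-series samples, keeping each variate's own variations."""
    samples = []
    for series in series_list:              # series: (time, variates)
        for v in range(series.shape[1]):    # treat every variate independently
            x = series[:, v].astype(float)
            # normalise per series so scales are comparable across datasets
            x = (x - x.mean()) / (x.std() + 1e-8)
            # cut into non-overlapping windows of the unified context length
            n = len(x) // context_len
            for i in range(n):
                samples.append(x[i * context_len:(i + 1) * context_len])
    return np.stack(samples)

a = np.random.randn(200, 3)   # one dataset: 200 steps, 3 variates
b = np.random.randn(150, 1)   # another: 150 steps, 1 variate
batch = to_s3([a, b], context_len=50)
# a yields 4 windows x 3 variates = 12 samples; b yields 3 samples
```

The point of the format is that downstream, every sample has the same shape regardless of which dataset or variate it came from, so one generative model can be pre-trained over all of them.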

Model Architecture

Given the limited exploration of the backbone for large time series models, we extensively evaluate candidate backbones and adopt the decoder-only Transformer with autoregressive generation towards LTSMs.
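The key property of the decoder-only backbone is the causal attention mask: token t can only attend to tokens up to t, which is what makes next-token (here, next-patch) autoregressive generation possible. A stripped-down single-head sketch (no learned Q/K/V projections, purely illustrative):

```python
import numpy as np

def causal_attention(x):
    """Single-head causal self-attention over a (T, d) token sequence.
    Position t only attends to positions <= t."""
    T, d = x.shape
    scores = x @ x.T / np.sqrt(d)                      # (T, T) similarities
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf                             # block the future
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)                 # row-wise softmax
    return w @ x

tokens = np.random.default_rng(0).normal(size=(8, 16))  # 8 patch tokens
out = causal_attention(tokens)

# Causality check: perturbing the last token must not change earlier outputs.
tokens2 = tokens.copy()
tokens2[-1] += 1.0
out2 = causal_attention(tokens2)
```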

Performance

Timer achieves state-of-the-art performance in each task, and we present the pre-training benefit on few-shot scenarios.

Scalability

By increasing the parameters and pre-training scale, Timer achieves notable performance improvement: 0.231 $\to$ 0.138 (−40.3%), surpassing the previous state-of-the-art deep forecasters.
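The quoted figure is the relative change in forecast error, which can be checked directly:

```python
before, after = 0.231, 0.138
change = (after - before) / before   # relative change in error
print(f"{change:+.1%}")              # -40.3%
```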

Flexible Sequence Length

The decoder-only architecture provides the flexibility to accommodate time series of different lookback and forecast lengths.
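This flexibility comes from autoregressive rollout: the model repeatedly predicts the next patch and appends it to its own input, so any forecast horizon is reachable. A sketch with a toy stand-in for the model (the real Timer predicts the next patch from the full context):

```python
import numpy as np

def rollout(model, history, horizon, patch_len):
    """Autoregressive forecasting: predict one patch at a time and feed it
    back, so the forecast length is decoupled from the lookback length."""
    seq = list(history)
    steps = -(-horizon // patch_len)          # ceil division: patches needed
    for _ in range(steps):
        next_patch = model(np.asarray(seq))   # predict patch_len new points
        seq.extend(next_patch)
    return np.asarray(seq[len(history):len(history) + horizon])

# toy "model": repeat the mean of the last patch (stand-in for Timer)
toy = lambda s: np.full(4, s[-4:].mean())
forecast = rollout(toy, history=np.ones(16), horizon=10, patch_len=4)
# forecast has exactly `horizon` points regardless of patch granularity
```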

Showcases

Forecasting under data scarcity

Imputation with few-shot samples

Anomaly detection on UCR Anomaly Archive

Future Work

We are preparing to provide the online service for zero-shot forecasting. Please stay tuned for the update!

Citation

If you find this repo helpful, please cite our paper.

@article{liu2024timer,
 title={Timer: Transformers for Time Series Analysis at Scale},
 author={Liu, Yong and Zhang, Haoran and Li, Chenyu and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
 journal={arXiv preprint arXiv:2402.02368},
 year={2024}
}

Acknowledgement

We appreciate the following GitHub repos a lot for their valuable code and efforts.

  • Time-Series-Library (https://github.com/thuml/Time-Series-Library)

Contact

If you have any questions or want to use the code, feel free to contact:

  • Yong Liu (liuyong21@mails.tsinghua.edu.cn)
  • Haoran Zhang (z-hr20@mails.tsinghua.edu.cn)
