BasicTS (Basic Time Series) is a PyTorch-based benchmark and toolbox for time series forecasting (TSF).
On the one hand, BasicTS uses a unified, standard pipeline to provide fair and exhaustive reproduction and comparison of popular deep-learning-based models.
On the other hand, BasicTS provides users with easy-to-use and extensible interfaces to facilitate the quick design and evaluation of new models. At a minimum, users only need to define the model architecture.
## Highlighted Features
BasicTS is developed on top of EasyTorch [1], an easy-to-use and powerful open-source neural network training framework.
### Fair Performance Review
Users can compare the performance of different models on arbitrary datasets fairly and exhaustively based on a unified and comprehensive pipeline.
### Developing with BasicTS
- **Minimum Code**: Users only need to implement key code such as the model architecture and data pre/post-processing to build their own deep learning projects.
- **Everything Based on Config**: Users can control all details of the pipeline through a config file, such as the hyperparameters of dataloaders, optimization, and other tricks (*e.g.*, curriculum learning).
- **Support All Devices**: BasicTS supports CPU, GPU, and distributed GPU training (both single-node multi-GPU and multi-node) thanks to using EasyTorch as the backend. Users can switch devices by setting parameters, without modifying any code.
- **Save Training Log**: BasicTS supports the `logging` system and `Tensorboard`, encapsulated behind a unified interface; users can save customized training logs by calling simple interfaces.

## Built-in Datasets and Baselines
### Datasets
BasicTS supports a variety of datasets, including spatial-temporal forecasting datasets, long time-series forecasting datasets, and large-scale datasets, e.g.:
- METR-LA, PEMS-BAY, PEMS03, PEMS04, PEMS07, PEMS08
- ETTh1, ETTh2, ETTm1, ETTm2, Electricity, Exchange Rate, Weather, Beijing Air Quality
- SD, GLA, GBA, CA
- ...
### Baselines
BasicTS provides a wealth of built-in models, including both spatial-temporal forecasting models and long time-series forecasting models, e.g.,
- DCRNN, Graph WaveNet, MTGNN, STID, D2STGNN, STEP, DGCRN, STNorm, AGCRN, GTS, StemGNN, MegaCRN, STGCN, ...
- Informer, Autoformer, FEDformer, Pyraformer, DLinear, NLinear, Triformer, Crossformer, ...
## Dependencies

### Preliminaries

#### OS

We recommend using BasicTS on Linux systems (e.g., Ubuntu and CentOS). Other systems (e.g., Windows and macOS) have not been tested.

#### Python

Python >= 3.6 (>= 3.9 recommended).

Miniconda or Anaconda is recommended for creating a virtual Python environment.

### Other Dependencies

BasicTS is built on PyTorch and EasyTorch. You can install PyTorch following the official PyTorch instructions. For example:

```bash
pip install torch==1.10.0+cu111 torchvision==0.11.0+cu111 torchaudio==0.10.0 -f https://download.pytorch.org/whl/torch_stable.html
```
After ensuring that PyTorch is installed correctly, you can install the other dependencies via:

```bash
pip install -r requirements.txt
```
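Before proceeding, you may want to confirm that PyTorch is visible from your Python environment. A minimal, dependency-free check (this helper is illustrative, not part of BasicTS):

```python
import importlib.util

def pytorch_installed() -> bool:
    """Return True if the `torch` package can be found in this environment."""
    return importlib.util.find_spec("torch") is not None

print(pytorch_installed())
```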
**Warning**: BasicTS was developed with PyTorch 1.9.1 and 1.10.0; other versions have not been tested.
## Getting Started: Developing with BasicTS
### Preparing Data
1. **Clone BasicTS**

   ```bash
   cd /path/to/your/project
   git clone https://github.com/zezhishao/BasicTS.git
   ```

2. **Download Raw Data**

   You can download all the raw datasets at Google Drive or Baidu Yun (password: 6v0a), and unzip them to `datasets/raw_data/`.

3. **Pre-process Data**

   ```bash
   cd /path/to/your/project
   python scripts/data_preparation/${DATASET_NAME}/generate_training_data.py
   ```

   Replace `${DATASET_NAME}` with one of `METR-LA`, `PEMS-BAY`, `PEMS03`, `PEMS04`, `PEMS07`, `PEMS08`, or any other supported dataset. The processed data will be placed in `datasets/${DATASET_NAME}`.
### 3 Steps to Evaluate Your Model
1. **Define Your Model Architecture**

   The `forward` function needs to follow the conventions of BasicTS. You can find an example of the Multi-Layer Perceptron (`MLP`) model in `baselines/MLP/mlp_arch.py`.
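As an illustration, here is a minimal sketch of what such a model might look like. The `forward` signature and tensor shapes below are assumptions based on the description above, not the authoritative convention; consult `baselines/MLP/mlp_arch.py` for the real interface.

```python
import torch
from torch import nn

class MLP(nn.Module):
    """Sketch of a BasicTS-style model (interface assumed, not authoritative).

    Assumption: the model receives a history tensor of shape
    [batch, input_len, num_nodes, channels] and returns a prediction of
    shape [batch, output_len, num_nodes, 1].
    """

    def __init__(self, input_len: int, output_len: int, hidden_dim: int = 64):
        super().__init__()
        self.fc1 = nn.Linear(input_len, hidden_dim)
        self.act = nn.ReLU()
        self.fc2 = nn.Linear(hidden_dim, output_len)

    def forward(self, history_data: torch.Tensor, future_data: torch.Tensor,
                batch_seen: int, epoch: int, train: bool, **kwargs) -> torch.Tensor:
        # Keep only the first channel and regress over the time dimension.
        x = history_data[..., 0].transpose(1, 2)         # [B, N, input_len]
        prediction = self.fc2(self.act(self.fc1(x)))     # [B, N, output_len]
        return prediction.transpose(1, 2).unsqueeze(-1)  # [B, output_len, N, 1]
```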
2. **Define Your Runner for Your Model (Optional)**

   BasicTS provides a unified and standard pipeline in `basicts.runner.BaseTimeSeriesForecastingRunner`. Nevertheless, you still need to define the specific forward process (the `forward` function in the runner). Fortunately, BasicTS also provides such an implementation in `basicts.runner.SimpleTimeSeriesForecastingRunner`, which covers most situations. The runner for the `MLP` model can also use this built-in runner. You can find more runners in `basicts.runners.runner_zoo` to learn more about the runner design.
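The division of labor between a base runner and a simple runner can be sketched with a toy, pure-Python stand-in (the class and method names below are illustrative only, not the actual `basicts.runners` API): the base class owns the training iteration, while a subclass supplies only the model-specific forward process.

```python
class BaseRunner:
    """Toy stand-in for a BasicTS-style base runner: it drives the
    training iteration and delegates the batch-to-prediction mapping."""

    def __init__(self, model):
        self.model = model

    def train_iter(self, batch):
        # The base runner computes the loss; subclasses define `forward`.
        prediction, target = self.forward(batch)
        return sum((p - t) ** 2 for p, t in zip(prediction, target)) / len(target)

    def forward(self, batch):
        raise NotImplementedError


class SimpleRunner(BaseRunner):
    """Covers the common case: feed history to the model, compare to future."""

    def forward(self, batch):
        history, future = batch
        return self.model(history), future
```

A perfect model on a toy batch yields zero loss: `SimpleRunner(lambda h: [x * 2 for x in h]).train_iter(([1.0, 2.0], [2.0, 4.0]))` returns `0.0`.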
3. **Configure Your Configuration File**

   You can configure all details of the pipeline and all hyperparameters in a configuration file, i.e., everything is based on config. The configuration file is a `.py` file in which you can import your model and runner and set all the options. BasicTS uses `EasyDict` as a parameter container, which is extensible and flexible to use. An example of the configuration file for the `MLP` model on the `METR-LA` dataset can be found in `baselines/MLP/MLP_METR-LA.py`.
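For orientation, such a configuration file might look roughly like the sketch below. All option names here are illustrative placeholders, not the keys BasicTS actually expects; see `baselines/MLP/MLP_METR-LA.py` for the real layout.

```python
# Hypothetical config sketch -- key names are placeholders, not the real schema.
from easydict import EasyDict

CFG = EasyDict()

CFG.MODEL = EasyDict()
CFG.MODEL.NAME = "MLP"
CFG.MODEL.PARAM = {"input_len": 12, "output_len": 12, "hidden_dim": 64}

CFG.TRAIN = EasyDict()
CFG.TRAIN.NUM_EPOCHS = 100
CFG.TRAIN.OPTIM = EasyDict({"TYPE": "Adam", "PARAM": {"lr": 0.002}})
```

Because `EasyDict` supports attribute access, nested options can be read as `CFG.TRAIN.OPTIM.TYPE` rather than chained dictionary lookups.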
### Run It!
An example of a start script can be found in `experiments/train.py`. You can run your model with the following command:

```bash
python experiments/train.py -c /path/to/your/config/file.py --gpus '0'
```
## Examples
### Reproducing Built-in Models
BasicTS provides a wealth of built-in models. You can find all the built-in models and their corresponding runners in `basicts/archs/arch_zoo` and `basicts/runners/runner_zoo`, respectively. You can reproduce these models by running the following command:

```bash
python experiments/train.py -c baselines/${MODEL_NAME}/${MODEL_NAME}_${DATASET_NAME}.py --gpus '0'
```

Replace `${MODEL_NAME}` and `${DATASET_NAME}` with any supported model and dataset. For example, you can run Graph WaveNet on the METR-LA dataset with:

```bash
python experiments/train.py -c baselines/GWNet/GWNet_METR-LA.py --gpus '0'
```
### Customize Your Own Model
- Multi-Layer Perceptron (MLP)
- More...
## Main Results

### Spatial-Temporal Forecasting

### Long Time-Series Forecasting
## Checkpoints
Uploading...
## Acknowledgement
BasicTS is developed on top of EasyTorch [1], an easy-to-use and powerful open-source neural network training framework.
## References
- [1] Yuhao Wang. EasyTorch. https://github.com/cnstark/easytorch, 2020.