salesforce / etsformer

PyTorch code for ETSformer: Exponential Smoothing Transformers for Time-series Forecasting

License: BSD 3-Clause "New" or "Revised" License

Languages: Python 81.01%, Shell 18.99%
Topics: deep-learning, interpretable-machine-learning, time-series, time-series-decomposition, time-series-forecasting, transformers, exponential-smoothing, forecasting, pytorch

etsformer's Introduction

ETSformer: Exponential Smoothing Transformers for Time-series Forecasting



Figure 1. Overall ETSformer Architecture.

Official PyTorch code repository for the ETSformer paper. Check out our blog post!

  • ETSformer is a novel time-series Transformer architecture which exploits the principle of exponential smoothing to improve Transformers for time-series forecasting.
  • ETSformer is inspired by the classical exponential smoothing methods in time-series forecasting, leveraging the novel exponential smoothing attention (ESA) and frequency attention (FA) to replace the self-attention mechanism in vanilla Transformers, thus improving both accuracy and efficiency (a minimal illustrative sketch of the exponential decay follows below).
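For intuition, here is a minimal, self-contained sketch of the exponential decay that exponential smoothing attention builds on: each output is a weighted combination of current and past values with weights alpha * (1 - alpha)^lag. This is only an illustration of the principle (the initial-state term and the learned, per-head alpha of the actual model are omitted), not the repository's implementation.

import torch

def exponential_smoothing_weights(seq_len: int, alpha: float) -> torch.Tensor:
    # Lower-triangular matrix A with A[t, j] = alpha * (1 - alpha) ** (t - j) for j <= t.
    t = torch.arange(seq_len)
    lag = (t.unsqueeze(1) - t.unsqueeze(0)).clamp(min=0)  # lag[t, j] = max(t - j, 0)
    weights = alpha * (1 - alpha) ** lag                  # exponential decay with lag
    return weights.tril()                                 # zero out future positions (j > t)

# Example: smooth a toy sequence of shape (seq_len, dim)
x = torch.randn(24, 3)
A = exponential_smoothing_weights(24, alpha=0.3)
smoothed = A @ x  # each step is an exponentially weighted sum of current and past values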

Requirements

  1. Install Python 3.8.
  2. Install the required dependencies: pip install -r requirements.txt

Data

  • Pre-processed datasets can be downloaded from Tsinghua Cloud or Google Drive, as provided by Autoformer's GitHub repository.
  • Place the downloaded datasets into the dataset/ folder, e.g. dataset/ETT-small/ETTm2.csv.

Usage

  1. Install the required dependencies.
  2. Download data as above, and place them in the folder, dataset/.
  3. Train the model. We provide the experiment scripts for all benchmarks under the folder ./scripts, e.g. ./scripts/ETTm2.sh. You might have to change permissions on the script files by running chmod u+x scripts/*.
  4. The script for grid search is also provided, and can be run by ./grid_search.sh.

Acknowledgements

The implementation of ETSformer relies on resources from the following codebases and repositories; we thank the original authors for open-sourcing their work.

Citation

Please consider citing our work if you find this code useful for your research.

@article{woo2022etsformer,
    title={ETSformer: Exponential Smoothing Transformers for Time-series Forecasting},
    author={Gerald Woo and Chenghao Liu and Doyen Sahoo and Akshat Kumar and Steven C. H. Hoi},
    year={2022},
    url={https://arxiv.org/abs/2202.01381},
}

etsformer's People

Contributors

gorold

etsformer's Issues

Why is the value of true.npy not the same as the original value?

Hello, thank you very much for your work. I have a few unclear points about the code:
(1) Why is the value of true.npy saved in exp_main.py not the same as the corresponding original value?
(2) How and where do I print the training set and test set?

Multivariate input to predict univariate variables

Hi,

I was wondering if it is possible to use your model with a multivariate input while predicting a univariate variable. If not, do you know what code I should change to make it work? As you are using some of the code from Informer, I was thinking about using the 'MS' features parameter, but this gives the following error in the encoder on line 109:

level = level.view(b, t, self.c_out, 1)
RuntimeError: shape '[32, 192, 1, 1]' is invalid for input of size 36864
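A quick check of the numbers in this error (the channel count below is inferred from the reported size, not stated anywhere): 36864 = 32 × 192 × 6, so the tensor being reshaped apparently still carries 6 channels while c_out is 1.

# Sanity check on the reported sizes (the channel count 6 is inferred, not given):
b, t, c_out = 32, 192, 1     # target shape in level.view(b, t, self.c_out, 1)
numel = 36864                # element count reported by the RuntimeError
print(numel // (b * t))      # -> 6: the encoder output still holds all input channels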

Now I could reshape this level variable so it would be consistent with my data, but I don't know if your model is capable of handling that. Please let me know what you think.

Thanks for your time and contribution,

Rico

GitHub code seems to predict flat lines only

Thanks for sharing, this series of time-series models is really interesting. I especially like deeptime, which works well for me (and I've tried adding multivariate, past-only inputs).

I particularly like the fact that you've tested on challenging multivariate weather and financial data. Many time-series papers skip these difficult domains in favor of trivial problems. That's why I <3 Deeptime and the *Former papers.

I do have a question. I can't get ETSformer to work. It seems like the current code mainly just predicts the level; perhaps there is a bug in the uploaded code?

To replicate this, I used a notebook with no substantial modifications. You can see it's not predicting nice smooth ARIMA-like lines as in the paper; instead it seems to be all level with a tiny bit of growth in the first few steps. This happens at multiple learning rates and with multiple datasets.

Am I missing something? Any ideas why this might be?

https://github.com/wassname/ETSformer/blob/w_notebook/notebook/run.ipynb


How to adapt framework for custom data?

Hi team,
great work first of all. I would like to use your framework in the domain of sports. I have time-series data of players with respect to different parameters. How would I use this data with the current model? What input format does the data have to be in? Is a dataframe grouped by player and his parameters sufficient? Can the model be trained on the data of an entire team and then predict a parameter of a given player?
Thanks a lot in advance!

Please upload Example Notebook

Hi gorold, I went through your repo and I appreciate your work on time-series forecasting. Please upload a sample notebook showing how to use ETSformer. It will help learners like me a lot.

Also, please confirm whether ETSformer can be run on a PC/laptop if the forecasting horizon is less than 7 steps.

Thanks in advance! Awaiting your early reply.

Holidays and other flags

How does it handle holidays or other flags that might drive the data? They are not always visible to the Fourier transformation.

Error while running ETTm2.sh

Hi, I'm trying to run the ETTm2.sh script, but this is what I get. I've put the ETTm2.csv file into dataset/ETT-small.

Traceback (most recent call last):
  File "C:\python3\lib\site-packages\einops\einops.py", line 410, in reduce
    return _apply_recipe(recipe, tensor, reduction_type=reduction)
  File "C:\python3\lib\site-packages\einops\einops.py", line 233, in _apply_recipe
    _reconstruct_from_shape(recipe, backend.shape(tensor))
  File "C:\python3\lib\site-packages\einops\einops.py", line 163, in _reconstruct_from_shape_uncached
    raise EinopsError('Expected {} dimensions, got {}'.format(len(self.input_composite_axes), len(shape)))      
einops.EinopsError: Expected 2 dimensions, got 4

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Senti\Desktop\ETSformer\run.py", line 117, in <module>
    exp.train(setting)
  File "C:\Users\Senti\Desktop\ETSformer\exp\exp_main.py", line 144, in train
    outputs = self.model(batch_x, batch_x_mark, dec_inp, batch_y_mark)
  File "C:\python3\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\Senti\Desktop\ETSformer\models\etsformer\model.py", line 72, in forward
    level, growths, seasons = self.encoder(res, x_enc, attn_mask=enc_self_mask)
  File "C:\python3\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\Senti\Desktop\ETSformer\models\etsformer\encoder.py", line 169, in forward
    res, level, growth, season = layer(res, level, attn_mask=None)
  File "C:\python3\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\Senti\Desktop\ETSformer\models\etsformer\encoder.py", line 142, in forward
    growth = self._growth_block(res)
  File "C:\Users\Senti\Desktop\ETSformer\models\etsformer\encoder.py", line 151, in _growth_block
    x = self.growth_layer(x)
  File "C:\python3\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\Senti\Desktop\ETSformer\models\etsformer\encoder.py", line 38, in forward
    out = torch.cat([repeat(self.es.v0, 'h d -> b 1 h d', b=b), out], dim=1)
  File "C:\python3\lib\site-packages\einops\einops.py", line 537, in repeat
    return reduce(tensor, pattern, reduction='repeat', **axes_lengths)
  File "C:\python3\lib\site-packages\einops\einops.py", line 418, in reduce
    raise EinopsError(message + '\n {}'.format(e))
einops.EinopsError:  Error while processing repeat-reduction pattern "h d -> b 1 h d".
 Input tensor shape: torch.Size([1, 1, 8, 64]). Additional info: {'b': 32}.
 Expected 2 dimensions, got 4

This part of the code seems to cause the problem:

    def forward(self, inputs):
        """
        :param inputs: shape: (batch, seq_len, dim)
        :return: shape: (batch, seq_len, dim)
        """
        b, t, d = inputs.shape
        # Project inputs and split into attention heads: (b, t, nhead, d_head)
        values = self.in_proj(inputs).view(b, t, self.nhead, -1)
        # Prepend the learned initial state and take first differences (growth terms)
        values = torch.cat([repeat(self.z0, 'h d -> b 1 h d', b=b), values], dim=1)
        values = values[:, 1:] - values[:, :-1]
        # Exponential smoothing over the growth terms
        out = self.es(values)
        # Prepend the smoother's initial value, merge heads, and project back
        out = torch.cat([repeat(self.es.v0, 'h d -> b 1 h d', b=b), out], dim=1)
        out = rearrange(out, 'b t h d -> b t (h d)')
        return self.out_proj(out)
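One possible reading of this traceback (a guess, not a confirmed fix): repeat(self.es.v0, 'h d -> b 1 h d', b=b) expects a 2-D tensor, but es.v0 arrives with shape (1, 1, 8, 64). Under that assumption, collapsing the leading singleton dimensions before the repeat would make the pattern match, e.g.:

# Hypothetical adjustment, assuming self.es.v0 is stored with shape (1, 1, nhead, d_head):
v0 = self.es.v0.reshape(self.nhead, -1)                           # -> (nhead, d_head)
out = torch.cat([repeat(v0, 'h d -> b 1 h d', b=b), out], dim=1)  # matches the 'h d' pattern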

PyTorch vs. CUDA version error

I'm getting the following error.

Traceback (most recent call last):
  File "run.py", line 117, in <module>
    exp.train(setting)
  File "/nvme/git/ETSformer/exp/exp_main.py", line 140, in train
    dec_inp = torch.zeros_like(batch_y[:, -self.args.pred_len:, :]).float()
RuntimeError: CUDA error: no kernel image is available for execution on the device
CUDA kernel errors might be asynchronously reported at some other API call,so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1.


I believe this error is caused by the CUDA version being 11.7 (the newest), which is incompatible with the older torch version (1.11.0, specified in requirements.txt).

I tried upgrading the torch version but ran into errors. Would you please update the code to the latest torch? I know that it breaks the transformer code, as mentioned here:
pytorch/pytorch#80569

RuntimeError when running model

I get this error when I try to rebuild this model in another library:
File "...../models/ETSformer/encoder.py", line 87, in topk_freq x_freq = x_freq[index_tuple] RuntimeError: index does not support automatic differentiation for outputs with complex dtype.
