
Time series forecasting / prediction (kotori, OPEN)

amotl commented on June 6, 2024

Comments (8)

amotl commented on June 6, 2024

ATFNet

Adaptive Time-Frequency Ensembled Network for Long-term Time Series Forecasting.

https://github.com/YHYHYHYHYHY/ATFNet

amotl commented on June 6, 2024

EarthPT

A simple repository for training time series large observation models. This repository began its life as Andrej Karpathy's nanoGPT, and has been altered so that it is usable for time series data.

https://github.com/aspiaspace/earthPT

amotl commented on June 6, 2024

TinyTimeMixer (TTM)

About

TinyTimeMixers (TTMs) are compact pre-trained models for multivariate time-series forecasting, open-sourced by IBM Research. With fewer than one million parameters, TTM introduces the notion of the first-ever “tiny” pre-trained models for time-series forecasting.

The current open-source version supports point forecasting use cases ranging from minutely to hourly resolutions (e.g. 10 min, 15 min, 1 hour). Note that zero-shot, fine-tuning, and inference tasks with TTM can easily be executed on a single-GPU machine or even on a laptop.

Details

TTM-1 currently supports 2 modes:

  • Zeroshot forecasting: Directly apply the pre-trained model on your target data to get an initial forecast (with no training).
  • Finetuned forecasting: Finetune the pre-trained model with a subset of your target data to further improve the forecast.

Since TTM models are extremely small and fast, it is easy to fine-tune the model on your available target data within a few minutes to get more accurate forecasts. For more details on the TTM architecture and benchmarks, refer to the paper. A minimal zero-shot usage sketch follows after the links below.

HF: https://huggingface.co/ibm/TTM
Paper: https://arxiv.org/pdf/2401.03955.pdf
Repository: https://github.com/IBM/tsfm/tree/main/tsfm_public/models/tinytimemixer
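
A minimal zero-shot sketch, assuming the tsfm_public package from the IBM/tsfm repository exposes TinyTimeMixerForPrediction with a Hugging-Face-style from_pretrained() and a forward pass over past_values (the exact entry points and output field names may differ; please check the model card):

    import torch
    from tsfm_public.models.tinytimemixer import TinyTimeMixerForPrediction

    # Load the pre-trained zero-shot checkpoint (context length 512, forecast length 96).
    model = TinyTimeMixerForPrediction.from_pretrained("ibm/TTM", revision="main")
    model.eval()

    # Dummy multivariate history: (batch, context_length, num_channels).
    past_values = torch.randn(1, 512, 3)

    with torch.no_grad():
        output = model(past_values=past_values)

    # Point forecasts: (batch, prediction_length, num_channels).
    print(output.prediction_outputs.shape)

Fine-tuning follows a standard Hugging Face Trainer-style workflow on a subset of the target data; see the repository notebooks for details.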

amotl commented on June 6, 2024

Neural 🧠 Forecast

About

NeuralForecast offers a large collection of neural forecasting models focused on usability and robustness. The models range from classic networks like MLPs and RNNs to novel proven contributions like NBEATS, NHITS, TFT, and other architectures.

Features

  • Exogenous Variables: Static, historic, and future exogenous support.
  • Forecast Interpretability: Plot trend, seasonality, and exogenous components of NBEATS, NHITS, TFT, and ESRNN predictions.
  • Probabilistic Forecasting: Simple model adapters for quantile losses and parametric distributions.
  • Train and Evaluation Losses: Scale-dependent, percentage, and scale-independent errors, as well as parametric likelihoods.
  • Automatic Model Selection: Parallelized automatic hyperparameter tuning that efficiently searches for the best validation configuration.
  • Simple Interface: Unified scikit-learn-style interface for StatsForecast and MLForecast compatibility (see the usage sketch after the links below).
  • Model Collection: Out-of-the-box implementations of MLP, LSTM, RNN, TCN, DilatedRNN, NBEATS, NHITS, ESRNN, Informer, TFT, PatchTST, VanillaTransformer, StemGNN, and HINT. See the entire collection here.

Web: https://nixtlaverse.nixtla.io/neuralforecast/
Repository: https://github.com/Nixtla/neuralforecast
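
A minimal usage sketch of the unified fit/predict interface, training NHITS on the toy AirPassengers dataset that ships with the library (dataset and hyperparameters chosen here purely for illustration):

    from neuralforecast import NeuralForecast
    from neuralforecast.models import NHITS
    from neuralforecast.utils import AirPassengersDF

    # Long-format frame with columns: unique_id, ds (timestamp), y (target).
    Y_df = AirPassengersDF

    nf = NeuralForecast(
        models=[NHITS(h=12, input_size=24, max_steps=100)],  # 12-step-ahead forecast
        freq='M',                                            # monthly frequency ('ME' in newer pandas)
    )
    nf.fit(df=Y_df)

    forecasts = nf.predict()  # one forecast column per model, e.g. 'NHITS'
    print(forecasts.head())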

amotl commented on June 6, 2024

Awesome AI for Time Series (AI4TS) Papers, Tutorials, and Surveys

A professionally curated list of papers (with available code), tutorials, and surveys on recent AI for Time Series Analysis (AI4TS), covering Time Series, Spatio-Temporal Data, Event Data, Sequence Data, Temporal Point Processes, etc., at the top AI conferences and journals. The list is updated as soon as accepted papers are announced at the corresponding venues. Hopefully it is helpful for researchers and engineers interested in AI for Time Series Analysis.

Repository: https://github.com/qingsongedu/awesome-AI-for-time-series-papers

amotl commented on June 6, 2024

PatchTST (ICLR 2023)

We propose an efficient design of Transformer-based models for multivariate time series forecasting and self-supervised representation learning. It is based on two key components: (i) segmentation of time series into subseries-level patches which serve as input tokens to the Transformer; (ii) channel-independence, where each channel contains a single univariate time series and all channels share the same embedding and Transformer weights. The patching design has a three-fold benefit: local semantic information is retained in the embedding; computation and memory usage of the attention maps are quadratically reduced for the same look-back window; and the model can attend to a longer history.

Our channel-independent patch time series Transformer (PatchTST) significantly improves long-term forecasting accuracy compared with SOTA Transformer-based models. We also apply our model to self-supervised pre-training tasks and attain excellent fine-tuning performance, which outperforms supervised training on large datasets. Transferring masked pre-trained representations from one dataset to others also produces SOTA forecasting accuracy.

Paper: A Time Series is Worth 64 Words: Long-term Forecasting with Transformers
Repository: https://github.com/yuqinie98/PatchTST
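
An illustrative sketch of the patching idea only (plain PyTorch, not the repository's code): each channel is split into overlapping subseries-level patches that become the Transformer's input tokens, and channels are folded into the batch dimension so they share the same weights.

    import torch

    batch, channels, seq_len = 32, 7, 512   # look-back window of 512 steps
    patch_len, stride = 16, 8               # default patching setup from the paper

    x = torch.randn(batch, channels, seq_len)

    # unfold(dimension, size, step) -> (batch, channels, num_patches, patch_len);
    # each patch is one input token.
    patches = x.unfold(-1, patch_len, stride)
    print(patches.shape)  # torch.Size([32, 7, 63, 16])

    # Channel-independence: treat every channel as its own sequence so all
    # channels share the same embedding and Transformer weights.
    tokens = patches.reshape(batch * channels, patches.shape[2], patch_len)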

amotl commented on June 6, 2024

tsai

State-of-the-art Deep Learning library for Time Series and Sequences.

tsai is an open-source deep learning package built on top of PyTorch & fastai, focused on state-of-the-art techniques for time series tasks like classification, regression, forecasting, imputation…

tsai is currently under active development by timeseriesAI.

Repository: https://github.com/timeseriesAI/tsai
Documentation: https://timeseriesai.github.io/tsai/

amotl commented on June 6, 2024

skforecast

Time series forecasting with scikit-learn models.

Skforecast is a Python library that eases using scikit-learn regressors as single- and multi-step forecasters. It also works with any regressor compatible with the scikit-learn API (LightGBM, XGBoost, CatBoost, ...).

Homepage: https://skforecast.org/
Repository: https://github.com/JoaquinAmatRodrigo/skforecast
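
A minimal recursive-forecasting sketch, assuming the ForecasterAutoreg API (recent skforecast releases reorganize this, e.g. as ForecasterRecursive); any scikit-learn-compatible regressor can be plugged in:

    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from skforecast.ForecasterAutoreg import ForecasterAutoreg

    # Toy target series with a datetime index and fixed frequency.
    y = pd.Series(
        [float(i) for i in range(100)],
        index=pd.date_range("2024-01-01", periods=100, freq="D"),
        name="y",
    )

    forecaster = ForecasterAutoreg(
        regressor=RandomForestRegressor(random_state=42),
        lags=15,  # use the last 15 observations as predictors
    )
    forecaster.fit(y=y)

    predictions = forecaster.predict(steps=10)  # 10-step-ahead recursive forecast
    print(predictions)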
