Comments (8)
ATFNet
Adaptive Time-Frequency Ensembled Network for Long-term Time Series Forecasting.
https://github.com/YHYHYHYHYHY/ATFNet
EarthPT
A simple repository for training time series large observation models. It began its life as Andrej Karpathy's nanoGPT and has been adapted to work with time series data.
https://github.com/aspiaspace/earthPT
TinyTimeMixer (TTM)
About
TinyTimeMixers (TTMs) are compact pre-trained models for multivariate time-series forecasting, open-sourced by IBM Research. With fewer than 1 million parameters, TTM introduces the notion of the first-ever “tiny” pre-trained models for time-series forecasting.
The current open-source version supports point forecasting use cases at resolutions ranging from minutely to hourly (e.g. 10 min, 15 min, 1 hour). Note that zero-shot, fine-tuning, and inference tasks using TTM can easily be executed on a single-GPU machine or even on a laptop.
Details
TTM-1 currently supports two modes:
- Zero-shot forecasting: directly apply the pre-trained model to your target data to get an initial forecast (with no training).
- Fine-tuned forecasting: fine-tune the pre-trained model on a subset of your target data to further improve the forecast.
Since TTM models are extremely small and fast, it is easy to fine-tune them on your available target data within a few minutes to get more accurate forecasts. For more details on the TTM architecture and benchmarks, refer to our paper.
HF: https://huggingface.co/ibm/TTM
Paper: https://arxiv.org/pdf/2401.03955.pdf
Repository: https://github.com/IBM/tsfm/tree/main/tsfm_public/models/tinytimemixer
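As a quick illustration of the zero-shot mode described above, here is a minimal sketch. The import path mirrors the repository layout linked above, but the class name, checkpoint id, forward-pass argument, and output attribute are assumptions based on the Hugging Face conventions for related models; check the repository's notebooks for the exact API of your installed version.

```python
# Zero-shot TTM forecasting: a minimal sketch, not verified against a
# specific tsfm release. Names below are assumptions (see lead-in).
import torch
from tsfm_public.models.tinytimemixer import TinyTimeMixerForPrediction

# Load the pre-trained checkpoint from the Hugging Face Hub.
model = TinyTimeMixerForPrediction.from_pretrained("ibm/TTM")
model.eval()

# TTM expects a fixed-length context window per channel (e.g. 512 steps).
past_values = torch.randn(1, 512, 1)  # (batch, context_length, channels)

with torch.no_grad():
    output = model(past_values=past_values)

# Assumed output attribute; shape (batch, forecast_length, channels).
print(output.prediction_outputs.shape)
```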
Neural 🧠 Forecast
About
NeuralForecast offers a large collection of neural forecasting models focused on usability and robustness. The models range from classic networks such as MLP and RNNs to novel proven contributions such as NBEATS, NHITS, TFT, and other architectures.
Features
- Exogenous Variables: Static, historic and future exogenous support.
- Forecast Interpretability: Plot trend, seasonality and exogenous NBEATS, NHITS, TFT, ESRNN prediction components.
- Probabilistic Forecasting: Simple model adapters for quantile losses and parametric distributions.
- Train and Evaluation Losses: Scale-dependent, percentage, and scale-independent errors, and parametric likelihoods.
- Automatic Model Selection: Parallelized automatic hyperparameter tuning that efficiently searches for the best validation configuration.
- Simple Interface: Unified SKLearn-style interface for StatsForecast and MLForecast compatibility.
- Model Collection: Out-of-the-box implementations of MLP, LSTM, RNN, TCN, DilatedRNN, NBEATS, NHITS, ESRNN, Informer, TFT, PatchTST, VanillaTransformer, StemGNN, and HINT. See the entire collection here.
Web: https://nixtlaverse.nixtla.io/neuralforecast/
Repository: https://github.com/Nixtla/neuralforecast
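As a taste of the unified interface mentioned in the feature list above, here is a minimal sketch that trains NHITS on a toy monthly series and forecasts 12 steps ahead. The column names (unique_id, ds, y) follow Nixtla's long-format convention; the hyperparameters are illustrative, not tuned.

```python
# Minimal NeuralForecast usage: fit NHITS on a toy panel, then forecast.
import numpy as np
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

# Toy series in Nixtla's long format: one row per (unique_id, ds) pair.
df = pd.DataFrame({
    "unique_id": "series_1",
    "ds": pd.date_range("2020-01-01", periods=48, freq="MS"),
    "y": np.sin(np.arange(48) / 6) + np.random.normal(0, 0.1, 48),
})

nf = NeuralForecast(
    models=[NHITS(h=12, input_size=24, max_steps=100)],
    freq="MS",
)
nf.fit(df=df)

forecast = nf.predict()  # DataFrame with 'unique_id', 'ds', 'NHITS' columns
print(forecast.head())
```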
Awesome AI for Time Series (AI4TS) Papers, Tutorials, and Surveys
A professionally curated list of papers (with available code), tutorials, and surveys on recent AI for Time Series Analysis (AI4TS) at top AI conferences and journals, covering time series, spatio-temporal data, event data, sequence data, temporal point processes, and more. The list is updated as soon as accepted papers are announced at the corresponding venues. We hope it is helpful for researchers and engineers interested in AI for time series analysis.
Repository: https://github.com/qingsongedu/awesome-AI-for-time-series-papers
PatchTST (ICLR 2023)
We propose an efficient design of Transformer-based models for multivariate time series forecasting and self-supervised representation learning. It is based on two key components: (i) segmentation of time series into subseries-level patches, which serve as input tokens to the Transformer; (ii) channel-independence, where each channel contains a single univariate time series and all channels share the same embedding and Transformer weights. The patching design has a three-fold benefit: local semantic information is retained in the embedding; computation and memory usage of the attention maps are quadratically reduced for the same look-back window; and the model can attend to a longer history.
Our channel-independent patch time series Transformer (PatchTST) significantly improves long-term forecasting accuracy compared with SOTA Transformer-based models. We also apply our model to self-supervised pre-training tasks and attain excellent fine-tuning performance, outperforming supervised training on large datasets. Transferring masked pre-trained representations from one dataset to others also produces SOTA forecasting accuracy.
Paper: A Time Series is Worth 64 Words: Long-term Forecasting with Transformers
Repository: https://github.com/yuqinie98/PatchTST
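To make the patching and channel-independence ideas concrete, here is an illustrative PyTorch sketch (not the authors' code): each univariate channel is folded into the batch dimension and segmented into overlapping patches that become the Transformer's input tokens. The patch length, stride, and model dimension are illustrative values.

```python
# Illustration of PatchTST-style patching (not the authors' implementation).
import torch

batch, n_channels, seq_len = 32, 7, 336   # look-back window of 336 steps
patch_len, stride = 16, 8                 # illustrative values

x = torch.randn(batch, n_channels, seq_len)

# Channel-independence: fold channels into the batch dimension so every
# channel is processed as its own univariate series with shared weights.
x = x.reshape(batch * n_channels, seq_len)

# Segmentation: (batch * channels, n_patches, patch_len), where
# n_patches = (seq_len - patch_len) // stride + 1 = 41.
patches = x.unfold(dimension=-1, size=patch_len, step=stride)

# Each patch is linearly embedded to become one Transformer input token.
d_model = 128
embed = torch.nn.Linear(patch_len, d_model)
tokens = embed(patches)

print(tokens.shape)  # torch.Size([224, 41, 128])
```

Reducing each series from 336 time steps to 41 tokens is what shrinks the attention maps quadratically for the same look-back window.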
tsai
State-of-the-art Deep Learning library for Time Series and Sequences.
tsai is an open-source deep learning package built on top of PyTorch & fastai, focused on state-of-the-art techniques for time series tasks such as classification, regression, forecasting, and imputation. tsai is currently under active development by timeseriesAI.
Repository: https://github.com/timeseriesAI/tsai
Documentation: https://timeseriesai.github.io/tsai/
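For flavor, here is a minimal classification sketch in the style of the tsai documentation. It assumes a recent tsai release; get_classification_data, TSClassifier, and the architecture string follow the examples in the docs linked above, so verify them against your installed version.

```python
# Minimal tsai classification sketch (assumed API, see lead-in above).
from tsai.basics import *

# Load a benchmark dataset (downloaded on first use).
X, y, splits = get_classification_data("ECG200", split_data=False)

clf = TSClassifier(
    X, y, splits=splits,
    arch="InceptionTimePlus",          # one of tsai's built-in architectures
    tfms=[None, TSClassification()],   # encode labels for classification
    batch_tfms=TSStandardize(),        # standardize each batch
    metrics=accuracy,
)
clf.fit_one_cycle(25, 3e-4)            # fastai's one-cycle training policy
```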
skforecast
Time series forecasting with scikit-learn models.
Skforecast is a Python library that makes it easy to use scikit-learn regressors as single- and multi-step forecasters. It also works with any regressor compatible with the scikit-learn API (LightGBM, XGBoost, CatBoost, ...).
Homepage: https://skforecast.org/
Repository: https://github.com/JoaquinAmatRodrigo/skforecast
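Here is a minimal recursive-forecasting sketch. ForecasterAutoreg has been the classic entry point; newer skforecast releases may organize the modules differently, so check the documentation for your installed version.

```python
# Minimal skforecast sketch: recursive multi-step forecasting with a
# scikit-learn regressor (module path may differ in newer releases).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from skforecast.ForecasterAutoreg import ForecasterAutoreg

# Toy univariate series with a regular monthly DatetimeIndex.
y = pd.Series(
    np.sin(np.arange(120) / 10) + np.random.normal(0, 0.1, 120),
    index=pd.date_range("2015-01-01", periods=120, freq="MS"),
    name="y",
)

forecaster = ForecasterAutoreg(
    regressor=RandomForestRegressor(random_state=42),
    lags=12,  # use the last 12 observations as autoregressive features
)
forecaster.fit(y=y)

predictions = forecaster.predict(steps=10)  # recursive 10-step-ahead forecast
print(predictions.head())
```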
Related Issues (20)
- docker-compose up is taking very long HOT 2
- Video tutorial HOT 1
- Panels are not updated on instant dashboards after update to Grafana 9.3.1 and Kotori 0.27.0 HOT 1
- Support new devices for DAQ-SIG
- Add ISEMS project to gallery
- Add "Well Depth Monitor" to project gallery
- Grafana: Adjust a few integration details
- Error channel reports `'NoneType' object has no attribute 'endswith'`
- Docker is sunsetting Free Team organizations HOT 5
- [Proposal] Add a generic device-based addressing scheme for "WAN" networks HOT 1
- Modernize firmware builder to use PlatformIO
- bunch » munch » benedict
- Support FIWARE NGSI-LD, NGSIv2, and Ultralight 2.0 protocols
- Support Sparkplug MQTT protocol HOT 2
- Make plumbing less opinionated
- Support Shelly devices HOT 2
- Support Jesth / Paradict / Braq
- LECO, PyMeasure, PyVISA
- HomA and Wiren Board MQTT Conventions