This project is a forked and modified version of TimesNet, a state-of-the-art model for time series analysis.
It is a temporal 2D-variation modeling approach for general time series analysis:
- It extends the analysis of temporal variations into the 2D space by transforming the 1D time series into a matrix, which encodes temporal features from multiple perspectives.
- It uses a pyramid-like architecture, TimesBlock, to hierarchically capture the complex temporal variations.
- The experiments on a variety of tasks demonstrate that TimesNet outperforms existing methods in terms of accuracy and generalization capability on datasets with diverse scales and complexities.
The original repository is here.
- Optimized and simplified the project structure from the original library.
- Implemented Neptune for experiment tracking.
- Enhanced dataset and data loader flexibility to handle tasks with time gaps.
- Implemented parallel loading and improved data structures to enhance data processing efficiency.
- Identified and resolved a minor bug in dataset length calculation, resulting in an increased dataset size.
- Integrated additional sub-networks to process time and temperature information for prediction (i.e., `y_data`).
- Accelerated the training process by utilizing PyTorch's built-in automatic mixed precision training and asynchronous GPU data copying when a GPU is available.
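The mixed-precision training step described above roughly follows the standard PyTorch pattern below. This is a sketch: the function and variable names are illustrative, not the project's actual code.

```python
# Sketch of an AMP training step with asynchronous host-to-device copies.
import torch

def train_step(model, batch_x, batch_y, optimizer, loss_fn, scaler, device):
    # non_blocking=True overlaps the copy with compute; it only has an
    # effect for pinned host memory and a CUDA device.
    batch_x = batch_x.to(device, non_blocking=True)
    batch_y = batch_y.to(device, non_blocking=True)
    optimizer.zero_grad(set_to_none=True)
    # Autocast selects lower-precision kernels where it is numerically safe.
    with torch.autocast(device_type=device.type, enabled=(device.type == "cuda")):
        pred = model(batch_x)
        loss = loss_fn(pred, batch_y)
    # GradScaler guards fp16 gradients against underflow; it is a
    # pass-through when disabled (e.g., on CPU).
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
    return loss.item()
```

On CPU the same step runs with autocast and the scaler disabled, which is what makes the "when a GPU is available" switch cheap to implement.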
- Mean Squared Error (MSE) of minute-level prediction <= 0.11 (standardized data)
- Precise and adjustable minute-level predictions
- Fast inference: approximately 5 s per 32 samples (each sample contains four power outputs within 1 hour) on a Google Colab CPU
- Install the required dependencies:

  ```shell
  pip install -r requirements.txt
  ```
- (Optional) Configure the `neptune.yaml` file for Neptune.ai tracking:

  ```yaml
  project: your_username/your_project_name
  api_token: your_api_token
  ```
To view the help message:

```shell
python -u run.py --help
```

To run the model:

```shell
python -u run.py --args args
```
Visit the documentation to learn more about the `run.py` arguments.
Alternatively, check [Data Preprocessing](Tutorials/Data Prepocessing.ipynb), [Train, Test and Predict](Tutorials/Train,
Test and Predict.ipynb) and [Visualization](Tutorials/Testing Result Visualization.ipynb) notebooks in the /Tutorials
folder for more examples.
- Implement transfer learning to enhance inference speed and potentially gain insights into model interpretation.
- Utilize linear interpolation to augment the number of data samples.
- Enhance data preprocessing techniques:
- Apply sliding window for further denoising.
- Experiment with and potentially combine different data scalers.
- Incorporate the characteristic of power threshold in the model input, and if possible, use a combination of hinge loss and mean squared error (MSE) as the loss function.
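Two of the preprocessing ideas on this roadmap can be sketched in plain Python. These are hedged sketches under simple assumptions (midpoint upsampling and a trailing moving average; the window size and interpolation grid are illustrative, not the project's final choices):

```python
# Sketch of linear-interpolation augmentation (assumption: midpoint upsampling).
def upsample_linear(samples):
    """Insert the linearly interpolated midpoint between each pair of
    consecutive values, roughly doubling the number of samples."""
    out = []
    for a, b in zip(samples, samples[1:]):
        out.append(a)
        out.append((a + b) / 2.0)  # midpoint between a and b
    out.append(samples[-1])
    return out

# Sketch of sliding-window denoising (assumption: trailing moving average).
def moving_average(values, window=3):
    """Smooth a series with a trailing moving average over `window` points."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - window + 1)
        chunk = values[lo : i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

In practice these would run on the standardized series before windowing, so the MSE target above stays comparable across experiments.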
If you have any questions or suggestions, feel free to open an issue.