tasselyue / mixture-of-experts-based-federated-learning-for-energy-forecasting

This project is a fork of jonassievers/mixture-of-experts-based-federated-learning-for-energy-forecasting.


Source code for our preprint paper "Advancing Accuracy in Load Forecasting using Mixture-of-Experts and Federated Learning".


mixture-of-experts-based-federated-learning-for-energy-forecasting's Introduction

Mixture-of-Experts based Federated Learning for secure short-term load forecasting


Abstract: Accurate load forecasting is essential for the reliable planning and operation of smart grids, to improve energy storage optimization, or to enable demand response programs. As load forecasting often involves heterogeneous and complex data patterns with high variability, precise predictions are still challenging. Here, models with enhanced adaptability and generalizability are crucial, making Mixture-of-Experts (MoE) a promising solution. An MoE layer combines the predictions of multiple specialized sub-models, known as experts, using a gating mechanism to dynamically select and weight the experts' outputs based on the input sequence. Existing deep learning models can integrate this generic layer into their architecture to learn and handle complex patterns in data adaptively. In this paper, we apply the MoE concept to dense and bidirectional long short-term memory models for load forecasting, using soft and top-k gating. As our benchmark, we consider state-of-the-art bidirectional long short-term memory models, convolutional neural networks, residual neural networks, and transformer models. Further, we implement both local and federated learning (FL) architectures. In federated learning, models are trained locally on private data, and only the trained model parameters are merged and updated on a global server to improve accuracy and data privacy. Utilizing the Ausgrid dataset, we demonstrate that including an MoE layer in existing model architectures can increase accuracy by up to 13% while decreasing the total training time by 33%. Additionally, implementing the MoE model within an FL architecture can improve model accuracy by a further 4% compared to local learning.
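
The gating mechanism described in the abstract can be illustrated in a few lines of code. Below is a minimal NumPy sketch of soft and top-k gating over a set of experts; it is purely illustrative (the function names and shapes are made up for the example) and is not the repository's actual model code.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def moe_forward(x, experts, gate_weights, k=None):
    """Combine expert predictions via a gating mechanism (illustrative sketch).

    x            : input feature vector, shape (d,)
    experts      : list of callables, each mapping x to a prediction
    gate_weights : gating matrix, shape (d, num_experts)
    k            : if set, keep only the k highest-scoring experts (top-k gating)
    """
    scores = x @ gate_weights                      # one gate score per expert
    if k is not None:
        keep = np.argsort(scores)[-k:]             # indices of the k best experts
        masked = np.full_like(scores, -np.inf)     # -inf scores vanish after softmax
        masked[keep] = scores[keep]
        scores = masked
    weights = softmax(scores)                      # soft gating over (selected) experts
    outputs = np.stack([expert(x) for expert in experts])
    return weights @ outputs                       # weighted sum of expert predictions

# Toy usage: three linear experts predicting the next load value from 8 features.
rng = np.random.default_rng(0)
x = rng.normal(size=8)
experts = [lambda v, W=rng.normal(size=8): v @ W for _ in range(3)]
gate_weights = rng.normal(size=(8, 3))
print(moe_forward(x, experts, gate_weights, k=2))

With k set, only the k highest-scoring experts contribute (sparse top-k gating); with k=None the layer falls back to plain soft gating, matching the two gating variants mentioned above.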


Project structure

This repository consists of the following parts:

  • data folder: Contains all datasets and the scripts to collect them, preprocess them, perform feature engineering, and build the final dataset used for the forecasting task.
  • evaluations folder: Contains all evaluation results.
  • images folder: Contains all figures and plots as well as the script to create them.
  • models folder: Contains the stored model weights.
  • src folder: Contains the main scripts for the forecasting baseline, local learning, federated learning, and evaluation (a minimal sketch of the federated averaging step follows this list).
    • utils folder: Contains helper classes for data handling, model generation, and model handling.
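
As described in the abstract, federated learning trains models locally on private data and only merges the trained parameters on a global server. The sketch below shows a generic FedAvg-style aggregation step, assuming each client returns its parameters as a list of NumPy arrays; the repository's federated learning scripts may organize this step differently.

import numpy as np

def federated_average(client_params, client_sizes):
    """Merge locally trained parameters into a global model (FedAvg-style sketch).

    client_params : list of parameter lists, one per client
    client_sizes  : number of local training samples per client (used as weights)
    """
    total = float(sum(client_sizes))
    global_params = []
    for layer_params in zip(*client_params):       # iterate layer by layer
        weighted = sum((n / total) * p for n, p in zip(client_sizes, layer_params))
        global_params.append(weighted)
    return global_params

# Toy usage: two clients, each holding one weight matrix and one bias vector.
client_a = [np.ones((2, 2)), np.zeros(2)]
client_b = [3 * np.ones((2, 2)), np.ones(2)]
print(federated_average([client_a, client_b], client_sizes=[100, 300]))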

Install and run the project

To run the project, fork this repository and follow the instructions below. First, create your own virtual environment:

python -m venv .venv

Activate your environment with

.\.venv\Scripts\activate.ps1
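
The activation command above assumes Windows PowerShell. On Linux or macOS, the standard venv equivalent is:

source .venv/bin/activate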

Install all dependencies with

pip install -r requirements.txt
