
torsion-audio / nn-inference-template

95 stars, 5 watchers, 5 forks, 19.05 MB

Neural network inference template for real-time critical audio environments - presented at ADC23

Home Page: https://torsion-audio.github.io/nn-inference-template/

License: MIT License

Languages: C++ 76.55%, CMake 23.45%
Topics: audio, audio-plugin, deep-learning, inference, inference-engine, juce, libtorch, machine-learning, neural-network, onnx

nn-inference-template's Introduction

A Template Audio Plugin for Real-time Neural Network Inference

This repository was started as a companion repository to the talk Real-time inference of neural networks: a practical approach for DSP engineers at the Audio Developer Conference 2023. The video can be found here.

Since the conference, we have continued to refine and extend the codebase. For more flexible and easier use of the inference architecture, we have consolidated this work into a library called anira, which is now used in this repository. For those interested in the state of the repository as presented at ADC23, it can be found under the tag ADC23-talk.

Authors: Valentin Ackva & Fares Schulz

Overview

[Plugin UI mockup]


This repository provides a comprehensive JUCE plugin template that demonstrates the use of anira to implement neural network inference in real-time audio applications. In this template, we use all three inference engines currently supported by the library:

  • TensorFlow Lite
  • LibTorch
  • ONNX Runtime

anira, an architecture for neural network inference in real-time audio applications, covers all critical aspects of ensuring real-time safety and seamless signal flow for real-time inference. Detailed information can be found in the library's GitHub repository.
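To give a sense of how this looks in plugin code, the sketch below shows roughly how a JUCE processor hands its audio to anira. It is illustrative only: the class and method names (InferenceHandler, HostAudioConfig, prepare, process, setInferenceBackend and the backend enum values) are assumptions based on anira's API around the ADC23 release and may have changed since, so treat the plugin's PluginProcessor sources and anira's documentation as authoritative.

// Illustrative sketch only -- anira type and method names are assumptions and
// may differ between anira versions; check anira's documentation.
#include <anira/anira.h>
#include <juce_audio_basics/juce_audio_basics.h>

// Called from prepareToPlay(): pass the host audio configuration to anira and
// select one of the three supported backends.
void prepareInference (anira::InferenceHandler& handler, double sampleRate, int samplesPerBlock)
{
    anira::HostAudioConfig hostConfig { static_cast<size_t> (samplesPerBlock), sampleRate }; // field layout assumed
    handler.prepare (hostConfig);
    handler.setInferenceBackend (anira::InferenceBackend::LIBTORCH); // or ONNX / TFLITE
}

// Called from processBlock(): the audio thread only exchanges samples with
// anira's internal buffers; the network itself runs on a separate inference
// thread, which keeps this call real-time safe.
void processWithInference (anira::InferenceHandler& handler, juce::AudioBuffer<float>& buffer)
{
    handler.process (buffer.getArrayOfWritePointers(),
                     static_cast<size_t> (buffer.getNumSamples()));
}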

Build instructions

On Apple Silicon Macs, you need to install the OpenMP library via Homebrew (brew install libomp) before building.

Build with CMake

# clone the repository
git clone https://github.com/Torsion-Audio/nn-inference-template/
cd nn-inference-template/

# initialize and set up submodules
git submodule update --init --recursive

# use cmake to build debug
cmake . -B cmake-build-debug -DCMAKE_BUILD_TYPE=Debug
cmake --build cmake-build-debug --config Debug

# use cmake to build release
cmake . -B cmake-build-release -DCMAKE_BUILD_TYPE=Release
cmake --build cmake-build-release --config Release

To run the plugin or standalone application with the default model, you need to copy the model files to the correct location. The application will attempt to load the neural models from the user's application data directory at runtime. Therefore, you must copy the folder ./modules/GuitarLSTM/* to the following locations, depending on your operating system:

Linux: ~/.config/nn-inference-template/GuitarLSTM/*

macOS: ~/Library/Application Support/nn-inference-template/GuitarLSTM/*

Windows: %APPDATA%\nn-inference-template\GuitarLSTM\*
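For reference, this is how a JUCE application typically resolves that per-user application data directory at runtime. It is a minimal sketch with a hypothetical helper name (getModelDirectory); the template's actual lookup code may differ.

// Minimal sketch (hypothetical helper): resolving the per-user model directory
// with JUCE. userApplicationDataDirectory maps to ~/.config on Linux,
// ~/Library on macOS and %APPDATA% on Windows.
#include <juce_core/juce_core.h>

static juce::File getModelDirectory()
{
    auto base = juce::File::getSpecialLocation (juce::File::userApplicationDataDirectory);

#if JUCE_MAC
    base = base.getChildFile ("Application Support"); // -> ~/Library/Application Support
#endif

    return base.getChildFile ("nn-inference-template").getChildFile ("GuitarLSTM");
}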

Install from release

To install the plugin, please follow the instructions provided for your operating system:

  • Linux
  • macOS
  • Windows

Unit Test / Benchmark

The previous unit test, which benchmarked the plugin's performance across different audio configurations and inference backends, has been replaced by the benchmarking options within anira. These new benchmarks have been improved in many ways and provide a range of simple to complex benchmarks for comparing inference times across different models, inference engines, and audio configurations.

Licenses

The primary license for the code of this project is the MIT license, but be aware of the licenses of the submodules:

  • The anira library is licensed under the Apache License 2.0
  • The GuitarLSTM fork is licensed under the GPLv3
  • The JUCE library is licensed under the JUCE License
  • The ONNX Runtime library is licensed under the MIT License
  • The LibTorch library is licensed under the Modified BSD License
  • The TensorFlow Lite library is licensed under the Apache License 2.0
  • All other code within this project is licensed under the MIT License.

nn-inference-template's People

Contributors

faressc · vackva


nn-inference-template's Issues

What is this project bringing over anira ?

Hi all,

Just watched the ADC talk and came skimming through this repo to understand it a bit better. I see that the guts of the template are the anira project from TU-Studio, and I'm not 100% sure if this repo just brings a pretty UI or if there's something more to it. Just asking because anira already seems to provide a plugin example and receives updates faster than this repo, so I'm wondering which one I should fork :)

Thanks in advance!

Linker error on windows

Following the instructions, I get a linker error, presumably from LibTorch trying to find MKL. I have MKL installed. I tried to fix it, but no joy yet.

LINK : fatal error LNK1181: cannot open input file 'mkl_intel_ilp64.lib'
