Light Model Transformer

Light Model Transformer is a lightweight tool that transforms a trained TensorFlow model into C++ code. The generated code is based on Intel MKL-DNN and exposes an interface for running inference without any framework, with the goal of accelerating inference performance.

Usage:

Install Intel MKL-DNN

  • Install Intel MKL-DNN via the provided script:
    • python install_mkldnn.py
    • See details with: python install_mkldnn.py -h

Convert the model

  • Make sure you have TensorFlow installed (you can check by running import tensorflow in Python).

  • Prepare a TensorFlow model in frozen .pb format. If you only have a checkpoint, freeze the graph first; a freezing sketch follows the commands below. Here we assume the frozen model is named frozen.pb.

  • Run the scripts, e.g.:

    # Transform the model to an internal representation
    # Use --help to see all params
    python tf2topo.py --input_model_filename=./frozen.pb \
                      --weights_file=saved_model/weights.bin \
                      --pkl_file=saved_model/weights.pkl \
                      --topo_file=saved_model/topo.txt
    
    # Generate C++ inference code based on MKL-DNN
    python topo2code.py --topo=saved_model/topo.txt
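
If your model only exists as a checkpoint, the graph has to be frozen into a single .pb file first. The following is a minimal TF 1.x-style sketch, not part of this repository: the checkpoint prefix (model.ckpt) and the output node name (logits) are placeholders for your own names, and TensorFlow's freeze_graph utility can be used instead.

    # Freeze a TF 1.x checkpoint into frozen.pb (placeholder names, adjust to your model)
    import tensorflow as tf

    saver = tf.train.import_meta_graph('model.ckpt.meta')
    with tf.Session() as sess:
        saver.restore(sess, 'model.ckpt')

        # Unsure of the output node name? List the graph nodes first:
        # print([n.name for n in sess.graph.as_graph_def().node])

        # Replace variables with constants so the graph is self-contained
        frozen_graph = tf.graph_util.convert_variables_to_constants(
            sess, sess.graph.as_graph_def(), ['logits'])

        with open('frozen.pb', 'wb') as f:
            f.write(frozen_graph.SerializeToString())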
    

Compile and test the generated inference code

  • Compile and test the generated code as follows:
    • cd inference_code
    • Edit build.sh and make sure MKLDNN_ROOT points to your MKL-DNN installation
    • sh build.sh (Note: OpenCV is required to compile the code; this produces an executable named 'test')
    • ./test -W ../saved_model/weights.bin -b 1 -l 100 (run ./test -H for help)

Integrate the generated code into your own project

  • Please look at inference_code/Main.cpp to see how to use the generated code. In general, the interface looks like this:

    // Create a Model object
    Model model(weight_path, batch_size);
    
    // Run inference: provide the input and get the output back
    output = model.inference(input);

Note:

  • 'Light' means this is a simple implementation: currently only CNN networks are supported, and even for CNNs many ops are still missing.
  • If a closed-source inference engine is acceptable to you, we suggest the OpenVINO(TM) toolkit for inference acceleration.
