
godatta / ml-inference-benchmarks


This project is a fork of purdue-nrl/ml-inference-benchmarks.


GPU and CPU measurements of power, latency, and throughput for ML inference workloads.

Lua 81.58% CMake 1.96% C 2.11% C++ 3.26% Makefile 1.37% TeX 6.67% Shell 2.53% Python 0.52%

ml-inference-benchmarks's Introduction

Running the workloads

Steps to set up dependencies

Reinstall Torch to make sure your version is the latest one:

  1. git clone git@github.com:Aayush-Ankit/isca_workloads.git
  2. luarocks install torch
    Note: luarocks install may fail on systems sitting behind a proxy. In that case, change your git config to use https:// instead of git://, as follows:
    [url "https://"]
    insteadOf = git://
    Add the statements above to your global git config file (e.g. vim ~/.gitconfig). For more information, see https://github.com/luarocks/luarocks/wiki/LuaRocks-through-a-proxy
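As a concrete example, the same proxy workaround can be applied with git config instead of editing the file by hand (equivalent effect; the throwaway HOME is just for illustration):

```shell
# Equivalent to adding the [url "https://"] insteadOf entry manually.
# Use a scratch HOME so the real ~/.gitconfig is left untouched.
export HOME=$(mktemp -d)
git config --global url."https://".insteadOf git://
git config --global --get url."https://".insteadOf   # prints: git://
```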
  3. luarocks install nn
  4. luarocks install dpnn
  5. luarocks install torchx

To use with CUDA

  1. luarocks install cutorch
  2. luarocks install cunn
  3. luarocks install cunnx
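The two install sequences above can be captured in a small shell sketch. The rock names come straight from the steps; everything else (the RUN dry-run switch, the grouping) is an assumption for illustration:

```shell
# Dry-run sketch of the luarocks steps above: prints each command.
# Set RUN= (empty) to actually execute the installs.
RUN=${RUN-echo}
base_rocks="torch nn dpnn torchx"
cuda_rocks="cutorch cunn cunnx"   # only needed for GPU (CUDA) runs
for rock in $base_rocks $cuda_rocks; do
  $RUN luarocks install "$rock"
done
```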

Install the RNN dependency (enables the use of sequencers)

  1. cd rnn
  2. luarocks make rocks/rnn-scm-1.rockspec
    Note: the above command may fail if stale CMake files are left over in the rnn directory. If that occurs, delete the rnn directory, re-clone it, and run the command again:
    2.a rm -rf rnn
    2.b git clone git@github.com:Element-Research/rnn.git
    2.c Repeat steps 1 and 2 again
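The recovery steps above can be scripted as follows. reclone_rnn is a hypothetical helper, not part of the repo, and it prints the commands by default rather than running them:

```shell
# Hypothetical helper wrapping recovery steps 2.a-2.c above.
# Dry run by default; set RUN= (empty) to execute the commands.
reclone_rnn() {
  ${RUN-echo} rm -rf rnn
  ${RUN-echo} git clone git@github.com:Element-Research/rnn.git
  ${RUN-echo} cd rnn
  ${RUN-echo} luarocks make rocks/rnn-scm-1.rockspec
}
reclone_rnn
```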

Yay! Setup's Done!!!

Running a benchmark on CPU/GPU

Some info about the benchmarks

  1. wlm_bigLSTM - bigLSTM network for word-level language modelling (Google 1B dataset)
  2. wlm_anotherLSTM - another deep LSTM network for word-level language modelling (Google 1B dataset)
  3. nmt_l5 - Google Machine Translation for English-French (WMT15 dataset)
  4. nmt_l3 - Google Machine Translation for English-French (WMT15 dataset)

th <benchmark>.lua -gpu <0/1> -threads <non-zero> -batch <non-zero>

cmdline options:

  1. gpu > use 0 for a CPU run, 1 for a GPU run (default is CPU)
  2. threads > useful for CPU runs; can be increased to evaluate CPU performance (default is 1)
  3. batch > can be varied to see how the CPU/GPU numbers (inference time, power) change. For GPU, the batch size can be increased until torch throws a THCudaCheck: out of memory error
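As an example, a batch-size sweep can be driven from a small wrapper. bench_cmd is a hypothetical helper and wlm_bigLSTM.lua an assumed script name following the benchmark list above; the wrapper only assembles the command line:

```shell
# Hypothetical wrapper that assembles the th invocation shown above.
bench_cmd() {
  echo "th $1 -gpu $2 -threads $3 -batch $4"
}
# Example: sweep batch size for a GPU run; in practice you increase
# batch until torch throws THCudaCheck: out of memory.
for b in 16 32 64 128; do
  bench_cmd wlm_bigLSTM.lua 1 1 "$b"
done
```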

Metrics of interest that are printed

  1. Number of parameters in the network
  2. Inference time on CPU/GPU. Note: for CPU inference time, run the benchmark twice (and use the 2nd value) to make sure the data-movement cost from HDD isn't included
  3. The <>pow.txt file logs GPU power consumption
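As an illustration, the readings in the power log can be averaged with awk. This assumes one numeric reading (watts) per line, which is an assumption about the file format, not something the repo documents:

```shell
# Build a small sample log, then average it.
# Assumed format: one watts reading per line.
printf '55.2\n60.1\n58.7\n' > sample_pow.txt
awk '{ s += $1; n++ } END { printf "%.1f\n", s / n }' sample_pow.txt
# prints: 58.0
```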

ml-inference-benchmarks's People

Contributors

aayush-ankit, sairahul-chalamalasetti
