
dnn-gating's People

Contributors

weizhehua, ychzhang


dnn-gating's Issues

Question about sparsity report

Hello,

Great work! Thanks for sharing the code. I would like to ask how the sparsity is reported [https://github.com/cornell-zhang/dnn-gating/blob/master/utils/cg_utils.py#L176]. This calculates sparsity in feature maps per pixel, but as far as I know, this is not how filter pruning methods report FLOPs reduction. Is there another FLOPs reduction report, calculated by skipping whole filters/slices, that would be comparable with filter pruning methods rather than a sparsity measure?
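
For concreteness, a MAC-level estimate like the sketch below is what I have in mind, as opposed to the per-pixel sparsity ratio. This is a hypothetical helper with made-up names (decision_map, c_in_cond, c_in_base), not code from the repo:

    import torch

    def estimated_mac_reduction(decision_map, k, c_in_cond, c_in_base):
        """Estimate the layer-level MAC reduction implied by a gating
        decision map. decision_map: binary tensor (N, C_out, H, W);
        0 means the conditional path is skipped at that position."""
        frac_skipped = (decision_map == 0).float().mean().item()
        cond_macs = k * k * c_in_cond  # conditional-path MACs per output position
        base_macs = k * k * c_in_base  # base-path MACs, always executed
        # Per-pixel sparsity only saves work on the conditional path;
        # the base path still runs at every output position.
        return frac_skipped * cond_macs / (base_macs + cond_macs)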

Also, in Table 4 of your paper, you calculated FPS with batch size 32. Would you mind sharing the code used to achieve the actual speedup?

Thanks!

Two BNs in Channel Gating

Did you implement two separate BNs for the base path (Xp) and the conditional path (Xr), as shown in Figure 4 of your paper?
I can only find one BN here:

d = self.gt(torch.sigmoid(self.alpha*(self.bn(Yp)-self.threshold)) - 0.5 * torch.ones_like(Yp))

Also, it seems this BN is used to produce the decision map rather than to combine the outputs from Xp and Xr?
[screenshot attached: the two-BN structure from Figure 4]
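
For reference, this is how I pictured the two-BN variant. A minimal sketch of my reading of Figure 4, not the repo's code:

    import torch
    import torch.nn as nn

    class TwoBNGate(nn.Module):
        """Hypothetical two-BN combination: one BN for the base-path
        output Yp, a separate BN for the conditional-path output Yr."""

        def __init__(self, channels, alpha=2.0, threshold=0.0):
            super().__init__()
            self.bn_p = nn.BatchNorm2d(channels)  # normalizes the base path
            self.bn_r = nn.BatchNorm2d(channels)  # normalizes the conditional path
            self.alpha = alpha
            self.threshold = threshold

        def forward(self, Yp, Yr):
            Yp_n = self.bn_p(Yp)
            # Hard decision from the normalized base-path output, mirroring
            # the single-BN line quoted above (the repo binarizes via self.gt;
            # this hard threshold is non-differentiable as written).
            d = (torch.sigmoid(self.alpha * (Yp_n - self.threshold)) > 0.5).float()
            # Where the gate fires, add the separately normalized conditional path.
            return Yp_n + d * self.bn_r(Yr)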

Question about QuantizedConv2d

In your implementation of QuantizedConv2d, I found that the pipeline is quantize -> dequantize -> conv. Why not use quantize -> conv -> dequantize?

In my opinion, the pipeline quantize -> conv -> dequantize would be faster, because the conv could then be accelerated with low-bit arithmetic.
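
For context, quantize -> dequantize -> conv is the usual "fake quantization" pattern: it simulates the rounding error of low precision while the conv itself still runs in float, since stock float conv kernels cannot exploit low-bit inputs. A minimal sketch, assuming symmetric uniform quantization (my own helper, not the repo's code):

    import torch

    def fake_quantize(x, num_bits=4):
        """Quantize then immediately dequantize, so downstream float ops
        see values restricted to the low-bit grid. This simulates the
        precision loss but saves no compute by itself."""
        qmax = 2 ** (num_bits - 1) - 1
        scale = x.abs().max().clamp(min=1e-8) / qmax
        q = torch.clamp(torch.round(x / scale), -qmax, qmax)  # integer grid
        return q * scale                                      # back to float

Getting a real speedup from quantize -> conv -> dequantize would require integer conv kernels (for example, a quantized inference backend), which a plain float nn.Conv2d does not provide.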

I hope to get your reply. Thanks!

Threshold per layer

[screenshot attached: formula (4) showing the per-layer threshold tc]

Did you implement the threshold tc mentioned in formula (4) in this repo?
I think the line below only implements Δ:

d = self.gt(torch.sigmoid(self.alpha*(self.bn(Yp)-self.threshold)) - 0.5 * torch.ones_like(Yp))
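
For illustration, a learnable per-layer threshold tc might look like the sketch below. This is my reading of formula (4), not code from the repo:

    import torch
    import torch.nn as nn

    class PerLayerThreshold(nn.Module):
        """Hypothetical per-layer learnable threshold tc; the quoted line
        above appears to use a fixed self.threshold (Δ) instead."""

        def __init__(self, init=0.0, alpha=2.0):
            super().__init__()
            self.tc = nn.Parameter(torch.tensor(init))  # learned per layer
            self.alpha = alpha

        def forward(self, Yp_bn):
            # Soft gate during training; a step function (like self.gt in
            # the quoted line) would binarize it in the forward pass.
            return torch.sigmoid(self.alpha * (Yp_bn - self.tc))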

This implementation does not SKIP the computation.

According to the code here, the implementation of CGConv2d does not skip the computation of Y_r, as proposed in your paper.

In fact, this implementation computes Y and Y_p separately and produces the result as Y*d + Y_p*(1-d). This increases the computation, right? So how do you actually skip and save computation?
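
For reference, my understanding of the dense simulation amounts to the sketch below (my own paraphrase, not the repo's code); the savings would only materialize on kernels or hardware that consult d before computing the conditional path:

    import torch

    def gated_output_simulated(Y, Y_p, d):
        """Dense simulation: both the full output Y and the partial
        output Y_p are computed, then blended by the decision map d.
        Functionally correct, but no FLOPs are skipped in this form."""
        return Y * d + Y_p * (1.0 - d)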

Thanks!
