
pytorchdiscreteflows

The discrete normalizing flow code was originally taken and modified from https://github.com/google/edward2/blob/master/edward2/tensorflow/layers/discrete_flows.py and https://github.com/google/edward2/blob/master/edward2/tensorflow/layers/utils.py, which implement the method introduced in the paper "Discrete Flows: Invertible Generative Models of Discrete Data": https://arxiv.org/abs/1905.10347

The demo file, MADE, and MLP were taken and modified from: https://github.com/karpathy/pytorch-normalizing-flows

To my knowledge, as of March 10th, 2020, this is the only working demo of discrete normalizing flows in PyTorch. The code in edward2 (implemented in TF2 and Keras) lacks any tutorials. Since the release of this repo, demo code for reproducing Figure 2 using edward2 has been shared here.

To use this package, clone the repo, satisfy the package requirements below, then run DiscreteFlowDemo.ipynb. If this works, you can run Figure2Replication.ipynb, where I fail to replicate Figure 2 and other results from the Discrete Flows paper.

Requirements:

  • Python 3.0+
  • PyTorch 1.2.0+
  • NumPy 1.17.2+

Implementation details: NB: following Andrej Karpathy's notation, flow.reverse() goes from the latent space to the data, and flow.forward() goes from the data to the latent space. This is the inverse of some other implementations, including the original TensorFlow one. The package implements bipartite and autoregressive discrete normalizing flows, and also includes an implementation of MADE and a simple MLP.
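To make the direction convention concrete, here is a minimal, self-contained sketch of a single location-only discrete flow layer in the spirit of the paper (z = (x + loc) mod K). The class and parameter names are illustrative assumptions rather than this repo's API; only the forward/reverse orientation follows the description above.

```python
import torch
import torch.nn as nn

class ToyDiscreteFlowLayer(nn.Module):
    """Illustrative location-only discrete flow over K classes.
    forward(): data -> latent; reverse(): latent -> data,
    matching the (Karpathy-style) convention used in this repo."""

    def __init__(self, num_classes: int, seq_len: int):
        super().__init__()
        self.K = num_classes
        # One learned shift per position (hypothetical; the real layers
        # predict shifts with an autoregressive or bipartite network).
        self.loc = nn.Parameter(torch.zeros(seq_len))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # data -> latent: z = (x + loc) mod K
        loc = self.loc.round().long()  # hard integer shift, for illustration
        return (x + loc) % self.K

    def reverse(self, z: torch.Tensor) -> torch.Tensor:
        # latent -> data: x = (z - loc) mod K
        loc = self.loc.round().long()
        return (z - loc) % self.K

flow = ToyDiscreteFlowLayer(num_classes=8, seq_len=2)
x = torch.randint(0, 8, (4, 2))          # a batch of discrete data
z = flow.forward(x)                      # data -> latent
assert torch.equal(flow.reverse(z), x)   # latent -> data inverts exactly
```

This toy version uses a fixed rounded shift purely to show the invertibility and the orientation of forward vs. reverse; the actual layers produce one-hot shifts from a network and rely on a straight-through gradient estimator to train through the discretization.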

Something I didn't know from reading the paper or the edward2 code is that the prior distribution itself is learnt during training. This makes the model more expressive, and it is acceptable because the prior is a factorized distribution that remains simple enough to sample from and work with.
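As a minimal sketch of what such a learned factorized prior can look like (the names and shapes here are my assumptions, not the repo's code): one trainable logit vector per position gives independent categoricals that stay trivial to sample from and to evaluate.

```python
import torch
import torch.nn as nn

class FactorizedCategoricalPrior(nn.Module):
    """Learned factorized prior p(z) = prod_i Cat(z_i | softmax(logits_i))."""

    def __init__(self, seq_len: int, num_classes: int):
        super().__init__()
        # Trainable logits: the prior is learned jointly with the flow.
        self.logits = nn.Parameter(torch.zeros(seq_len, num_classes))

    def log_prob(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, seq_len) integer latents -> (batch,) log-probabilities
        log_p = torch.log_softmax(self.logits, dim=-1)        # (seq_len, K)
        per_pos = log_p[torch.arange(z.shape[1]), z]          # (batch, seq_len)
        return per_pos.sum(-1)

    def sample(self, batch_size: int) -> torch.Tensor:
        probs = torch.softmax(self.logits, dim=-1)
        return torch.distributions.Categorical(probs).sample((batch_size,))

prior = FactorizedCategoricalPrior(seq_len=2, num_classes=8)
z = prior.sample(16)                  # (16, 2) latent samples
loss = -prior.log_prob(z).mean()      # the NLL term minimized in training
```

Minimizing the negative log-likelihood of z = flow.forward(x) under this prior sends gradients into the prior's logits as well as the flow's parameters, which is what makes the prior itself learned.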

TODOs:

  • Ensure that the scaling functionality works (this should not matter for reproducing the first few figures); see the sketch after this list for why discrete scaling is delicate.
  • Reproduce the figures from the original paper (I have been unable to do this thus far...).
  • Allow the MADE autoregressive flow to have a non-natural ordering.
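For context on the scaling TODO above: in the paper, the full transform is z = (loc + scale * x) mod K, which is only invertible when scale is coprime with K, and inversion multiplies by the modular multiplicative inverse of scale. A small illustrative sketch (the function names are mine, not the repo's):

```python
import torch

def scale_forward(x: torch.Tensor, loc: int, scale: int, K: int) -> torch.Tensor:
    # z = (loc + scale * x) mod K; a bijection iff gcd(scale, K) == 1
    return (loc + scale * x) % K

def scale_reverse(z: torch.Tensor, loc: int, scale: int, K: int) -> torch.Tensor:
    # x = scale^{-1} * (z - loc) mod K, where scale^{-1} is the modular
    # multiplicative inverse (pow(a, -1, K) needs Python 3.8+)
    inv = pow(scale, -1, K)
    return (inv * (z - loc)) % K

K = 8
x = torch.randint(0, K, (10,))
z = scale_forward(x, loc=3, scale=5, K=K)   # gcd(5, 8) == 1, so invertible
assert torch.equal(scale_reverse(z, loc=3, scale=5, K=K), x)
# With scale=2 (gcd(2, 8) == 2) the map is not a bijection, and
# pow(2, -1, 8) raises ValueError: the transform cannot be inverted.
```

Getting this constraint right (and trainable) is presumably why the scale path is flagged as unverified.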

Failed replication of Figure 2 (mixture of Gaussians):

Figure 2 in the paper looks like this:

[Image: PaperFigure]

Meanwhile, in my attempted replication (check out Figure2Replication.ipynb), where I use the same data and factorized distribution, my generated outputs are:

[Image: AnalyzeTraining]

And sampling from the latent space and running a reverse pass from z -> x through the trained model gives:

[Image: GenSamples]
