lollcat / fab-torch
Flow Annealed Importance Sampling Bootstrap (FAB). ICLR 2023.
License: MIT License
Use more ideas from the prioritised replay paper: https://arxiv.org/pdf/1511.05952.pdf
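One idea from the prioritised replay paper is sampling buffer entries with probability proportional to a priority raised to an exponent, with importance-sampling correction weights. A minimal sketch, assuming the priorities are stored log weights; all names here are illustrative, not the repo's API:

```python
import torch

# Hypothetical sketch of priority-proportional sampling (Schaul et al., 2016).
# `log_priorities` could be the AIS log importance weights stored alongside
# each sample; `alpha` controls how strongly priorities are used.
def sample_prioritised(samples: torch.Tensor,
                       log_priorities: torch.Tensor,
                       batch_size: int,
                       alpha: float = 0.5):
    # P(i) proportional to priority_i ** alpha; work in log-space for stability.
    probs = torch.softmax(alpha * log_priorities, dim=0)
    idx = torch.multinomial(probs, batch_size, replacement=True)
    # Importance-sampling correction weights, normalised by their max.
    is_weights = 1.0 / (probs[idx] * samples.shape[0])
    is_weights = is_weights / is_weights.max()
    return samples[idx], is_weights
```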
Add linting, typing and documentation for all functions
Create BNN target problem
See here for the structure that we are aiming to be able to fit easily into.
Following this, Vincent can plug this module into his code to run the Alanine Dipeptide example.
PR #75 has made changes that may affect the results of these experiments.
I re-ran the notebooks to make sure they were working nicely.
Additionally, the full experiments should be re-run to make sure they still produce the expected results.
For the buffer, we can shuffle samples between epochs when forming the minibatches.
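A minimal sketch of that per-epoch shuffling, assuming the buffer is a plain tensor of samples (the layout here is illustrative):

```python
import torch

# Reshuffle the stored samples once per epoch, then iterate over contiguous
# minibatches, so each epoch sees the buffer in a fresh random order.
def minibatches(buffer: torch.Tensor, batch_size: int):
    perm = torch.randperm(buffer.shape[0])
    for start in range(0, buffer.shape[0], batch_size):
        yield buffer[perm[start:start + batch_size]]
```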
Replicate the results from the "Bootstrap Your Flow" paper, and add clear example notebooks for these.
Currently, we have manually placed points on the distribution's modes. Additionally, we can create a 2D test set via MCMC, and then sample pairs of dimensions from it for the higher-dimensional Many Well problems, to obtain samples approximately from p(x).
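A sketch of that test-set construction: run a simple random-walk Metropolis chain on an (assumed, toy) target log-density, then slice out pairs of dimensions as approximate 2D marginals of p(x). The sampler below is a generic stand-in, not the repo's transition operator:

```python
import torch

# Random-walk Metropolis over a batch of chains. `log_prob` maps a batch of
# points (n, d) to log-densities (n,); names here are illustrative.
def rw_metropolis(log_prob, x0: torch.Tensor, n_steps: int, step: float = 0.5):
    x = x0.clone()
    lp = log_prob(x)
    samples = []
    for _ in range(n_steps):
        prop = x + step * torch.randn_like(x)
        lp_prop = log_prob(prop)
        # Accept with probability min(1, p(prop)/p(x)).
        accept = torch.rand(x.shape[0]).log() < (lp_prop - lp)
        x = torch.where(accept[:, None], prop, x)
        lp = torch.where(accept, lp_prop, lp)
        samples.append(x.clone())
    return torch.cat(samples)

# Approximate 2D marginal for dimensions (i, j):
# pair_samples = rw_metropolis(log_prob, x0, 1000)[:, [i, j]]
```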
I would like to play around with the experiments for the alanine dipeptide.
In order to use the train.py file in experiments/aldp, I need the val.pt file that is used for evaluate_aldp.
It would be really helpful if this data could also be provided, so that I can run the code on my local machine.
Fix bug in geometric spacing for AIS. This doesn't currently affect any of the experiments, as they use linear spacing.
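For reference, one common convention for a geometric AIS schedule (not necessarily the repo's intended one): after beta_0 = 0, the remaining betas are log-uniformly spaced between an assumed beta_min and 1:

```python
import math
import torch

# Illustrative geometric annealing schedule: beta_0 = 0, then betas spaced
# log-uniformly from beta_min up to 1, so successive ratios are constant.
def geometric_betas(n_intermediate: int, beta_min: float = 1e-3) -> torch.Tensor:
    betas = torch.logspace(math.log10(beta_min), 0.0, steps=n_intermediate + 1)
    return torch.cat([torch.zeros(1), betas])
```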
Currently there are many decisions that can be made in the algorithm - these should be benchmarked in simple tests to get a good idea of their effects. This includes:
Testing various versions of the loss
Use exponential moving average of normalisation constant
When calculating the FAB loss, we can (1) use the unnormalised log weights returned by AIS, as this still gives an expectation proportional to the alpha divergence with alpha=2; (2) normalise using the current batch of weights; or (3) use an exponential moving average of the normalisation constant, calculated during training.
Currently we are doing (2), but it may be better to do (3), and it is worth comparing the performance of all three approaches.
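A minimal sketch of option (3), assuming the per-batch estimate log Z ≈ logsumexp(log_w) − log N; the class and its `decay` parameter are illustrative, not the repo's implementation:

```python
import math
import torch

# Exponential moving average of the log normalisation constant, used to
# normalise AIS log weights in the loss.
class LogZEMA:
    def __init__(self, decay: float = 0.99):
        self.decay = decay
        self.log_z = None

    def update(self, log_w: torch.Tensor) -> torch.Tensor:
        # Per-batch estimate: log Z ~= logsumexp(log_w) - log N.
        batch_log_z = torch.logsumexp(log_w, dim=0) - math.log(log_w.shape[0])
        if self.log_z is None:
            self.log_z = batch_log_z
        else:
            self.log_z = self.decay * self.log_z + (1 - self.decay) * batch_log_z
        return log_w - self.log_z  # normalised log weights
```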
Testing various transition operators
Performance bottlenecks
Which parts of FAB are the slowest - can we add JIT to speed these up?
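One low-effort experiment along these lines: script a candidate hot function with torch.jit and compare against eager mode. The transition kernel below is a toy stand-in, not FAB's actual code:

```python
import torch

# Toy Langevin-like transition step against a standard-normal target, used
# only as a stand-in hot function to try torch.jit.script on.
def langevin_like_step(x: torch.Tensor, step_size: float) -> torch.Tensor:
    grad = -x  # gradient of the standard-normal log-density
    return x + step_size * grad + (2 * step_size) ** 0.5 * torch.randn_like(x)

scripted_step = torch.jit.script(langevin_like_step)
```

Timing both versions over many calls (e.g. with `timeit`) would show whether JIT is worth applying to the real transition operators.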
The following improvements can be made to the GMM problem:
Setting set_eval_mode=False results in the step size being tuned even if adjust_step_size was initially set to False.
Setup basic repository structure.
Will add more detailed notes to this issue as I go along.
For the Many Well problem, allow evaluation of samples and log weights without having to specify log_prob_fn.
Make it easy for user to enter criterion for samples to be valid.
Automatically filter invalid samples from the buffer.
By default this can be if the samples are out of bounds, or if the target/flow log prob is infinite/NaN.
Currently this is done in the buffer directly, but it would be better to expose the option to the user so that they can control it, and so that its effects are clear.
For alanine dipeptide this could be used to optionally filter based on chirality.
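A minimal sketch of such a user-supplied validity criterion applied before samples enter the buffer. The function names, arguments, and the default bound are illustrative, not the repo's API:

```python
import torch

# Default criterion mirroring the text: reject out-of-bounds points and
# non-finite flow (log_q) or target (log_p) log probabilities.
def default_is_valid(x, log_q, log_p, bound: float = 1e4):
    in_bounds = (x.abs() <= bound).all(dim=-1)
    finite = torch.isfinite(log_q) & torch.isfinite(log_p)
    return in_bounds & finite

# Filter a candidate batch before adding it to the buffer; the user could
# pass their own `is_valid` (e.g. a chirality check for alanine dipeptide).
def filter_for_buffer(x, log_q, log_p, is_valid=default_is_valid):
    mask = is_valid(x, log_q, log_p)
    return x[mask], log_q[mask], log_p[mask]
```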
Add the following alternative losses:
Run tests on double well / many well & GMM problems.
Look into using replay memory (saving samples, log weights and target log probs in a large data structure and re-using them), and/or PPO-style re-use of samples; using each sample only once, as in the current approach, seems very inefficient.
Configs for dw4 and Many Well should be made cleaner (use_buffer and prioritised_buffer).