lightonai / dfa-scales-to-modern-deep-learning

Study on the applicability of Direct Feedback Alignment to neural view synthesis, recommender systems, geometric learning, and natural language processing.

License: MIT License


dfa-scales-to-modern-deep-learning's Introduction

Direct Feedback Alignment Scales to Modern Deep Learning Tasks and Architectures


Code for our paper Direct Feedback Alignment Scales to Modern Deep Learning Tasks and Architectures.

We study the applicability of Direct Feedback Alignment (DFA) to neural view synthesis, recommender systems, geometric learning, and natural language processing. At variance with common beliefs, we show that challenging tasks can be tackled in the absence of weight transport.
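To make the idea concrete, here is a minimal NumPy sketch of Direct Feedback Alignment on a toy regression task (this is an illustrative toy example, not the repository's TinyDFA implementation): the output error is projected to the hidden layer through a fixed random matrix `B` instead of through the transpose of the forward weights, so no weight transport is needed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn y = sin(x).
X = rng.uniform(-3.0, 3.0, size=(256, 1))
Y = np.sin(X)

# Two-layer MLP: h = tanh(x W1), yhat = h W2.
n_hidden = 64
W1 = rng.normal(0.0, 0.5, size=(1, n_hidden))
W2 = rng.normal(0.0, 0.5, size=(n_hidden, 1))

# Fixed random feedback matrix: in DFA, the output error reaches the
# hidden layer through B instead of through W2.T (no weight transport).
B = rng.normal(0.0, 0.5, size=(1, n_hidden))

lr = 0.01
losses = []
for _ in range(1000):
    H = np.tanh(X @ W1)        # hidden activations, shape (256, 64)
    Yhat = H @ W2              # network output, shape (256, 1)
    e = Yhat - Y               # output error
    losses.append(float(np.mean(e ** 2)))

    W2 -= lr * H.T @ e / len(X)       # output layer: exact gradient
    dH = (e @ B) * (1.0 - H ** 2)     # hidden update via B; tanh' = 1 - tanh^2
    W1 -= lr * X.T @ dH / len(X)

print(f"MSE: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Despite the feedback path being random and fixed, the forward weights tend to align with it during training, which is why the hidden-layer updates remain useful.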

Reproducing our results

  • Instructions for reproduction are given within each task folder, in the associated README.md file.

Requirements

  • A requirements.txt file is available at the root of this repository, specifying the required packages for all of our experiments;
  • Our DFA implementation, TinyDFA, is pip-installable: from the TinyDFA folder, run pip install .;
  • tsnecuda may require installation from source: see the tsne-cuda repository for details;
  • Neural rendering datasets can be found on the NeRF website—other datasets will be automatically fetched.
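The steps above can be collected into a short setup sketch (the clone URL assumes the standard GitHub layout for this repository; adjust paths as needed):

```shell
# Clone the repository and enter it.
git clone https://github.com/lightonai/dfa-scales-to-modern-deep-learning.git
cd dfa-scales-to-modern-deep-learning

# Install the packages used across all experiments.
pip install -r requirements.txt

# Install the TinyDFA implementation from its folder.
pip install ./TinyDFA
```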

Citation

If you find this code and our findings useful in your research, please consider citing:

@article{launay2020dfascaling,
  title={Direct Feedback Alignment Scales to Modern Deep Learning Tasks and Architectures},
  author={Launay, Julien and Poli, Iacopo and Boniface, Fran{\c{c}}ois and Krzakala, Florent},
  journal={arXiv preprint arXiv:2006.12878},
  year={2020}
}

About LightOn/LightOn Cloud

LightOn develops light-based technology for large-scale artificial intelligence computations. Our ambition is to significantly reduce the time and energy required to make sense of the world around us.

Please visit https://cloud.lighton.ai/ for more information on accessing our technology.

dfa-scales-to-modern-deep-learning's People

Contributors

fraboniface, iacolippo, llucid-97


dfa-scales-to-modern-deep-learning's Issues

Need better instructions for installing dependencies

Congratulations on your paper! I was trying to play around a bit and found that the dependencies are not easy to install (at least on arm64):

  1. The requirements file is missing an `=` sign on the line that refers to seaborn.
  2. The package sentencepiece==1.9.0 is not available via pip on arm64.
  3. torch==1.5.0+cu101 is not available via pip.

The first two problems are easy to fix, and so is the third, but having hit three hiccups while installing only half of the dependencies listed in requirements.txt, I decided to file this issue. Please add better instructions for installing the dependencies or, if possible, provide Docker images (arm64-compatible) to resolve all of these hassles in one go.

Where do I get `bpe_models/wikitext-2.bpe.32000`?

I'm trying to run train_lm.py to replicate Table 5 in your paper. For the simple baseline experiment, python train_lm.py --gpu_id 0 --beta2 0.98, I am missing the file bpe_models/wikitext-2.bpe.32000. How do I download or generate this file? Also, the default --dataset is 'wikitext103', not 'wikitext2'; is that compatible with bpe_models/wikitext-2.bpe.32000?

Thanks! And cool project!
