
NAS Without Training

This repository contains code for replicating our paper, NAS Without Training.

Setup

  1. Download the datasets.
  2. Download NAS-Bench-201.
  3. Install the requirements in a conda environment with conda env create -f environment.yml.

We also refer the reader to instructions in the official NAS-Bench-201 README.

Reproducing our results

To reproduce our results:

conda activate nas-wot
./reproduce.sh 3 # average accuracy over 3 runs
./reproduce.sh 500 # average accuracy over 500 runs (this will take longer)

Each command will finish by calling process_results.py, which will print a table. ./reproduce.sh 3 should print the following table:

| Method | Search time (s) | CIFAR-10 (val) | CIFAR-10 (test) | CIFAR-100 (val) | CIFAR-100 (test) | ImageNet16-120 (val) | ImageNet16-120 (test) |
|---|---|---|---|---|---|---|---|
| Ours (N=10) | 1.75 | 89.50 +- 0.51 | 92.98 +- 0.82 | 69.80 +- 2.46 | 69.86 +- 2.21 | 42.35 +- 1.19 | 42.38 +- 1.37 |
| Ours (N=100) | 17.76 | 87.44 +- 1.45 | 92.27 +- 1.53 | 70.26 +- 1.09 | 69.86 +- 0.60 | 43.30 +- 1.62 | 43.51 +- 1.40 |

./reproduce.sh 500 will produce the following table:

| Method | Search time (s) | CIFAR-10 (val) | CIFAR-10 (test) | CIFAR-100 (val) | CIFAR-100 (test) | ImageNet16-120 (val) | ImageNet16-120 (test) |
|---|---|---|---|---|---|---|---|
| Ours (N=10) | 1.67 | 88.61 +- 1.58 | 91.58 +- 1.70 | 67.03 +- 3.01 | 67.15 +- 3.08 | 39.74 +- 4.17 | 39.76 +- 4.39 |
| Ours (N=100) | 17.12 | 88.43 +- 1.67 | 91.24 +- 1.70 | 67.04 +- 2.91 | 67.12 +- 2.98 | 40.68 +- 3.41 | 40.67 +- 3.55 |

To try different sample sizes, simply change the --n_samples argument in the call to search.py, and update the list of sample sizes on the corresponding line of process_results.py.
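For illustration only, the edit inside process_results.py might look like this (the variable name below is a guess, not the repository's actual identifier; check the file for the real line):

```python
# Hypothetical: the list of sample sizes that process_results.py aggregates.
# Add whatever value you passed to search.py via --n_samples (here, 50).
sample_sizes = [10, 50, 100]
```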

Note that search times may vary from the reported results depending on your hardware.
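The tables above report the mean and standard deviation of accuracy over runs. A minimal sketch of that aggregation (a hypothetical helper for illustration, not the repository's actual process_results.py code):

```python
import statistics

def summarise(accuracies):
    """Format per-run accuracies as 'mean +- std', as printed in the tables."""
    mean = statistics.mean(accuracies)
    # Sample standard deviation; zero when there is only a single run.
    std = statistics.stdev(accuracies) if len(accuracies) > 1 else 0.0
    return f"{mean:.2f} +- {std:.2f}"

print(summarise([89.2, 90.1, 89.3]))  # e.g. "89.53 +- 0.49"
```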

Plotting histograms

In order to plot the histograms in Figure 1 of the paper, run:

python plot_histograms.py

to produce the histograms shown in Figure 1 of the paper.

The code is licensed under the MIT licence.

Acknowledgements

This repository makes liberal use of code from the AutoDL library. We also rely on NAS-Bench-201.

Citing us

If you use or build on our work, please consider citing us:

@misc{mellor2020neural,
    title={Neural Architecture Search without Training},
    author={Joseph Mellor and Jack Turner and Amos Storkey and Elliot J. Crowley},
    year={2020},
    eprint={2006.04647},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

