
automl / svge


Smooth Variational Graph Embeddings for Efficient Neural Architecture Search

License: Apache License 2.0

automl neural-architecture-search nas bayesian-optimization variational-autoencoders graph-neural-networks

Introduction

Smooth Variational Graph Embeddings for Efficient Neural Architecture Search

Jovita Lukasik, David Friede, Arber Zela, Frank Hutter, Margret Keuper.
IJCNN 2021

Abstract

Neural architecture search (NAS) has recently been addressed from various directions, including discrete, sampling-based methods and efficient differentiable approaches. While the former are notoriously expensive, the latter suffer from imposing strong constraints on the search space. Architecture optimization from a learned embedding space, for example through graph neural network based variational autoencoders, builds a middle ground and leverages advantages from both sides. Such approaches have recently shown good performance on several benchmarks. Yet, their stability and predictive power heavily depend on their capacity to reconstruct networks from the embedding space. In this paper, we propose a two-sided variational graph autoencoder, which allows us to smoothly encode and accurately reconstruct neural architectures from various search spaces. We evaluate the proposed approach on neural architectures defined by the ENAS approach, the NAS-Bench-101 and the NAS-Bench-201 search spaces and show that our smooth embedding space allows us to directly extrapolate the performance prediction to architectures outside the seen domain (e.g. with more operations). Thus, it facilitates predicting good network architectures even without expensive Bayesian optimization or reinforcement learning.

Installation

To use this code, install the requirements:

pip install -r requirements.txt
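If you prefer to keep the dependencies isolated, a virtual environment works as usual (a minimal sketch; the environment name svge-env is arbitrary, and the install step assumes it is run from the repository root):

```shell
# Create and activate a virtual environment (the name is arbitrary).
python3 -m venv svge-env
. svge-env/bin/activate
# Install the pinned requirements when run from the repository root.
if [ -f requirements.txt ]; then
    pip install -r requirements.txt
fi
```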

Datasets

You can download the prepared ENAS dataset here and place it in datasets/ENAS.
You can download the prepared NAS-Bench-101 dataset here and place it in datasets/nasbench101.
You can download the prepared NAS-Bench-201 dataset here and place it in datasets/nasbench201.

Also download nasbench_only108.tfrecord and save it in datasets/nasbench101, and save NAS-Bench-201-v1_0-e61699.pth in datasets/nasbench201.
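The commands below sketch the directory layout the scripts expect; create the folders first, then move the downloads into place:

```shell
# Folders read by the training scripts.
mkdir -p datasets/ENAS datasets/nasbench101 datasets/nasbench201
# After downloading, the two benchmark files should sit at:
#   datasets/nasbench101/nasbench_only108.tfrecord
#   datasets/nasbench201/NAS-Bench-201-v1_0-e61699.pth
ls datasets
```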

Pretrained VAE models:

Download the pretrained state dicts and place them in the folder state_dicts.
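For example (a sketch; the subfolder names match the --path_state_dict values used by the extrapolation commands later in this README):

```shell
# Folder the evaluation scripts load checkpoints from; the subfolder
# names follow the --path_state_dict values used below.
mkdir -p state_dicts/SVGE_acc_NB101 state_dicts/SVGE_acc_ENAS
ls state_dicts
```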

Run experiments from the paper

VAE training:

export PYTHONPATH=$PWD
python Training_VAE/train_svge.py --model SVGE --data_search_space NB101 

The model can be changed to DGMG, and the data_search_space to NB201 or ENAS.
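The full grid of runs can be generated with a small loop (a sketch that only prints the commands; pipe the output to sh to execute them):

```shell
# One training command per model / search-space combination,
# using the flag values listed above.
cmds=$(for model in SVGE DGMG; do
  for space in NB101 NB201 ENAS; do
    echo "python Training_VAE/train_svge.py --model $model --data_search_space $space"
  done
done)
echo "$cmds"
```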

Performance Prediction:

export PYTHONPATH=$PWD
python Performance_Prediction/train_surrogate_model.py --model SVGE_acc --data_search_space NB101 

Extrapolation:

python Extrapolation_Ability/eval_extrapolation.py --model SVGE_acc --data_search_space NB101 --path_state_dict state_dicts/SVGE_acc_NB101/

For ENAS, change --data_search_space to ENAS and --path_state_dict to state_dicts/SVGE_acc_ENAS/:

python Extrapolation_Ability/eval_extrapolation.py --model SVGE_acc --data_search_space ENAS --path_state_dict state_dicts/SVGE_acc_ENAS/

To train the 12-layer ENAS architecture, cd Bayesian_Optimization and change the flat ENAS architecture representation in arcs_scores in fully_train_ENAS12.py, then run:

python fully_train_ENAS12.py

To train the best NAS-Bench-101 cell, include its adjacency matrix and node operations in Extrapolation_Ability/cell_training/train_cell.py. To train a given cell on CIFAR-10:

python train_cell.py --data_set cifar10

To train on ImageNet16-120, first download ImageNet16 from the NAS-Bench-201 repo and save it in Extrapolation_Ability/cell_training/data/:

python train_cell.py --data_set Imagenet16 --batch_size 256 --epochs 200 --val_portion 0.5

Bayesian Optimization

For performing Bayesian optimization, first follow the steps from the official D-VAE repository.
Then change to the directory Bayesian_Optimization and, for BO on NAS-Bench-101, run

./run_bo_Nb101.sh 

and

./run_bo.sh

to run BO on ENAS

Citation

@inproceedings{LukasikSVGe2021,
  author    = {Jovita Lukasik and
               David Friede and
               Arber Zela and
               Frank Hutter and
               Margret Keuper},
  title     = {Smooth Variational Graph Embeddings for Efficient Neural Architecture
               Search},
  booktitle = {International Joint Conference on Neural Networks, {IJCNN} 2021, Shenzhen,
               China, July 18-22, 2021},
  year      = {2021},
}



