
gb-gnn

This is the code for the paper titled Optimization and Generalization Analysis of Transduction through Gradient Boosting and Application to Multi-scale Graph Neural Networks (arXiv: https://arxiv.org/abs/2006.08550).

Dependencies

  • networkx==2.1
  • numpy==1.15.1
  • optuna==1.3.0
  • pytorch-ignite==0.3.0
  • scipy==1.1.0
  • torch==1.5.0
  • pytest==5.0.1 (for testing)

Preparation

Place the contents of https://github.com/tkipf/gcn/tree/master/gcn/data under lib/dataset/data/kipf/ (e.g., gcn/data/ind.citeseer.allx should be copied to lib/dataset/data/kipf/ind.citeseer.allx).
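
A minimal sketch of this step, assuming git is available and the tkipf/gcn repository keeps its current layout (the clone location /tmp/gcn is an arbitrary choice):

git clone https://github.com/tkipf/gcn.git /tmp/gcn
mkdir -p lib/dataset/data/kipf
cp /tmp/gcn/gcn/data/* lib/dataset/data/kipf/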

Testing

Unit test

PYTHONPATH=$PYTHONPATH:. pytest test/

Small experiment test (run on GPU device 0)

bash test.sh

Usage

Commands

GB-GNN-Adj

bash run.sh --device <gpu_id> --dataset <dataset> --min-layer <L_min> --max-layer <L_max> --aggregation-model adj

GB-GNN-Adj + Fine Tuning

bash run.sh --device <gpu_id> --dataset <dataset> --min-layer <L_min> --max-layer <L_max> --aggregation-model adj --fine-tune

GB-GNN-KTA

bash run.sh --device <gpu_id> --dataset <dataset> --min-layer <L_min> --max-layer <L_max> --aggregation-model kta

GB-GNN-KTA + Fine Tuning

bash run.sh --device <gpu_id> --dataset <dataset> --min-layer <L_min> --max-layer <L_max> [--n-weak-learners 40] --aggregation-model kta --fine-tune

In the main paper, we set the maximum number of weak learners to 40 instead of the default value of 100 due to memory constraints. To reproduce those results, set --n-weak-learners 40 as shown above.

GB-GNN-II

bash run.sh --device <gpu_id> --dataset <dataset> --min-layer <L_min> --max-layer <L_max> --aggregation-model ii

GB-GNN-II + Fine Tuning

bash run.sh --device <gpu_id> --dataset <dataset> --min-layer <L_min> --max-layer <L_max> --aggregation-model ii --fine-tune

Options

  • <gpu_id>: ID of the GPU to use. If -1 is given, the code runs on the CPU.
  • <dataset>: Dataset type. One of cora, citeseer, or pubmed.
  • <L_min>, <L_max>: The minimum and maximum numbers of hidden layers in the hyperparameter optimization search space. To fix the number of hidden layers to L, set L_min = L_max = L.
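
For example, the following invocation (the option values are illustrative) trains GB-GNN-Adj on Cora using GPU 0, fixing the number of hidden layers to 2:

bash run.sh --device 0 --dataset cora --min-layer 2 --max-layer 2 --aggregation-model adj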

Output

Each run creates an output directory whose name is the execution time in the form YYMMDD_HHMMSS. The directory contains the following files (not a comprehensive list):

  • acc.json: The accuracies on training, validation, and test datasets.
  • loss/: The transition of loss values of the best hyperparameter set on the training (train.npy), validation (validation.npy), and test (test.npy) datasets.
  • cosine.npy: The transition of cosine values between weak learners and negative gradient on the training dataset.
  • best_params.json: Chosen hyperparameters.

Accuracies, loss values, and cosine values are for the model with the best hyperparameter set.
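
For example, assuming a finished run produced a directory named 200101_120000 (the timestamp here is illustrative) and that numpy is installed, the results can be inspected as follows:

cat 200101_120000/acc.json
cat 200101_120000/best_params.json
python -c "import numpy as np; print(np.load('200101_120000/loss/train.npy'))"
python -c "import numpy as np; print(np.load('200101_120000/cosine.npy'))"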

Directory Structures

  • app: Experiment execution scripts.
  • lib: Implementation of models and their training and evaluation procedures.
  • analysis: Notebooks for post-processing experiment results.
  • test: Unit test code.

