snap-research / graphless-neural-networks
[ICLR 2022] Code for Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation (GLNN)
License: MIT License
@ShichangZh Hello, I would like to ask how the inference time reported in the paper is measured. I could not find the relevant code. Thank you!
Thanks for sharing the code! The random seed in train_teacher.py does not seem to work: every run of `python train_teacher.py --exp_setting tran --teacher SAGE --dataset cora` generates different results, even with the same seed. Accordingly, we cannot reproduce the exact results stated in the paper when running `bash experiments/sage_cpf.sh`. This seems to be a bug, since the point of a random seed is to make results reproducible. Could you please fix this?
Two different results with the same random seed 0 (`python train_teacher.py --exp_setting tran --teacher SAGE --dataset cora`):
The results of `bash experiments/sage_cpf.sh`, which differ from those in the paper:
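A common cause of this symptom, if it applies here, is that only one of the RNGs gets seeded: a SAGE training run draws from Python's `random`, NumPy, PyTorch, and (for neighbor sampling) DGL. A minimal single-process sketch of seeding them all, which is an assumption about the cause rather than a confirmed fix for this repo:

```python
import random

import numpy as np
import torch

def set_seed(seed: int) -> None:
    """Seed every RNG a training run may touch (single-process sketch)."""
    random.seed(seed)                 # Python's built-in RNG
    np.random.seed(seed)              # NumPy (shuffles, splits)
    torch.manual_seed(seed)           # PyTorch CPU RNG
    torch.cuda.manual_seed_all(seed)  # PyTorch GPU RNGs (no-op without CUDA)
    # cuDNN autotuning can pick different kernels per run; pin it down.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
    # If DGL's neighbor samplers are used (as in SAGE), also call dgl.seed(seed).

set_seed(0)
a = torch.randn(3)
set_seed(0)
b = torch.randn(3)
assert torch.equal(a, b)  # same seed, same draw
```

Even with all of the above, some CUDA ops (e.g. scatter-based aggregations) remain nondeterministic, so bitwise-identical results across runs are not always achievable on GPU.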
Hi! I am trying to start running the code, but I have encountered the following error, which I can't figure out, when trying to load the Cora .npz file.
```
Using backend: pytorch
WARNING:root:The OGB package is out of date. Your version is 1.3.3, while the latest version is 1.3.5.
Traceback (most recent call last):
  File "/home/aaron/anaconda3/envs/glnn/lib/python3.6/site-packages/numpy/lib/npyio.py", line 460, in load
    return pickle.load(fid, **pickle_kwargs)
_pickle.UnpicklingError: invalid load key, '\x0a'.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "train_teacher.py", line 346, in <module>
    main()
  File "train_teacher.py", line 329, in main
    score = run(args)
  File "train_teacher.py", line 210, in run
    labelrate_val=args.labelrate_val,
  File "/home/aaron/graphless-neural-networks/dataloader.py", line 49, in load_data
    kwargs["labelrate_val"],
  File "/home/aaron/graphless-neural-networks/dataloader.py", line 85, in load_cpf_data
    data = load_npz_to_sparse_graph(data_path)
  File "/home/aaron/graphless-neural-networks/dataloader.py", line 526, in load_npz_to_sparse_graph
    with np.load(file_name, allow_pickle=True) as loader:
  File "/home/aaron/anaconda3/envs/glnn/lib/python3.6/site-packages/numpy/lib/npyio.py", line 463, in load
    "Failed to interpret file %s as a pickle" % repr(file))
OSError: Failed to interpret file PosixPath('/home/aaron/graphless-neural-networks/data/cora.npz') as a pickle
```
I think it has something to do with how the file was saved under a different numpy version? I used the exact same requirements.txt file for the conda environment.
Thanks!
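This error usually means the file is not a real .npz at all: `np.load` first tries to open it as a zip archive, fails, then falls back to pickle, and a load key of `'\x0a'` (a newline) suggests a text file, such as an HTML error page or a Git LFS pointer, was saved under the name cora.npz. A quick check (`diagnose_npz` is a hypothetical helper, not part of the repo):

```python
from pathlib import Path

def diagnose_npz(path):
    """Hypothetical helper: a valid .npz is a zip archive and must start
    with the b'PK' magic bytes; a Git LFS pointer or an HTML error page
    saved under the same name starts with plain text instead."""
    head = Path(path).read_bytes()[:4]
    if head[:2] == b"PK":
        return "looks like a real .npz archive"
    return f"not a zip archive; first bytes are {head!r} -- re-download the file"
```

If the file turns out to be text, re-downloading the dataset (or running `git lfs pull`, if the data happens to be stored with LFS) is the usual fix, rather than changing numpy versions.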
As mentioned in the paper, the inductive inference time of GLNN is compared to other GNN inference-acceleration methods on 10 randomly chosen nodes, but the code does not include these experiments. Could you please provide more details on how the inference time is measured and compared?
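While the paper's exact protocol is not in the repo, a generic way to time inference is to warm up first and then take the median over repeated runs. Here `time_inference` and its parameters are illustrative, not the authors' code:

```python
import time
import statistics

def time_inference(fn, n_warmup=10, n_runs=100):
    """Median wall-clock latency of fn() in milliseconds.

    Illustrative sketch, not the authors' measurement code. On GPU you
    would additionally call torch.cuda.synchronize() before reading the
    clock, since CUDA kernels launch asynchronously.
    """
    for _ in range(n_warmup):          # discard warm-up runs (caches, JIT)
        fn()
    samples = []
    for _ in range(n_runs):
        t0 = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - t0) * 1000.0)
    return statistics.median(samples)  # median is robust to scheduler noise
```

For a GNN the timed callable would include neighbor fetching for the target nodes, whereas for an MLP like GLNN it is a single feature-matrix forward pass, which is where the reported speedup would come from.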
Hello, I tried to re-run CiteSeer under the transductive setting.
The seeds are 1, 2, 3, 4, 5.
I get an average of 71.22, which suggests my experimental setup is correct.
However, for min-cut I get:
0.7159
0.6828
0.7444
0.9163
0.5613
It is highly unstable.
Meanwhile, for GLNN, I get:
0.9457
0.9499
0.9519
0.9670
0.9278
So maybe the min-cut metric just happens to work well for GLNN while failing to capture the graph topology.
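To quantify the instability, the five runs above can be summarized with mean and sample standard deviation (plain Python, numbers copied from this thread):

```python
import statistics

# Accuracies reported above, seeds 1-5
min_cut = [0.7159, 0.6828, 0.7444, 0.9163, 0.5613]
glnn = [0.9457, 0.9499, 0.9519, 0.9670, 0.9278]

for name, scores in (("min-cut", min_cut), ("GLNN", glnn)):
    print(f"{name}: {statistics.mean(scores):.4f} +/- {statistics.stdev(scores):.4f}")
```

The min-cut scores swing by roughly an order of magnitude more than the GLNN scores across seeds, which supports the instability claim.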
There is an undefined function in your code (dataloader.py, line 257): rand_train_test_idx. I can't find this function in your code or in the imported packages. What is it?
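Helpers with this name appear in other graph-benchmark codebases as a uniform random node split; a hypothetical reimplementation (the signature, defaults, and behavior below are guesses, not the repo's actual code) could look like:

```python
import numpy as np

def rand_train_test_idx(label, train_prop=0.5, valid_prop=0.25, seed=0):
    """Hypothetical reimplementation, not the repo's actual helper:
    a uniform random split of all labeled nodes into train/valid/test."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(len(label))
    n_train = int(len(label) * train_prop)
    n_valid = int(len(label) * valid_prop)
    train_idx = perm[:n_train]
    valid_idx = perm[n_train : n_train + n_valid]
    test_idx = perm[n_train + n_valid :]
    return train_idx, valid_idx, test_idx
```

The three returned index arrays are disjoint and together cover all nodes, which is the property any drop-in replacement would need.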
The packages in requirements.txt are incomplete, and `bash ./prepare_env.sh` fails to install some of them.
I have a question about the function graph_split in the file utils.py.
According to the code, the tensors idx_test_ind and obs_idx_train may overlap.
```python
def graph_split(idx_train, idx_val, idx_test, rate, seed):
    idx_test_ind, idx_test_tran = idx_split(idx_test, rate, seed)

    idx_obs = torch.cat([idx_train, idx_val, idx_test_tran])
    N1, N2 = idx_train.shape[0], idx_val.shape[0]
    obs_idx_all = torch.arange(idx_obs.shape[0])
    obs_idx_train = obs_idx_all[:N1]
    obs_idx_val = obs_idx_all[N1 : N1 + N2]
    obs_idx_test = obs_idx_all[N1 + N2 :]

    return obs_idx_train, obs_idx_val, obs_idx_test, idx_obs, idx_test_ind
```
For example, let V = [0,1,2,3,4,5] be all nodes in the graph and idx_train = [1,2], idx_val = [3,4], idx_test = [0, 5].
Suppose that idx_test_ind = [0] and idx_test_tran = [5] after the function idx_split(). Then we have idx_obs = [1,2,3,4,5], N1=2, N2 = 2, and obs_idx_all = [0,1,2,3,4]. Hence, the resulting observed sets are obs_idx_train = [0,1], obs_idx_val = [2,3], obs_idx_test = [4].
This means that idx_test_ind and obs_idx_train both have the element 0, which contradicts the inductive scenario.
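For what it's worth, replaying the example in plain Python suggests the two sets may live in different index spaces, so the apparent overlap could be an artifact of comparing positions with node IDs. This reading is an assumption based on the variable names, not something confirmed by the repo:

```python
# The example from the report, in plain Python. Whether node 0 leaks into
# training hinges on which index space obs_idx_train lives in: the names
# suggest (assumption) that obs_idx_* are positions inside the re-indexed
# observed subgraph, not original node IDs.
idx_train, idx_val = [1, 2], [3, 4]
idx_test_tran, idx_test_ind = [5], [0]        # as in the example above

idx_obs = idx_train + idx_val + idx_test_tran  # [1, 2, 3, 4, 5]
obs_idx_train = list(range(len(idx_train)))    # [0, 1] -- positions, not IDs

# Mapping positions back to original node IDs:
original_ids = [idx_obs[i] for i in obs_idx_train]
assert original_ids == [1, 2]                  # node 0 never enters training
assert set(original_ids).isdisjoint(idx_test_ind)
```

If the training code indexes the re-indexed observed subgraph with obs_idx_*, there is no leakage; if it indexes the full graph with them, the reported overlap would be real.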
paper: https://openreview.net/forum?id=Cs3r5KLdoj
code: https://github.com/meettyj/NOSMOG
Massively overclaimed: it presents min-cut as its own proposal, and passes off several GNN-to-MLP issues as its own discoveries.