The code is adapted from https://github.com/tkipf/pygcn and modified according to our needs.
This repository provides the implementation to reproduce the results of the paper `Analysis of Convolutions, Non-linearity and Depth in Graph Neural Networks using Neural Tangent Kernel' (arXiv).
Install the required packages:
pip install -r requirement.txt

Then move into the source directory:
cd ntk_gcn
There are three possible actions:
- To get the NTK performance of depth=[1,2,4,8,16] on the real datasets Cora and Citeseer, or on DC-SBM, run the following script, changing the arguments accordingly:
Linear/ReLU GCN --
python train.py --dataset "cora" --gcn_linear 0 --gcn_skip 0 --adj_norm "row_norm"
Pass --dataset as "citeseer" or "dc_sbm" and --adj_norm as "col_norm", "sym_norm" or "unnorm"; for a Linear GCN, pass --gcn_linear 1.

Linear/ReLU Skip-PC or Skip-alpha --
python train.py --dataset "cora" --gcn_linear 0 --gcn_skip 1 --skip_form "gcn" --adj_norm "row_norm"
Similar arguments as above; for Skip-alpha, pass --skip_form "gcnii".
- To get kernels similar to the ones in the paper, additionally pass
--order_by_cls 1 --save_kernel 1
in the script; the kernel is then saved in the current working directory under the name 'dataset_norm_xxt_0_skip_form_depth.npy'. For DC-SBM results, pass --dataset "dc_sbm".
- The kernels can be loaded with NumPy and visualized as heatmaps.
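As an illustration, a saved kernel can be loaded and rendered as a heatmap roughly as follows. The file name below is a hypothetical instance of the naming pattern above, and a random stand-in kernel is saved first so the sketch runs on its own; with a real run, point kernel_path at the .npy file that train.py produced.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Hypothetical name following the 'dataset_norm_xxt_0_skip_form_depth.npy'
# pattern; substitute the file that train.py actually saved.
kernel_path = "cora_row_norm_xxt_0_gcn_2.npy"

# Random stand-in kernel so this sketch runs end to end;
# remove these two lines when using a real saved kernel.
rng = np.random.default_rng(0)
np.save(kernel_path, rng.random((100, 100)))

kernel = np.load(kernel_path)  # square node-by-node kernel matrix

plt.imshow(kernel, cmap="viridis")  # with --order_by_cls 1, block structure reflects classes
plt.colorbar(label="kernel value")
plt.savefig("kernel_heatmap.png", dpi=150)
```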
- To train a GCN of depth d, pass
--train_gcn 1 --layers d --csigma 1
along with the above arguments. Note that --csigma should be 1 for a Linear GCN and 2 for a ReLU GCN.