Towards Sparsification of Graph Neural Networks

A comparison between the SLR method and the sparse training method on GNN datasets.





Train and prune:

We use the SLR (Surrogate Lagrangian Relaxation) method for training and pruning.

1. Link prediction & SLR

Folder SLR_Link_Pred contains SLR training for link prediction, following the dense training -> reweighted training -> sparse training procedure.

2. Node classification & SLR

Folder SLR_Node_Class contains SLR training for node classification, following the dense training -> reweighted training -> sparse training procedure.

3. Graph Convolutional Networks for node classification & SLR

Folder SLR_GCN_Node_Class contains SLR training of GCNs for node classification, following the same dense training -> reweighted training -> sparse training procedure. The train-and-prune experiments are run on three datasets: Cora, PubMed, and CiteSeer. A minimal sketch of the three-phase procedure follows this list.
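
The sketch below runs the three-phase schedule end to end on a toy linear layer. It is a minimal illustration, not this repo's code: the quadratic penalty in phase 2 is a simplified stand-in for the actual SLR multiplier updates implemented in the SLR_* folders, and the model, data, and hyperparameters (sparsity, rho, epoch counts) are invented for the example.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-ins so the sketch runs end to end; in the repo these would be
# a GNN and a graph dataset (e.g. Cora, PubMed, or CiteSeer).
model = nn.Linear(32, 4)
x, y = torch.randn(256, 32), torch.randint(0, 4, (256,))
loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

def magnitude_mask(w: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Binary mask keeping the (1 - sparsity) largest-magnitude entries."""
    k = max(int(sparsity * w.numel()), 1)
    thresh = w.abs().view(-1).kthvalue(k).values
    return (w.abs() > thresh).float()

# Phase 1: dense training.
for _ in range(200):
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()

# Phase 2: reweighted training. A plain quadratic penalty pushes the
# soon-to-be-pruned weights toward zero; the repo's SLR method replaces
# this with surrogate Lagrangian multiplier updates.
mask = magnitude_mask(model.weight.data, sparsity=0.9)
rho = 1e-2  # hypothetical penalty strength
for _ in range(200):
    opt.zero_grad()
    penalty = rho * ((model.weight * (1 - mask)) ** 2).sum()
    (loss_fn(model(x), y) + penalty).backward()
    opt.step()

# Phase 3: sparse training. Hard-prune, then fine-tune under a fixed mask.
with torch.no_grad():
    model.weight.mul_(mask)
for _ in range(200):
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()
    with torch.no_grad():
        model.weight.mul_(mask)  # keep pruned weights at exactly zero

print(f"final weight sparsity: {(model.weight == 0).float().mean():.1%}")
```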

Sparse training:

We follow the same experimental setup as the RigL paper (Rigging the Lottery: Making All Tickets Winners), using weight magnitude as the drop criterion and weight gradient as the grow criterion; a sketch of one update step is shown below.
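
Below is a minimal sketch of one drop-and-grow update in the spirit of RigL, assuming the layer's weight tensor, its binary mask, and a dense gradient are already available. The function name rigl_step and the drop_frac parameter are illustrative, not names from this repo.

```python
import torch

@torch.no_grad()
def rigl_step(weight: torch.Tensor, mask: torch.Tensor,
              grad: torch.Tensor, drop_frac: float = 0.3) -> torch.Tensor:
    """One drop-and-grow update: drop the smallest-magnitude active
    weights and grow an equal number of inactive weights with the
    largest gradient magnitude, so overall sparsity is unchanged."""
    active = mask.bool()
    n_swap = int(drop_frac * active.sum().item())
    if n_swap == 0:
        return mask

    # Drop: among currently active weights, find the n_swap smallest |w|.
    mag = torch.where(active, weight.abs(),
                      torch.full_like(weight, float("inf")))
    drop_idx = torch.topk(mag.view(-1), n_swap, largest=False).indices

    # Grow: among currently inactive weights, find the n_swap largest |grad|.
    score = torch.where(active, torch.full_like(grad, -1.0), grad.abs())
    grow_idx = torch.topk(score.view(-1), n_swap, largest=True).indices

    mask.view(-1)[drop_idx] = 0
    mask.view(-1)[grow_idx] = 1
    weight.mul_(mask)                # zero out the dropped weights
    weight.view(-1)[grow_idx] = 0.0  # RigL initializes regrown weights at zero
    return mask
```

In the RigL schedule this update runs periodically during training with a decaying drop fraction, and the regrown (zero-initialized) weights are then trained under the new mask.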

Publication

If you use our code in your design, please cite our ICCD'22 paper:

@inproceedings{PengSpsGNN,
  title={Towards Sparsification of Graph Neural Networks},
  author={Peng, Hongwu and Gurevin, Deniz and Huang, Shaoyi and Geng, Tong and Jiang, Weiwen and Khan, Omer and Ding, Caiwen},
  booktitle={2022 IEEE 40th International Conference on Computer Design (ICCD)},
  year={2022}
}
