
GCN-Knowledge_distilation

This repository shows how to apply two types of knowledge distillation to a regression problem: feature-based knowledge distillation and response-based knowledge distillation. I use these papers as references: https://arxiv.org/pdf/1908.00858.pdf , https://arxiv.org/pdf/1412.6550.pdf
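The two schemes differ in which teacher signal the student imitates: response-based distillation matches the teacher's final regression output, while feature-based distillation (in the FitNets style) matches an intermediate representation through a small adapter layer. Below is a minimal sketch of both losses; the module names, layer sizes, and equal loss weighting are illustrative assumptions, not the repository's actual ASTGCN code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical teacher/student networks standing in for the ASTGCN models.
class TeacherNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.feature = nn.Linear(16, 64)   # intermediate "hint" layer
        self.head = nn.Linear(64, 1)       # regression output

    def forward(self, x):
        feat = F.relu(self.feature(x))
        return self.head(feat), feat

class StudentNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.feature = nn.Linear(16, 32)   # reduced-parameter "guided" layer
        self.regressor = nn.Linear(32, 1)
        self.adapter = nn.Linear(32, 64)   # maps student features to the teacher's feature size

    def forward(self, x):
        feat = F.relu(self.feature(x))
        return self.regressor(feat), self.adapter(feat)

teacher, student = TeacherNet(), StudentNet()
x, y = torch.randn(8, 16), torch.randn(8, 1)

with torch.no_grad():
    t_out, t_feat = teacher(x)
s_out, s_feat = student(x)

# Response-based KD: match the teacher's regression output.
response_loss = F.mse_loss(s_out, t_out)

# Feature-based KD: match an intermediate teacher representation.
feature_loss = F.mse_loss(s_feat, t_feat)

# Supervised loss against the ground truth.
task_loss = F.mse_loss(s_out, y)

total_loss = task_loss + response_loss + feature_loss
```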

Below is the best-tuned alpha parameter for the GCN model.

Alpha is the ratio between the teacher's output and the ground-truth output used as targets during knowledge distillation.

[Figure: Best Condition]
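Interpreting alpha this way, the distillation objective blends the loss against the teacher's prediction with the loss against the ground truth. A minimal sketch, assuming an MSE criterion (the repository's exact loss function may differ):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_out, teacher_out, target, alpha=0.5):
    # alpha weights the loss against the teacher's output,
    # (1 - alpha) weights the loss against the ground truth.
    kd_loss = F.mse_loss(student_out, teacher_out)
    gt_loss = F.mse_loss(student_out, target)
    return alpha * kd_loss + (1.0 - alpha) * gt_loss

# Illustrative usage with random tensors.
student_out = torch.randn(8, 1)
teacher_out = torch.randn(8, 1)
target = torch.randn(8, 1)
loss = distillation_loss(student_out, teacher_out, target, alpha=0.7)
```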

Below are the knowledge distillation results across the control groups:

Student model (reduced parameters), Teacher model (base model), and Distilled model (reduced parameters trained with knowledge distillation assistance).

[Figure: Best Condition 2]
