
a minor error in class DenseAtt(nn.Module) #35

Open
ruilialice opened this issue Nov 21, 2021 · 2 comments
Comments


ruilialice commented Nov 21, 2021

Thanks for the excellent work and the well-written code! However, there seems to be a minor error in class DenseAtt(nn.Module), which is used in the attention-based aggregation part.

If we set "--use-att 1 --local-agg 1", which means the algorithm uses the attention mechanism to update the node embeddings, class DenseAtt(nn.Module) is used.
However, something seems wrong at line 26 here.

According to formula (8) in Sec. 4.3 of the original paper, the attention weights should be computed with a softmax, but the code here seems to apply a sigmoid and then multiply by the adjacency matrix.
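To make the difference concrete, here is a minimal numpy sketch of the two weighting schemes; the function names, scores, and adjacency matrix are illustrative assumptions, not the repo's actual DenseAtt code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_weights(scores, adj):
    # What the code appears to do: elementwise sigmoid, then zero out
    # non-edges by multiplying with the adjacency matrix.
    # The resulting rows do NOT sum to 1.
    return sigmoid(scores) * adj

def softmax_weights(scores, adj):
    # What eq. (8) describes: a softmax over each node's neighbours.
    # Non-neighbours get -inf, so their weight is exactly 0,
    # and each row sums to 1.
    masked = np.where(adj > 0, scores, -np.inf)
    masked -= masked.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(masked)
    return e / e.sum(axis=1, keepdims=True)

scores = np.array([[ 0.5, 1.0, -0.2],
                   [ 1.0, 0.0,  2.0],
                   [-0.2, 2.0,  0.3]])
adj = np.array([[1, 1, 0],
                [1, 1, 1],
                [0, 1, 1]], dtype=float)

w_sig = sigmoid_weights(scores, adj)
w_soft = softmax_weights(scores, adj)
```

With softmax, each node's attention weights over its neighbours form a proper distribution; with sigmoid-times-adjacency, they are just independent gate values in (0, 1).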

@olayinkaajayi

Thank you for bringing this up. I just found this issue while going through the code. I was hoping there might be a reason for this.

Also, since we are working with local aggregation, the sigmoid (which is supposed to be a softmax) would need to run over only the neighbours of each node. Non-neighbours can be masked with large negative values before the softmax, which drives their weights to zero.
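The masking trick described above can be sketched as follows; this is an illustrative numpy version with made-up inputs, not the repo's code:

```python
import numpy as np

def masked_softmax(scores, adj, neg=-1e9):
    # Add a large negative constant to non-neighbour entries so that,
    # after the softmax, their attention weights are numerically zero.
    masked = scores + (1.0 - adj) * neg          # non-edges -> huge negative
    masked -= masked.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(masked)
    return e / e.sum(axis=1, keepdims=True)

scores = np.array([[0.5, 1.0, -0.2],
                   [1.0, 0.0,  2.0]])
adj = np.array([[1.0, 1.0, 0.0],
                [1.0, 0.0, 1.0]])
att = masked_softmax(scores, adj)
```

Each row of `att` sums to 1, and masked (non-neighbour) positions receive effectively zero weight.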

@lannester666

I find that with "--use-att 1 --local-agg 1", the results are far lower than in the original paper, and training consumes more time. But with "--use-att 1 --local-agg 0", the results match the paper. Why does this happen?
