
Is it possible to use with different shaped graph data? #2

Open
BraveDistribution opened this issue Feb 9, 2022 · 1 comment

@BraveDistribution

Hi there,

Is there a way to adjust the attention-based layers so the network can be trained on differently shaped graph data?

Thanks.

@NingXJ1999
Collaborator

The attention layer we use computes attention over one specific dimension of a 4D tensor.

  • If you want to compute attention over a different dimension of a 4D tensor, you can use the transpose or permute operations to reorder the tensor's dimensions before applying the layer (and reorder them back afterwards).
  • If the input is a 3D tensor, I think you can consider simplifying the attention equation accordingly.

For other layers, you may need to redesign the network structure according to your specific situation.
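To illustrate the permute trick above, here is a minimal NumPy sketch. It assumes a hypothetical layout where attention is computed over the last two axes, `(..., N, D)`, with `N` positions and `D` features per position; the actual layer, layout, and framework in this repository may differ.

```python
import numpy as np

def attention_last_axes(x):
    """Scaled dot-product self-attention over the last two axes of x.

    x has shape (..., N, D): attention weights are computed between the
    N positions, each described by a D-dimensional feature vector.
    """
    scores = x @ np.swapaxes(x, -2, -1) / np.sqrt(x.shape[-1])
    # Numerically stable softmax over the last axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

# 4D input already laid out as (batch, channels, N, D): apply directly.
x = np.random.randn(2, 3, 5, 8)
y = attention_last_axes(x)

# 4D input laid out as (batch, N, D, channels) instead: permute the
# target axes to the end, attend, then permute back.
z = np.random.randn(2, 5, 8, 3)
z_perm = np.transpose(z, (0, 3, 1, 2))             # -> (batch, channels, N, D)
out = np.transpose(attention_last_axes(z_perm), (0, 2, 3, 1))
```

The same function also handles the 3D case `(batch, N, D)` unchanged, since the batch axes are left untouched by broadcasting.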
