
Ablation study and official code problem #46

Open
xingshulicc opened this issue Sep 7, 2022 · 0 comments
Comments
First, thanks for your contribution, this paper inspired me a lot.
However, I have a few questions that I hope you can answer:

  1. About the code: I think the output shape of the Unfold operation in your code is not right, even though it raises no error.
  2. About the ablation study: it would be better to compare dynamic convolution against your outlook attention. The two are very similar; the only difference is how the weights are generated. I am very interested in this comparison.
  3. Based on my understanding of your paper, I modified your code:
    https://github.com/xingshulicc/Vision-In-Transformer-Model/blob/main/outlook_attention.py.
    I hope you can give me some advice on it.
    Thanks again.
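For anyone checking point 1: PyTorch's `torch.nn.Unfold` applied to an input of shape `(N, C, H, W)` produces `(N, C * k * k, L)`, where `L` is the number of sliding-window positions. Below is a minimal sketch that computes this shape in pure Python (no torch required), using the formula from the PyTorch `Unfold` documentation; the concrete values (`kernel_size=3`, `padding=1`, `stride=1`, a 192-channel 28×28 feature map) are only assumed here as typical outlook-attention settings, not taken from the repository's code.

```python
def unfold_output_shape(n, c, h, w, k, stride=1, padding=0, dilation=1):
    """Shape that torch.nn.Unfold(kernel_size=k, stride=stride,
    padding=padding, dilation=dilation) would produce on an (n, c, h, w)
    input, following the formula in the PyTorch Unfold docs.

    Returns (n, c * k * k, L), where L = out_h * out_w is the number of
    sliding-window positions.
    """
    out_h = (h + 2 * padding - dilation * (k - 1) - 1) // stride + 1
    out_w = (w + 2 * padding - dilation * (k - 1) - 1) // stride + 1
    return (n, c * k * k, out_h * out_w)

# Assumed example settings: kernel_size=3, padding=1, stride=1 on a
# 192-channel 28x28 feature map. With "same" padding, L equals H*W = 784
# and the channel dimension becomes C*k*k = 192*9 = 1728.
print(unfold_output_shape(1, 192, 28, 28, k=3, stride=1, padding=1))
# -> (1, 1728, 784)
```

Comparing this expected shape against the tensor the code actually produces is a quick way to confirm or rule out the shape issue raised in point 1.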