
Obtaining feature importances from attention maps #52

Open
sachinvs7 opened this issue Nov 13, 2024 · 3 comments


@sachinvs7

Hi,
Do you have any code examples showcasing Section 5.3 of the "Revisiting Deep Learning Models for Tabular Data" paper, "Obtaining feature importances from attention maps"?

I'm implementing the FT-Transformer architecture and would like to derive and plot both attention maps and Integrated Gradients (IG) attributions on a dataset. I couldn't find any examples of such methods for tabular deep learning online. I'd appreciate your guidance. Thanks!

@Yura52
Collaborator

Yura52 commented Nov 13, 2024

Hi! Regarding our method for obtaining the feature importance maps, I recommend taking a look at this discussion: #2
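
For reference, here is a minimal sketch of the aggregation step that the paper describes: average the attention that the [CLS] token pays to each feature token, over all heads, blocks, and samples. It assumes you have already captured the per-block attention probabilities (e.g., via forward hooks on the attention modules; the exact hook targets depend on the implementation you use), and that the [CLS] token sits at index 0 (some implementations append it at the end instead, so check yours).

```python
import torch

# Assumption: `attn_probs` is a list with one tensor per Transformer block,
# each of shape (batch, n_heads, n_tokens, n_tokens), collected for a batch
# of samples, e.g. via forward hooks on the attention modules.
CLS = 0  # assumed position of the [CLS] token; some implementations append it last

def attention_feature_importance(attn_probs: list[torch.Tensor]) -> torch.Tensor:
    # Keep only the attention distribution *from* the [CLS] token:
    # resulting shape (n_blocks, batch, n_heads, n_tokens).
    cls_rows = torch.stack([p[:, :, CLS, :] for p in attn_probs])
    # Average over blocks and heads, then over samples -> (n_tokens,).
    avg = cls_rows.mean(dim=(0, 2)).mean(dim=0)
    # Drop the [CLS]->[CLS] entry; the remainder is one score per feature token.
    # (If your [CLS] is the last token, use avg[:-1] instead.)
    return avg[1:]
```

Plotting the returned scores as a bar chart, one bar per feature, gives the kind of importance ranking discussed in Section 5.3.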

As for IG, I don't remember for sure what we used to compute it, but I suspect we used the Captum library.
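
In case it helps, here is a minimal, hypothetical sketch of computing IG with Captum for the continuous features, holding the categorical features fixed. All names here (`model`, `x_cont`, `x_cat`) are illustrative, and this is not the paper's original code; for attributions on the categorical features themselves, Captum's LayerIntegratedGradients applied to the embedding layer is one common option.

```python
import torch
from captum.attr import IntegratedGradients

# Assumption: `model` is a trained module whose forward pass takes
# (x_cont, x_cat) and returns logits, as in rtdl_revisiting_models.
model.eval()
ig = IntegratedGradients(lambda x_cont, x_cat: model(x_cont, x_cat))

# x_cont: (batch, n_cont) float tensor; x_cat: (batch, n_cat) long tensor.
baselines = x_cont.mean(dim=0, keepdim=True)  # one common baseline choice

attributions, delta = ig.attribute(
    x_cont,
    baselines=baselines,
    additional_forward_args=(x_cat,),  # passed through unchanged, not attributed
    target=0,                          # index of the output unit to explain
    return_convergence_delta=True,
)
per_feature_importance = attributions.abs().mean(dim=0)  # (n_cont,)
```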

Does this help?

@sachinvs7
Author

Thanks for getting back! I also wanted to confirm that "rtdl_revisiting_models" is indeed the main package to focus on.
In discussion #2, can n_features include categorical features too?

@Yura52
Collaborator

Yura52 commented Dec 7, 2024

Hi!

  • Yes, rtdl_revisiting_models is the recommended implementation of the models from the paper.
  • Yes, I think the code in Feature importance results #2 allows using categorical features as well (see the sketch below).
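
To illustrate the second point: in rtdl_revisiting_models, categorical features are embedded as tokens alongside the continuous ones, so the attention rows in the importance computation have one column per feature regardless of type. A quick sketch, following the package README (verify the exact signatures against your installed version):

```python
import torch
from rtdl_revisiting_models import FTTransformer

n_cont_features = 3
cat_cardinalities = [4, 7]  # two categorical features with 4 and 7 categories

model = FTTransformer(
    n_cont_features=n_cont_features,
    cat_cardinalities=cat_cardinalities,
    d_out=1,
    **FTTransformer.get_default_kwargs(),
)

batch = 8
x_cont = torch.randn(batch, n_cont_features)
x_cat = torch.stack(
    [torch.randint(0, c, (batch,)) for c in cat_cardinalities], dim=1
)
logits = model(x_cont, x_cat)
# The Transformer sees one token per feature (continuous and categorical)
# plus a [CLS] token, so n_features counts the categorical features too.
print(logits.shape)  # (8, 1)
```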
