attention layer on top of LSTMs #82

Open
Saran-nns opened this issue May 4, 2021 · 0 comments
Assignees: Saran-nns
Labels: enhancement (New feature or request), help wanted (Extra attention is needed)

Comments

@Saran-nns (Member) commented:
Attention mechanisms have been shown to improve time series prediction/forecasting and classification performance (sample paper).

The deep learning models in traja can easily accommodate an attention layer:

  1. Create a self-attention mechanism wrapper (Reference).
  2. Inject the attention layer instance on top of the LSTM layers, before and after encoding (examples here and here); see the sketch after this list.
  3. Add an optional boolean arg for attention in the autoencoding (ae, vae, vaegan) base models.
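A minimal sketch of what steps 1–3 could look like, assuming PyTorch and `batch_first` LSTMs. The class names (`SelfAttention`, `LSTMWithAttention`) and the `attention` flag are hypothetical and not part of traja's current API; they only illustrate wrapping an additive self-attention layer around the LSTM outputs and gating it behind a boolean argument.

```python
import torch
import torch.nn as nn


class SelfAttention(nn.Module):
    """Additive self-attention over LSTM hidden states (hypothetical wrapper)."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # One score per time step, learned from the hidden state.
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, lstm_out: torch.Tensor) -> torch.Tensor:
        # lstm_out: (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.score(lstm_out), dim=1)  # (batch, seq_len, 1)
        context = (weights * lstm_out).sum(dim=1)             # (batch, hidden_dim)
        return context


class LSTMWithAttention(nn.Module):
    """LSTM encoder with an optional attention layer on top (sketch only)."""

    def __init__(self, input_dim: int, hidden_dim: int, attention: bool = True):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        # Optional boolean arg, as proposed in step 3.
        self.attention = SelfAttention(hidden_dim) if attention else None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, (h_n, _) = self.lstm(x)       # out: (batch, seq_len, hidden_dim)
        if self.attention is not None:
            return self.attention(out)     # attention-pooled context vector
        return h_n[-1]                     # fall back to the last hidden state


# Usage: a batch of 8 trajectories, 50 time steps, (x, y) coordinates.
model = LSTMWithAttention(input_dim=2, hidden_dim=32, attention=True)
context = model(torch.randn(8, 50, 2))    # shape: (8, 32)
```

Keeping the attention module as a separate wrapper means the same class can be injected before or after the encoder in the ae/vae/vaegan base models, while `attention=False` preserves the current behaviour.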
@Saran-nns added the enhancement (New feature or request) label on May 4, 2021
@Saran-nns self-assigned this on May 4, 2021
@Saran-nns added the help wanted (Extra attention is needed) label on May 4, 2021