
Update flash attention op #616

Merged
merged 1 commit from attention-fix into main on Nov 27, 2024
Conversation

saienduri (Contributor) commented Nov 27, 2024

This commit updates the flash attention op to adhere to the is_causal and scale args added in commit 2d46caa. Without this change, fp8 attention export fails.
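
For illustration, a minimal sketch of the kind of signature change involved, assuming the op wraps torch.nn.functional.scaled_dot_product_attention (which takes is_causal and scale keyword args); the wrapper name and harness below are illustrative, not the repository's actual code:

```python
# Minimal sketch (assumed code, not this repo's actual op): a flash attention
# wrapper updated to accept and forward the new is_causal and scale args.
from typing import Optional

import torch
import torch.nn.functional as F


def flash_attention(
    q: torch.Tensor,
    k: torch.Tensor,
    v: torch.Tensor,
    is_causal: bool = False,        # newly threaded-through arg
    scale: Optional[float] = None,  # newly threaded-through arg; None means 1/sqrt(head_dim)
) -> torch.Tensor:
    # Forwarding both args keeps the wrapper's signature in sync with the
    # underlying attention op, so exported graphs see a matching call.
    return F.scaled_dot_product_attention(
        q, k, v, is_causal=is_causal, scale=scale
    )


if __name__ == "__main__":
    q = k = v = torch.randn(1, 8, 128, 64)  # (batch, heads, seq, head_dim)
    out = flash_attention(q, k, v, is_causal=True, scale=0.125)
    print(out.shape)  # torch.Size([1, 8, 128, 64])
```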

saienduri merged commit d6be43f into main on Nov 27, 2024; 5 of 8 checks passed.
saienduri deleted the attention-fix branch on November 27, 2024 at 03:14.