
Commit

add notes text file
bashbaug committed May 31, 2024
1 parent f4dfa4d commit 2a6bf08
Showing 1 changed file with 5 additions and 0 deletions.
samples/99_flashattention/notes.txt (5 additions, 0 deletions)
@@ -0,0 +1,5 @@
https://github.com/intel/mlir-extensions/blob/main/test/Integration/Dialect/XeGPU/flash_attention_fwd.mlir

https://github.com/intel/intel-extension-for-pytorch/blob/xpu-main/csrc/gpu/aten/operators/xetla/kernels/SDP/fmha_forward.hpp

https://github.com/NVIDIA/cudnn-frontend/blob/main/docs/operations/Attention.md

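The three links above all point to flash-attention forward implementations (Intel's XeGPU MLIR integration test, the XeTLA fused SDP kernel in intel-extension-for-pytorch, and NVIDIA's cuDNN frontend attention docs). As background for those references, the following is a minimal NumPy sketch of the tiled online-softmax recurrence that the forward pass is built on; function names and block sizes are illustrative and not taken from any of the linked implementations.

# Minimal, illustrative sketch of the flash-attention forward pass
# (tiled online softmax). Names and block sizes are hypothetical and
# not tied to any of the implementations linked above.
import numpy as np

def flash_attention_forward(Q, K, V, block=32):
    N, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    O = np.empty_like(Q)
    for i0 in range(0, N, block):                 # loop over query tiles
        Qi = Q[i0:i0 + block]
        m = np.full(Qi.shape[0], -np.inf)         # running row max
        l = np.zeros(Qi.shape[0])                 # running softmax denominator
        acc = np.zeros_like(Qi)                   # unnormalized output accumulator
        for j0 in range(0, N, block):             # loop over key/value tiles
            Kj = K[j0:j0 + block]
            Vj = V[j0:j0 + block]
            S = (Qi @ Kj.T) * scale               # tile of attention scores
            m_new = np.maximum(m, S.max(axis=1))
            P = np.exp(S - m_new[:, None])        # probabilities relative to new max
            correction = np.exp(m - m_new)        # rescale previously accumulated results
            l = l * correction + P.sum(axis=1)
            acc = acc * correction[:, None] + P @ Vj
            m = m_new
        O[i0:i0 + block] = acc / l[:, None]       # final softmax normalization
    return O

# Quick check against naive (un-tiled) softmax attention.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((128, 64)) for _ in range(3))
S = (Q @ K.T) / np.sqrt(Q.shape[1])
P = np.exp(S - S.max(axis=1, keepdims=True))
ref = (P / P.sum(axis=1, keepdims=True)) @ V
assert np.allclose(flash_attention_forward(Q, K, V), ref, atol=1e-6)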