Pull requests: Dao-AILab/flash-attention

fix bug when is_grad is false
#1406 opened Dec 22, 2024 by woaixiaoxiao
Add missing tests/__init__.py
#1405 opened Dec 20, 2024 by BioGeek
Fix incorrect torch dtype
#1399 opened Dec 19, 2024 by kevmo314
Create PEP 517 build metadata
#1394 opened Dec 18, 2024 by frostming
Add hipBLAS/cuBLAS distinction in benchmark_gemm.py
#1393 opened Dec 17, 2024 by garrettbyrd
fix a bug (issue #1390) caused by typo
#1392 opened Dec 17, 2024 by liguohao96
Fix deprecation warnings
#1382 opened Dec 12, 2024 by rongou
flash_attn_varlen: support tree attention
#1188 opened Aug 30, 2024 by efsotr (see the varlen usage sketch after this list)
add softmax_d for mha_bwd
#1161 opened Aug 19, 2024 by MayDomine
Windows actions
#1036 opened Jul 9, 2024 by bdashore3
change condition to num_heads >= num_heads_k
#1030 opened Jul 5, 2024 by xenshinu
Fix +/-inf in LSE returned by forward
#978 opened Jun 3, 2024 by sgrigory
add pyproject.toml with build dependencies
#958 opened May 17, 2024 by dhellmann
Relative position encoding
#956 opened May 14, 2024 by b-albar (1 of 4 tasks complete)
ALiBi for the non-flash code path
#858 opened Feb 29, 2024 by Markus28
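
Several of the open PRs above (for example #1188 on tree attention and #978 on the LSE returned by the forward pass) touch the variable-length attention path. For context only, here is a minimal sketch of how the existing flash_attn_varlen_func interface is typically called; the tensor sizes are illustrative, and nothing below reproduces the changes proposed in those PRs.

```python
import torch
from flash_attn import flash_attn_varlen_func

# Two packed sequences of lengths 3 and 5 (8 tokens total), 4 heads, head dim 64.
# flash-attention expects fp16/bf16 tensors on a CUDA device.
nheads, headdim = 4, 64
seqlens = [3, 5]
total_tokens = sum(seqlens)
q = torch.randn(total_tokens, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Cumulative sequence lengths (int32, starting at 0) for seqlens [3, 5].
cu_seqlens = torch.tensor([0, 3, 8], dtype=torch.int32, device="cuda")

out = flash_attn_varlen_func(
    q, k, v,
    cu_seqlens_q=cu_seqlens, cu_seqlens_k=cu_seqlens,
    max_seqlen_q=max(seqlens), max_seqlen_k=max(seqlens),
    causal=True,
)
print(out.shape)  # torch.Size([8, 4, 64])
```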