don't try cutlass on amd
ghstack-source-id: 1fd9ca2733dd6532b190170a8fa3b6f42e088e6c
Pull Request resolved: fairinternal/xformers#1284

__original_commit__ = fairinternal/xformers@a676f4c
bottler authored and xFormers Bot committed Jan 31, 2025
1 parent 65b9f2a commit 9457621
Showing 1 changed file with 4 additions and 0 deletions.
4 changes: 4 additions & 0 deletions xformers/ops/fmha/torch_attention_compat.py
@@ -20,6 +20,10 @@ def is_flash_attention_available():


 def is_pt_cutlass_compatible(force: bool = False) -> bool:
+    if torch.version.hip is not None:
+        if force:
+            raise ImportError("CUTLASS is not supported on ROCm")
+        return False
     compatible = True

fwd_schema_str = (
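The added guard follows a common PyTorch build-detection pattern: on ROCm builds of PyTorch, `torch.version.hip` is a version string, while on CUDA builds it is `None`, so checking it lets the function bail out before attempting any CUTLASS schema checks. A minimal standalone sketch of that logic (using a stub namespace in place of the real `torch` module so it runs without a GPU build; the stub and its `hip` values are illustrative assumptions):

```python
from types import SimpleNamespace


def is_pt_cutlass_compatible_sketch(torch_version, force: bool = False) -> bool:
    # On ROCm builds, torch.version.hip holds a HIP version string;
    # on CUDA (or CPU) builds it is None.
    if torch_version.hip is not None:
        if force:
            # Caller demanded CUTLASS support, so fail loudly.
            raise ImportError("CUTLASS is not supported on ROCm")
        # Otherwise just report incompatibility.
        return False
    # ...the real function would continue with operator-schema checks here...
    return True


# Stub "torch.version" objects for illustration (values are made up).
rocm_build = SimpleNamespace(hip="6.0.32830", cuda=None)
cuda_build = SimpleNamespace(hip=None, cuda="12.1")

print(is_pt_cutlass_compatible_sketch(rocm_build))  # False
print(is_pt_cutlass_compatible_sketch(cuda_build))  # True
```

With `force=True` on a ROCm build, the sketch raises `ImportError` instead of returning `False`, which matches the diff's intent of distinguishing a soft capability probe from a hard requirement.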
