
Commit

skip te unit test since fused attention only works when CUDA >= 12.1
tocean committed Oct 17, 2023
1 parent 29c6aab commit caeb253
Showing 1 changed file with 3 additions and 0 deletions.
3 changes: 3 additions & 0 deletions tests/te/test_replacer.py
@@ -33,6 +33,9 @@ def tearDown(self):
@decorator.cuda_test
def test_replace(self):
"""Test replace function in TeReplacer."""
# fused attention needs CUDA version >= 12.1
if torch.version.cuda < '12.1':
return
te_transformer = te.TransformerLayer(
self.hidden_size, self.ffn_hidden_size, self.num_attention_heads, fuse_qkv_params=True
)
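A note on the gating logic in this diff: `torch.version.cuda` is a string, and comparing version strings lexicographically can misorder versions with single-digit majors (e.g. `'9.2' < '12.1'` is `False` as a string comparison even though 9.2 is the older CUDA). A minimal sketch of a numeric comparison helper is shown below; the helper name `cuda_at_least` is hypothetical and not part of the committed code, which only does the plain string check:

```python
# Sketch only: numeric CUDA version gate (cuda_at_least is a hypothetical
# helper, not part of this commit). Avoids lexicographic string comparison,
# which fails for single-digit majors such as '9.2' vs '12.1'.
def cuda_at_least(version_str, required=(12, 1)):
    """Return True when a CUDA version string like '12.1' is >= required."""
    if version_str is None:  # e.g. a CPU-only PyTorch build
        return False
    major, minor = (int(p) for p in version_str.split('.')[:2])
    return (major, minor) >= required


# Inside the test, the commit's early return could then become:
#     if not cuda_at_least(torch.version.cuda):
#         return
```

Within a `unittest.TestCase`, calling `self.skipTest('fused attention requires CUDA >= 12.1')` instead of a bare `return` would also mark the test as skipped rather than passed.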
