after merge correction
Aleksandr Malyshev committed Oct 22, 2024
1 parent c801632 commit ff36712
1 changed file: vllm/attention/backends/rocm_flash_attn.py (0 additions, 5 deletions)
@@ -621,11 +621,6 @@ def forward(
             value = value[:num_prefill_tokens]

         if prefill_meta := attn_metadata.prefill_metadata:
-            (query_seq_start_loc, query_max_seq_len, key_seq_start_loc,
-             key_max_seq_len, seq_lens,
-             causal_mask) = _get_seq_len_block_table_args(
-                 prefill_meta, attn_type)
-
             # Prompt run.
             # normal attention and DECODER
             if attn_type == AttentionType.DECODER and (
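The deleted lines unpacked the results of `_get_seq_len_block_table_args` at the top of the prefill branch; with 0 additions in this commit, those values were evidently no longer consumed after the merge, so the call was dead code. The guard the branch still relies on is the walrus-operator pattern `if prefill_meta := attn_metadata.prefill_metadata:`, which binds and tests the attribute in one expression. A minimal sketch of that pattern, using hypothetical stand-in classes (these are illustrative mocks, not vLLM's real metadata types):

```python
from dataclasses import dataclass
from typing import Optional


# Hypothetical stand-ins for vLLM's attention metadata objects.
@dataclass
class PrefillMetadata:
    seq_lens: list


@dataclass
class AttnMetadata:
    # None when the batch contains no prefill tokens.
    prefill_metadata: Optional[PrefillMetadata] = None


def run(attn_metadata: AttnMetadata) -> str:
    # Walrus operator: assign prefill_meta and test its truthiness in one
    # step, mirroring the guard kept by the diff above. The prefill branch
    # runs only when prefill metadata is actually present.
    if prefill_meta := attn_metadata.prefill_metadata:
        return f"prefill ({len(prefill_meta.seq_lens)} seqs)"
    return "decode"


print(run(AttnMetadata(prefill_metadata=PrefillMetadata(seq_lens=[4, 2]))))
print(run(AttnMetadata()))
```

Because the binding lives only inside the `if`, removing the unused unpack (as this commit does) leaves the guard and the rest of the branch untouched.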