Commit: Fixed permute

isamu-isozaki committed Aug 8, 2023
1 parent 1f153a3 commit a49b2b6
Showing 1 changed file with 1 addition and 0 deletions.
muse/modeling_transformer.py
@@ -847,6 +847,7 @@ def forward(self, hidden_states, encoder_hidden_states=None, encoder_attention_m
     hidden_states = hidden_states.view(b, c, h, w)
     attention_output = self.attention(hidden_states)
     attention_output = attention_output.view(b, c, seq_length)
+    attention_output = attention_output.permute(0, 2, 1)
     if self.use_normformer:
         attention_output = self.post_attn_layer_norm(attention_output)
     print("residual", residual.shape)
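For readers skimming the diff, here is a minimal pure-Python sketch of what the added `permute(0, 2, 1)` accomplishes. The variable names (`b`, `c`, `h`, `w`, `seq_length`, `attention_output`) come from the diff; the concrete sizes and the helper function are illustrative assumptions, not the repository's code. The idea: after attention the tensor is shaped `(b, c, seq_length)`, but a layer norm such as `post_attn_layer_norm` normalizes over the last dimension, so the channel axis must be moved there first.

```python
# Hypothetical sizes; only the variable names are taken from the diff.
b, c, h, w = 2, 4, 3, 3
seq_length = h * w  # flattening the h x w spatial grid gives the sequence axis

# Represent the (b, c, seq_length) activation as nested lists.
attention_output = [[[0.0] * seq_length for _ in range(c)] for _ in range(b)]

def permute_0_2_1(x):
    """Swap the last two axes, mirroring tensor.permute(0, 2, 1) in PyTorch."""
    return [[[x[i][j][k] for j in range(len(x[i]))]
             for k in range(len(x[i][0]))]
            for i in range(len(x))]

attention_output = permute_0_2_1(attention_output)

# Shape is now (b, seq_length, c): channels last, ready for layer norm.
print(len(attention_output), len(attention_output[0]), len(attention_output[0][0]))
```

Without the permute, the normalization would have run over the sequence axis instead of the channel axis, which is the bug this one-line commit fixes.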
