`combine_bidir` is a function in `source/embed.py#L230`. It is used to concatenate the forward and backward hidden tensors from a bidirectional LSTM. Here `outs` is a tensor of shape `[num_dir * num_layers, bsz, hidden_size]`, and the goal is to combine it into the form `[num_layers, bsz, num_dir * hidden_size]`.

The error is clear: the inner concatenation should join the tensors along the final dimension, not the first. More significantly, the current implementation mixes up tensor values between different entries in the same batch. A quick fix is to change `dim=0` to `dim=-1`.

However, the code is still rather convoluted and includes one `for` loop too many, which hampers readability. I suggest using purely reshape and transpose operations for this task.
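A loop-free version could look roughly like this. This is a sketch, not the actual code from `source/embed.py`; the function signature is invented for illustration, and it assumes PyTorch's layer-major packing of directions (layer0-fwd, layer0-bwd, layer1-fwd, ...), which is how `nn.LSTM` lays out its hidden states:

```python
import torch

def combine_bidir(outs: torch.Tensor, num_layers: int, bsz: int,
                  hidden_size: int, num_dir: int = 2) -> torch.Tensor:
    """[num_dir * num_layers, bsz, hidden_size] -> [num_layers, bsz, num_dir * hidden_size]."""
    # Split the packed first axis into (layer, direction); this assumes
    # layer-major ordering, matching PyTorch's nn.LSTM output.
    out = outs.view(num_layers, num_dir, bsz, hidden_size)
    # Move the direction axis next to hidden_size ...
    out = out.transpose(1, 2).contiguous()  # [num_layers, bsz, num_dir, hidden_size]
    # ... and fold it into the hidden dimension.
    return out.view(num_layers, bsz, num_dir * hidden_size)
```

With this layout, each output entry is the forward state concatenated with the backward state for the same layer and batch element, so no values are mixed across the batch.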