Releases · lucidrains/block-recurrent-transformer-pytorch
0.4.4
0.4.3
fix states being overridden in generate
0.4.2
handle an edge case
0.4.1
update to einops 0.6.1 or greater
0.4.0
add compressed memories
0.3.3
add ability to read the state at a layer earlier than the layer in wh…
0.3.2
always return memories and states from attention blocks, but not the …
0.3.1
remove enhanced recurrence, as it does not seem to do much here
0.3.0
refactor to process block by block, to ready for decoupling layers at…
0.2.2
simplify and cleanup