Releases: lucidrains/block-recurrent-transformer-pytorch

0.4.4 (20 Aug 17:00)
better init

0.4.3 (05 Jul 20:31)
fix states being overridden in generate

0.4.2 (20 Apr 21:40)
handle an edge case

0.4.1 (20 Apr 21:11)
update to einops 0.6.1 or greater

0.4.0 (20 Apr 20:37)
add compressed memories

0.3.3 (18 Apr 18:17)
add ability to read the state at a layer earlier than the layer in wh…

0.3.2 (18 Apr 01:44)
always return memories and states from attention blocks, but not the …

0.3.1 (17 Apr 23:33)
remove enhanced recurrence, as it does not seem to help much here

0.3.0 (17 Apr 21:23)
refactor to process block by block, to ready for decoupling layers at…

0.2.2 (04 Apr 14:46)
simplify and cleanup
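
Several of the entries above (0.3.2, 0.4.0, 0.4.3) concern the xl memories and recurrent states the model returns. Below is a minimal sketch of how those are threaded across segments; the constructor arguments and the (logits, memories, states) return convention are assumed from the project README rather than stated in these release notes.

```python
import torch
from block_recurrent_transformer_pytorch import BlockRecurrentTransformer

# hyperparameters are illustrative; argument names follow the project README
# and may differ slightly across the versions listed above
model = BlockRecurrentTransformer(
    num_tokens = 20000,       # vocabulary size
    dim = 512,                # model dimension
    depth = 6,                # number of layers
    max_seq_len = 1024,       # total receptive field
    block_width = 512,        # width of each processed block
    num_state_vectors = 512   # number of recurrent state vectors
)

seq = torch.randint(0, 20000, (1, 1024))

# each forward pass returns logits plus the xl memories and recurrent states;
# feeding them back in on the next segment carries context forward
logits, mems, states = model(seq)
logits, mems, states = model(seq, mems, states)
```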