
Releases: facebookresearch/xformers

v0.0.6

24 Nov 16:18
2eb8b65

Fixed

  • Fix the self-attention optimization not being triggered and a broken residual path [#119]
  • Improve speed by skipping contiguous Tensor copies when they are not needed [#119]

Added

  • Attention mask wrapper [#113] (see the sketch after this list)
  • ViT comparison benchmark [#117]
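
The attention mask wrapper bundles boolean and additive masks behind a single object so attention layers can consume either form. The sketch below is a conceptual illustration only, not the xformers API; the class and method names are assumptions.

```python
import torch

class AdditiveAttentionMask:
    """Illustrative wrapper: hold an additive (float) mask and build it from
    either a boolean mask or a causal pattern. Not the xformers API."""

    def __init__(self, additive_mask: torch.Tensor):
        # 0.0 where attention is allowed, -inf where it is masked out
        self.values = additive_mask

    @classmethod
    def from_bool(cls, mask: torch.Tensor) -> "AdditiveAttentionMask":
        # True means "keep this position", False means "mask it out"
        additive = torch.zeros_like(mask, dtype=torch.float32)
        additive.masked_fill_(~mask, float("-inf"))
        return cls(additive)

    @classmethod
    def make_causal(cls, seq_len: int) -> "AdditiveAttentionMask":
        # Lower-triangular pattern: position i may only attend to j <= i
        keep = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))
        return cls.from_bool(keep)

    def apply_to(self, attn_logits: torch.Tensor) -> torch.Tensor:
        # Added to the raw attention scores before the softmax
        return attn_logits + self.values

# Usage: mask the scores of a toy attention matrix
scores = torch.randn(4, 4)
masked = AdditiveAttentionMask.make_causal(4).apply_to(scores)
```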

v0.0.5

18 Nov 17:07
4c04e8f

Fixes the 0.0.4 pip package. The next release should improve on this by exposing pre-built binaries.

v0.0.4

17 Nov 04:53
1328ba7
  • Fixing causality not being respected by the scaled dot product attention (see the sketch after this list)
  • Fixing Favor causal trainability
  • Enabling FusedLayerNorm by default if Triton is available
  • Fixing Favor with fp16
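
Respecting causality in scaled dot product attention means masking out the scores of future positions before the softmax, so that query i only attends to keys j <= i. Below is a minimal reference sketch of that behaviour in plain PyTorch, not the xformers kernel itself.

```python
import math
import torch

def causal_scaled_dot_product_attention(q, k, v):
    """Reference behaviour only: query i must not attend to keys j > i."""
    # q, k, v: (batch, seq_len, head_dim)
    scale = 1.0 / math.sqrt(q.size(-1))
    scores = q @ k.transpose(-2, -1) * scale            # (batch, seq, seq)

    # Causal mask: -inf above the diagonal so the softmax zeroes it out
    seq_len = q.size(-2)
    future = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)
    scores = scores.masked_fill(future, float("-inf"))

    return torch.softmax(scores, dim=-1) @ v

q = k = v = torch.randn(2, 8, 16)
out = causal_scaled_dot_product_attention(q, k, v)
```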

v0.0.3

05 Nov 23:01
5ea376e

[0.0.3] - 2021-11-01

Fixed

  • Nystrom causal attention [#75]

v0.0.2

01 Nov 21:19
2af959d

[0.0.2] - 2021-11-01

Fixed

  • More robust blocksparse [#24]

Added

  • Rotary embeddings [#32] (see the sketch after this list)
  • More flexible layernorm [#50]
  • More flexible blockfactory config (key deduplication)
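
Rotary embeddings encode position by rotating pairs of query/key channels by a position-dependent angle, so relative offsets show up directly in the attention dot product. Here is a minimal sketch of the standard RoPE formulation (split-in-half channel pairing), not the xformers implementation.

```python
import torch

def rotary_embed(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply rotary position embeddings to x of shape (..., seq_len, dim),
    with dim even. Standard RoPE formulation, not the xformers code path."""
    *_, seq_len, dim = x.shape
    half = dim // 2

    # One rotation frequency per channel pair
    freqs = base ** (-torch.arange(0, half, dtype=torch.float32) / half)
    angles = torch.arange(seq_len, dtype=torch.float32)[:, None] * freqs[None, :]
    cos, sin = angles.cos(), angles.sin()                # (seq_len, half)

    x1, x2 = x[..., :half], x[..., half:]
    # Rotate each (x1, x2) pair by its position-dependent angle
    return torch.cat((x1 * cos - x2 * sin, x1 * sin + x2 * cos), dim=-1)

# Usage: applied to queries and keys before computing attention scores
q = torch.randn(2, 16, 64)
q_rot = rotary_embed(q)
```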