
v0.3.9 - Small updates

@nreimers released this 18 Nov 08:25

This release only includes some smaller updates:

  • The code was tested with transformers 3.5.1 and the requirement was updated so that it works with transformers 3.5.1.
  • As some parts and models require PyTorch >= 1.6.0, the requirement was updated to at least PyTorch 1.6.0. Most of the code and models still work with older PyTorch versions.
  • model.encode() kept the embeddings on the GPU, which required a lot of GPU memory when encoding millions of sentences. The embeddings are now moved to the CPU once they are computed (see the first sketch after this list).
  • The CrossEncoder class now accepts a max_length parameter to control the truncation of inputs.
  • The CrossEncoder predict method now has an apply_softmax parameter that applies softmax on top of a multi-class output (see the second sketch after this list).
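
A minimal sketch of the encode() behavior mentioned above: embeddings for a large sentence collection are returned on the CPU (as NumPy arrays by default), so GPU memory no longer grows with the number of encoded sentences. The model name and batch size below are illustrative, not prescribed by this release.

```python
from sentence_transformers import SentenceTransformer

# Load a pretrained model (model name is illustrative)
model = SentenceTransformer("distilbert-base-nli-stsb-mean-tokens")

sentences = [
    "A large corpus can contain millions of sentences.",
    "Embeddings are moved to the CPU once they are computed.",
]

# encode() returns NumPy arrays by default; computed embeddings
# no longer accumulate on the GPU while encoding large collections
embeddings = model.encode(sentences, batch_size=32, show_progress_bar=True)
print(embeddings.shape)
```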
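
And a sketch of the two CrossEncoder additions: max_length passed to the constructor and apply_softmax passed to predict(). The model name, num_labels value, and max_length value are illustrative assumptions.

```python
from sentence_transformers import CrossEncoder

# max_length controls how long input pairs may be before truncation
# (model name, num_labels, and max_length are illustrative)
cross_encoder = CrossEncoder("bert-base-uncased", num_labels=3, max_length=128)

pairs = [
    ("A man is eating food.", "A man is eating a piece of bread."),
    ("A man is eating food.", "A woman is playing violin."),
]

# apply_softmax=True turns the multi-class logits into probabilities
scores = cross_encoder.predict(pairs, apply_softmax=True)
print(scores)
```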