Replies: 3 comments 10 replies
-
Yes, many HuggingFace models work! I welcome recommendations of HuggingFace models that have proven effective for you. From the models guide, I had success using SGPT (Muennighoff/SGPT-125M-weightedmean-msmarco-specb-bitfit). Folks using Semantra in other languages have had success too.
-
This may not be the right place for a question, but: can the question-and-answer models be used? https://www.sbert.net/docs/pretrained_models.html#multi-qa-models Would that make it better suited for typing a question into the search box and getting the answer back as the result?
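For intuition: the multi-QA models are trained so that a short question embeds close to passages that answer it, and retrieval then just ranks passages by cosine similarity to the query embedding. A toy sketch, using made-up (hypothetical) embedding vectors in place of real model output:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 3-dimensional embeddings; a real model produces
# hundreds of dimensions, but the ranking step is identical.
query = [0.9, 0.1, 0.2]          # embedding of the question
passages = {
    "passage A": [0.8, 0.2, 0.3],  # answers the question (close to query)
    "passage B": [0.1, 0.9, 0.4],  # unrelated text (far from query)
}

# The best "answer" is the passage whose embedding is most similar.
best = max(passages, key=lambda name: cosine(query, passages[name]))
print(best)  # passage A
```

Whether a multi-QA model beats a general-purpose one here would depend on how well its asymmetric question/passage training matches Semantra's document chunks.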
-
I tried combining the MTEB Leaderboard and the Sentence-Transformers benchmark into one table, with the Semantra built-in options highlighted in green: https://docs.google.com/spreadsheets/d/1uOXb5SOOouFMuUdDTT7FpZAdesfoEE7slyWpri4bac4/edit?usp=sharing I sorted by the "Retrieval" task because I think that's the most applicable to Semantra. Or should it be sorted by STS instead?
-
As in #45
We can use the models listed on https://www.sbert.net/docs/pretrained_models.html. For instance,
--transformer-model sentence-transformers/paraphrase-multilingual-mpnet-base-v2
works for me. Do others work, too? There is a leaderboard here: https://huggingface.co/spaces/mteb/leaderboard
I'm not clear on whether they all have a compatible interface or can all be used the same way.