
Can't load distiluse-base-multilingual-cased-v2 model #2924

Open
b5y opened this issue Sep 7, 2024 · 1 comment

b5y commented Sep 7, 2024

Hello,

I am unable to load the sentence-transformers/distiluse-base-multilingual-cased-v2 model. Minimal script:

from sentence_transformers import SentenceTransformer

model_name = 'sentence-transformers/distiluse-base-multilingual-cased-v2'
model = SentenceTransformer(model_name)

This raises the following error:

RuntimeError: Failed to import transformers.models.distilbert.modeling_distilbert because of the following error (look up to see its traceback):
libcudart.so.11.0: cannot open shared object file: No such file or directory

The same error is raised for the sentence-transformers/distiluse-base-multilingual-cased-v1 model, but sentence-transformers/msmarco-bert-base-dot-v5 loads without any problem.

Transformers version: 4.44.2
Sentence Transformers version: 3.0.1
Python version: 3.11.7
Torch version: 2.4.0

Do you have any thoughts on how to fix this error? I tried downgrading transformers to 4.44.0, but that didn't help.

I appreciate any help you can provide.

CC: @tomaarsen, @muellerzr


b5y commented Sep 7, 2024

It looks like removing flash-attn solved the problem. I haven't found the time to figure out why it failed with flash-attn under torch 2.4, since the same setup worked in another environment with torch 2.3.
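For anyone hitting the same symptom: the `libcudart.so.11.0` message suggests a compiled extension (here flash-attn) built against CUDA 11 being imported alongside a torch wheel that ships a different CUDA runtime. A minimal sketch to spot such packages before importing the model (the package list is illustrative, not exhaustive, and `find_suspect_packages` is a hypothetical helper, not part of any library):

```python
import importlib.util

def find_suspect_packages() -> list:
    """Return installed packages whose compiled CUDA extensions commonly
    break imports when built against a different CUDA runtime than the
    installed torch wheel. Detection only checks installation, not the
    actual CUDA version the wheel was built for."""
    candidates = ["flash_attn", "xformers", "bitsandbytes"]
    return [name for name in candidates
            if importlib.util.find_spec(name) is not None]

if __name__ == "__main__":
    suspects = find_suspect_packages()
    if suspects:
        print("Installed CUDA-extension packages to check:", suspects)
    else:
        print("No common CUDA-extension packages found.")
```

If flash_attn shows up, uninstalling it (`pip uninstall flash-attn`) or reinstalling a wheel built for the same CUDA version as your torch build should resolve the import error.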
