Having an issue while running the reader in Haystack 2 #7020
-
I am having an issue running the warm-up method of the reader. I want to run the model locally. This is the code:
This is the error I'm getting:
How can I resolve it?
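For context, a minimal sketch of loading and warming up an ExtractiveReader in Haystack 2 looks roughly like this (the model name and query below are placeholders, not the original snippet):

```python
from haystack import Document
from haystack.components.readers import ExtractiveReader

# Create the reader; a Hugging Face model id or a local path can be passed here.
reader = ExtractiveReader(model="deepset/roberta-base-squad2")

# warm_up() loads the underlying model; this is the step that fails.
reader.warm_up()

result = reader.run(
    query="Who lives in Berlin?",
    documents=[Document(content="My name is Carla and I live in Berlin.")],
)
print(result["answers"])
```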
Replies: 3 comments 5 replies
-
Probably you can fix the issue by installing
-
Hi, my code works at home, but at the office I am inside a container with a mixture of system and local libraries. Everything was working until I implemented the re-ranker:

import:

instantiation:

I have seen this error with accelerate elsewhere, and people have suggested uninstalling transformer_engine, which I can't do due to permissions (the container environment is standardized). My main question is whether there is a way to avoid that call to accelerate and bypass the conflict. I have used many models, embedders, etc. without running into this issue.
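For illustration only (this is not the code from the container, and the model name is a placeholder), the import and instantiation steps for a Haystack 2 re-ranker look roughly like this:

```python
# Import of the re-ranker component:
from haystack.components.rankers import TransformersSimilarityRanker

# Instantiation with a cross-encoder model; warm_up() then loads it.
ranker = TransformersSimilarityRanker(model="cross-encoder/ms-marco-MiniLM-L-6-v2")
ranker.warm_up()
```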
-
Ok, I solved it (after about 12 hours). I installed transformer-engine from source (https://github.com/NVIDIA/TransformerEngine), after first installing flash-attn==1.0.7 with --no-build-isolation. It was definitely a conflict with the container.
It seems the error message suggests something different (I managed to reproduce the error in Colab, without `accelerate`). However, you can read about device management in Haystack 2 on this page.
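As a minimal sketch of what that page describes (assuming a CUDA device is available; otherwise use "cpu"), a component can be pinned to an explicit device instead of relying on automatic selection:

```python
from haystack.components.readers import ExtractiveReader
from haystack.utils import ComponentDevice

# Pin the component to a specific device rather than letting it be auto-selected.
reader = ExtractiveReader(
    model="deepset/roberta-base-squad2",
    device=ComponentDevice.from_str("cuda:0"),  # or ComponentDevice.from_str("cpu")
)
reader.warm_up()
```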