RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/home/guest/anaconda3/envs/AnyGPT/lib/python3.9/site-packages/huggingface_hub/__init__.py)
I found that the installed huggingface_hub version is 0.17.3, which is why this error is thrown. But when I update it, pip shows:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
tokenizers 0.14.1 requires huggingface_hub<0.18,>=0.16.4, but you have huggingface-hub 0.24.0 which is incompatible.
I can't downgrade huggingface_hub, because the code needs split_torch_state_dict_into_shards. So how can I solve this? Should I just ignore pip's error?
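Not an official fix, but a quick sanity check in Python can confirm which side of the conflict the environment is actually on. The package names and the missing symbol are taken from the traceback above; the printed messages are only illustrative:

import huggingface_hub
import tokenizers

# Show the versions currently installed in the AnyGPT conda env.
print("huggingface_hub:", huggingface_hub.__version__)
print("tokenizers:", tokenizers.__version__)

try:
    # This symbol only exists in newer huggingface_hub releases.
    from huggingface_hub import split_torch_state_dict_into_shards  # noqa: F401
    print("split_torch_state_dict_into_shards is available")
except ImportError:
    print("split_torch_state_dict_into_shards is missing -> huggingface_hub is too old")

If the symbol is missing, one option that may avoid the conflict is upgrading tokenizers (and transformers) together with huggingface_hub, since newer tokenizers releases relax the <0.18 pin shown in the pip message; whether AnyGPT itself works with those newer versions would still need to be verified.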