
How do I load the AltCLIP model? #527

Open
susht3 opened this issue Aug 14, 2023 · 6 comments
Labels
question Further information is requested

Comments

@susht3

susht3 commented Aug 14, 2023

Description

Following the README, I installed transformers and then tried the following imports:

from modeling_altclip import AltCLIP
from processing_altclip import AltCLIPProcessor

This fails with an error saying the module cannot be found.
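A quick way to see why the bare imports fail: Python resolves them only through sys.path, and the standalone AltCLIP files are not installed as a package. A stdlib-only sketch to check what Python can actually locate (the module names below are just the ones from the failing imports):

```python
import importlib.util

def can_import(module_name: str) -> bool:
    """Return True if Python can locate the module on the current sys.path."""
    return importlib.util.find_spec(module_name) is not None

# In a fresh environment this is False: modeling_altclip.py is not on
# sys.path, so the bare import raises ModuleNotFoundError. Either add the
# directory containing the file to sys.path, or import the file through
# its containing package (e.g. hf_altclip.modeling_altclip).
print(can_import("modeling_altclip"))
```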

Alternatives

No response

@susht3 susht3 added the question Further information is requested label Aug 14, 2023
@susht3 susht3 closed this as completed Aug 14, 2023
@susht3 susht3 reopened this Aug 14, 2023
@susht3
Author

susht3 commented Aug 14, 2023

After switching to the hf_altclip package paths, the imports work, but the model won't download:
from hf_altclip.modeling_altclip import AltCLIP
from hf_altclip.processing_altclip import AltCLIPProcessor

The error:

model = AltCLIP.from_pretrained("BAAI/AltCLIP")
Downloading (…)lve/main/config.json: 100% 5.13k/5.13k [00:00<00:00, 710kB/s]
You are using a model of type altclip to instantiate a model of type clip. This is not supported for all configurations of models and can yield errors.
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "anaconda3/envs/py3/lib/python3.8/site-packages/transformers/modeling_utils.py", line 2269, in from_pretrained
    config, model_kwargs = cls.config_class.from_pretrained(
  File "anaconda3/envs/py3/lib/python3.8/site-packages/transformers/configuration_utils.py", line 553, in from_pretrained
    return cls.from_dict(config_dict, **kwargs)
  File "/anaconda3/envs/py3/lib/python3.8/site-packages/transformers/configuration_utils.py", line 696, in from_dict
    config = cls(**config_dict)
  File "hf_altclip/configuration_altclip.py", line 13, in __init__
    super().__init__(text_config_dict, vision_config_dict, projection_dim, logit_scale_init_value, **kwargs)
TypeError: __init__() got multiple values for argument 'text_config'
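The TypeError at the end can be reproduced without transformers at all. In the hf_altclip config subclass, the parent's keyword arguments are forwarded positionally, so when from_dict calls cls(**config_dict) and the downloaded config.json also supplies text_config by keyword, Python sees the argument twice. A minimal sketch with hypothetical stand-in classes:

```python
class BaseConfig:
    # stands in for the upstream CLIPConfig-style __init__
    def __init__(self, text_config=None, vision_config=None, **kwargs):
        self.text_config = text_config
        self.vision_config = vision_config

class AltConfig(BaseConfig):
    # stands in for hf_altclip/configuration_altclip.py: the first two
    # arguments are forwarded POSITIONALLY into the parent's slots
    def __init__(self, text_config_dict=None, vision_config_dict=None, **kwargs):
        super().__init__(text_config_dict, vision_config_dict, **kwargs)

# from_dict effectively does cls(**config_dict); the config.json for this
# checkpoint contains a "text_config" key, which lands in **kwargs and
# collides with the positional forward above:
config_dict = {"text_config": {"model_type": "xlm-roberta"}}
try:
    AltConfig(**config_dict)
except TypeError as err:
    print(err)  # ... got multiple values for argument 'text_config'
```

A common fix for this pattern is to forward by keyword (super().__init__(text_config=text_config_dict, ...)) or to pop the colliding keys from kwargs first; recent transformers releases also ship AltCLIP natively (AltCLIPModel / AltCLIPProcessor), which sidesteps the standalone files entirely.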

@susht3
Author

susht3 commented Aug 14, 2023

Traceback (most recent call last):
  File "test.py", line 10, in <module>
    model = AltCLIP.from_pretrained("BAAI/AltCLIP")
  File "anaconda3/envs/py3/lib/python3.8/site-packages/transformers/modeling_utils.py", line 1833, in from_pretrained
    config, model_kwargs = cls.config_class.from_pretrained(
  File "//anaconda3/envs/py3/lib/python3.8/site-packages/transformers/configuration_utils.py", line 534, in from_pretrained
    config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "//anaconda3/envs/py3/lib/python3.8/site-packages/transformers/configuration_utils.py", line 561, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/anaconda3/envs/py3/lib/python3.8/site-packages/transformers/configuration_utils.py", line 649, in _get_config_dict
    raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this model, couldn't find it in the cached files and it looks like BAAI/AltCLIP is not the path to a directory containing a config.json file.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
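When the Hub is unreachable, from_pretrained falls back to treating the name as a local path, and that path must be a directory containing config.json, which is exactly what the message says. A simplified sketch of that resolution logic (not transformers' actual code) that can be used to sanity-check a local checkpoint:

```python
import os

def resolve_checkpoint(name_or_path: str) -> str:
    """Roughly how from_pretrained classifies its argument (simplified sketch)."""
    if os.path.isdir(name_or_path):
        if os.path.isfile(os.path.join(name_or_path, "config.json")):
            return "local"  # loadable without any network access
        raise OSError(
            f"{name_or_path} is not the path to a directory "
            "containing a config.json file."
        )
    return "hub"  # a repo id like "BAAI/AltCLIP"; requires reaching huggingface.co
```

So one workaround for the connection error is to fetch the checkpoint once on a machine with access (e.g. `git clone https://huggingface.co/BAAI/AltCLIP`, with git-lfs installed) and pass the local folder to from_pretrained instead of the repo id.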

@susht3
Author

susht3 commented Aug 14, 2023

Adding auth_token=true doesn't help either:
Traceback (most recent call last):
  File "/anaconda3/envs/py3/lib/python3.8/site-packages/transformers/configuration_utils.py", line 616, in _get_config_dict
    resolved_config_file = cached_path(
  File "/anaconda3/envs/py3/lib/python3.8/site-packages/transformers/utils/hub.py", line 284, in cached_path
    output_path = get_from_cache(
  File "anaconda3/envs/py3/lib/python3.8/site-packages/transformers/utils/hub.py", line 494, in get_from_cache
    raise EnvironmentError("You specified use_auth_token=True, but a huggingface token was not found.")
OSError: You specified use_auth_token=True, but a huggingface token was not found.
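BAAI/AltCLIP is a public checkpoint, so use_auth_token shouldn't be needed at all. The error fires because no token is cached locally; in transformers versions of this era the token was read from ~/.huggingface/token, written by `huggingface-cli login` (newer huggingface_hub releases cache it elsewhere). A small sketch of that lookup (hypothetical helper, not the library's code):

```python
import os
from typing import Optional

def find_cached_token(base_dir: Optional[str] = None) -> Optional[str]:
    """Return the cached Hugging Face token, or None if no login was ever done."""
    base = base_dir if base_dir is not None else os.path.expanduser("~")
    token_path = os.path.join(base, ".huggingface", "token")
    if os.path.isfile(token_path):
        with open(token_path) as fh:
            return fh.read().strip() or None
    # None here is what triggers "use_auth_token=True, but a huggingface
    # token was not found" in the traceback above
    return None
```

The simplest fix is to drop use_auth_token entirely for a public repo, or to run `huggingface-cli login` once before loading.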

@920232796
Contributor

[screenshot] This error is caused by the transformers version; try downgrading it, e.g. to 4.19.x.

@susht3
Author

susht3 commented Sep 18, 2023

[screenshot] This error is caused by the transformers version; try downgrading it, e.g. to 4.19.x.

Neither 4.20 nor 4.19 works; the download still fails with: OSError: We couldn't connect to 'https://huggingface.co' to load this model, couldn't find it in the cached files and it looks like BAAI/AltCLIP is not the path to a directory containing a config.json file.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'

@qingfengcss

Has this issue been resolved? I ran into the same problem when testing loading through transformers.


3 participants