Information

Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)

Reproduction
I have cloned text-generation-webui on my M2 MacBook Pro and am trying to install a model from Hugging Face in it. I am following the steps and have chosen Transformers as the model loader, but I am getting this error:
```
23:48:52-670995 INFO     TRANSFORMERS_PARAMS=
{'low_cpu_mem_usage': True, 'torch_dtype': torch.float16}

23:48:52-672915 ERROR    Failed to load the model.
Traceback (most recent call last):
  File "/Users/rob/IA/text-generation-webui/modules/ui_model_menu.py", line 232, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/modules/models.py", line 93, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/modules/models.py", line 172, in huggingface_loader
    model = LoaderClass.from_pretrained(path_to_model, **params)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/installer_files/env/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/installer_files/env/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3652, in from_pretrained
    hf_quantizer = AutoHfQuantizer.from_config(config.quantization_config, pre_quantized=pre_quantized)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/installer_files/env/lib/python3.11/site-packages/transformers/quantizers/auto.py", line 148, in from_config
    return target_cls(quantization_config, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/installer_files/env/lib/python3.11/site-packages/transformers/quantizers/quantizer_gptq.py", line 47, in __init__
    from optimum.gptq import GPTQQuantizer
ModuleNotFoundError: No module named 'optimum'
```
Expected behavior
The model should load fine.
Thank you for your help. I'm pretty new with AI tools and any help is appreciated.
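For context, the last frame of the traceback shows that Transformers' GPTQ quantizer does `from optimum.gptq import GPTQQuantizer`, so loading a GPTQ-quantized model fails when the `optimum` package is not installed in the webui's bundled environment. A small diagnostic sketch (not part of the original report; the `has_module` helper is just for illustration) that checks whether the relevant packages are importable:

```python
import importlib.util


def has_module(name: str) -> bool:
    """Return True if the top-level module `name` can be imported
    in the current environment (without actually importing it)."""
    return importlib.util.find_spec(name) is not None


if __name__ == "__main__":
    # The GPTQ code path in transformers needs `optimum`;
    # if it reports MISSING here, from_pretrained will raise
    # ModuleNotFoundError exactly as in the traceback above.
    for pkg in ("transformers", "optimum"):
        print(f"{pkg}: {'installed' if has_module(pkg) else 'MISSING'}")
```

Running this inside `installer_files/env` (the Python 3.11 environment shown in the traceback paths) would confirm whether installing `optimum` into that environment is the missing step.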
System Info
transformers version: 4.46.2

Who can help?
No response