
ModuleNotFoundError: No module named 'optimum' when loading model in TextGenerationWebUI #34765

Robtles opened this issue Nov 17, 2024 · 0 comments

System Info

  • transformers version: 4.46.2
  • Platform: macOS-15.1-arm64-arm-64bit-Mach-O
  • Python version: 3.13.0
  • Huggingface_hub version: 0.26.2
  • Safetensors version: 0.4.5
  • Accelerate version: not installed
  • Accelerate config: not found
  • PyTorch version (GPU?): not installed (NA)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

I have cloned text-generation-webui on my M2 MacBook Pro and am trying to load a model from Hugging Face with it. I followed the setup steps and chose Transformers as the model loader, but I get this error:

23:48:52-670995 INFO     TRANSFORMERS_PARAMS=                                                                                    
{'low_cpu_mem_usage': True, 'torch_dtype': torch.float16}

23:48:52-672915 ERROR    Failed to load the model.                                                                               
Traceback (most recent call last):
  File "/Users/rob/IA/text-generation-webui/modules/ui_model_menu.py", line 232, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/modules/models.py", line 93, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/modules/models.py", line 172, in huggingface_loader
    model = LoaderClass.from_pretrained(path_to_model, **params)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/installer_files/env/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/installer_files/env/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3652, in from_pretrained
    hf_quantizer = AutoHfQuantizer.from_config(config.quantization_config, pre_quantized=pre_quantized)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/installer_files/env/lib/python3.11/site-packages/transformers/quantizers/auto.py", line 148, in from_config
    return target_cls(quantization_config, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/IA/text-generation-webui/installer_files/env/lib/python3.11/site-packages/transformers/quantizers/quantizer_gptq.py", line 47, in __init__
    from optimum.gptq import GPTQQuantizer
ModuleNotFoundError: No module named 'optimum'
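The traceback ends in `quantizer_gptq.py` importing `optimum.gptq`, which suggests the selected checkpoint is GPTQ-quantized and that transformers needs the `optimum` package to load it. A minimal sketch for checking whether the required packages are importable in the webui's environment (the `auto_gptq` name is an assumption based on optimum's usual GPTQ backend, not something stated in the traceback):

```python
import importlib.util

def has_module(name: str) -> bool:
    """Return True if the named module can be imported in this environment."""
    return importlib.util.find_spec(name) is not None

# transformers' GPTQ quantizer does `from optimum.gptq import GPTQQuantizer`
# at model-load time, so a GPTQ-quantized checkpoint cannot load without it.
# `auto_gptq` is an assumed companion dependency; verify for your setup.
for name in ("optimum", "auto_gptq"):
    status = "installed" if has_module(name) else "MISSING (pip install?)"
    print(f"{name}: {status}")
```

Running this inside the webui's `installer_files/env` Python would show whether installing `optimum` into that specific environment (rather than the system Python) is the missing step.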

Expected behavior

The model should load without errors.

Thank you for your help. I'm pretty new to AI tools and any help is appreciated.

Robtles added the bug label Nov 17, 2024