
server.py run distutils error. #69

Open
TYeniyayla opened this issue Dec 9, 2024 · 1 comment

@TYeniyayla
When I run the server.py file with the python server.py command, I get the following error. How do I fix it?

Traceback (most recent call last):
  File "H:\ExLLama\ExUI\exui\server.py", line 11, in <module>
    from backend.models import update_model, load_models, get_model_info, list_models, remove_model, load_model, unload_model, get_loaded_model
  File "H:\ExLLama\ExUI\exui\backend\models.py", line 5, in <module>
    from exllamav2 import(
  File "H:\pinokio\bin\miniconda\lib\site-packages\exllamav2\__init__.py", line 3, in <module>
    from exllamav2.model import ExLlamaV2
  File "H:\pinokio\bin\miniconda\lib\site-packages\exllamav2\model.py", line 33, in <module>
    from exllamav2.config import ExLlamaV2Config
  File "H:\pinokio\bin\miniconda\lib\site-packages\exllamav2\config.py", line 5, in <module>
    from exllamav2.stloader import STFile, cleanup_stfiles
  File "H:\pinokio\bin\miniconda\lib\site-packages\exllamav2\stloader.py", line 5, in <module>
    from exllamav2.ext import none_tensor, exllamav2_ext as ext_c
  File "H:\pinokio\bin\miniconda\lib\site-packages\exllamav2\ext.py", line 276, in <module>
    exllamav2_ext = load \
  File "H:\pinokio\bin\miniconda\lib\site-packages\torch\utils\cpp_extension.py", line 1314, in load
    return _jit_compile(
  File "H:\pinokio\bin\miniconda\lib\site-packages\torch\utils\cpp_extension.py", line 1721, in _jit_compile
    _write_ninja_file_and_build_library(
  File "H:\pinokio\bin\miniconda\lib\site-packages\torch\utils\cpp_extension.py", line 1833, in _write_ninja_file_and_build_library
    _run_ninja_build(
  File "H:\pinokio\bin\miniconda\lib\site-packages\torch\utils\cpp_extension.py", line 2081, in _run_ninja_build
    vc_env = distutils._msvccompiler._get_vc_env(plat_spec)
AttributeError: module 'distutils' has no attribute '_msvccompiler'. Did you mean: 'ccompiler'?
@turboderp
Member

It is attempting to compile the C++/CUDA extension for ExLlamaV2, and on Windows that requires MSVC and the CUDA Toolkit to be installed. You can install a prebuilt wheel of ExLlamaV2 instead, from here.
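For context, this particular AttributeError is commonly triggered by newer setuptools releases (74 and later), whose bundled distutils no longer ships the private `_msvccompiler` module that `torch.utils.cpp_extension` reaches for when building on Windows. A minimal probe for that condition, assuming nothing beyond the standard import machinery (the helper name is hypothetical, not part of torch or ExLlamaV2):

```python
import importlib


def has_msvc_hook() -> bool:
    """Report whether distutils._msvccompiler is importable here.

    torch.utils.cpp_extension relies on this private module to locate
    the MSVC build environment on Windows; if it is missing, JIT
    compilation of the extension fails with the AttributeError above.
    """
    try:
        importlib.import_module("distutils._msvccompiler")
    except Exception:
        return False
    return True


print(has_msvc_hook())
```

If the probe prints False, the usual reported workarounds are installing a prebuilt ExLlamaV2 wheel (as suggested above) or pinning setuptools below 74 so the module is available again.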
