Same problem here on CentOS 7.9: RuntimeError: Failed to load shared library '/root/anaconda3/envs/xinference/lib/python3.11/site-packages/llama_cpp/lib/libllama.so': /lib64/libc.so.6: version `GLIBC_2.32' not found (required by /root/anaconda3/envs/xinference/lib/python3.11/site-packages/llama_cpp/lib/libllama.so)
System Info
Ubuntu 20.04 LTS
CUDA 12.4
vllm 0.5.3.post1
vllm-flash-attn 2.5.9.post1
sentence-transformers 3.0.1
transformers 4.43.3
transformers-stream-generator 0.0.5
llama-cpp-python 0.2.82
Running Xinference with Docker? No (xinference-local is run directly in a conda environment).
Version info
xinference 0.14
The command used to start Xinference
XINFERENCE_HOME=/data/models/xinference/models XINFERENCE_MODEL_SRC=modelscope HF_ENDPOINT=https://hf-mirror.com CUDA_VISIBLE_DEVICES=0 xinference-local --host 0.0.0.0 --port 9997 --auth-config /data/models/xinference/auth_config.json --log-level debug
Reproduction
Install llama_cpp_python 0.2.82, downloaded from https://abetlen.github.io/llama-cpp-python/whl/cu124/llama-cpp-python/:
CMAKE_ARGS="-DLLAMA_CUDA=on" pip install llama_cpp_python-0.2.82-cp310-cp310-linux_x86_64.whl
Start xinference:
XINFERENCE_HOME=/data/models/xinference/models XINFERENCE_MODEL_SRC=modelscope HF_ENDPOINT=https://hf-mirror.com CUDA_VISIBLE_DEVICES=0 xinference-local --host 0.0.0.0 --port 9997 --auth-config /data/models/xinference/auth_config.json --log-level debug
Xinference then fails with the following error:
Traceback (most recent call last):
  File "/data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 75, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
  File "/data/program/miniconda3/envs/xinference/lib/python3.10/ctypes/__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/llama_cpp/lib/libllama.so)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/data/program/miniconda3/envs/xinference/bin/xinference-local", line 5, in <module>
    from xinference.deploy.cmdline import local
  File "/data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/xinference/__init__.py", line 37, in <module>
    _install()
  File "/data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/xinference/__init__.py", line 34, in _install
    install_model()
  File "/data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/xinference/model/__init__.py", line 17, in _install
    from .llm import _install as llm_install
  File "/data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/xinference/model/llm/__init__.py", line 20, in <module>
    from .core import (
  File "/data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/xinference/model/llm/core.py", line 26, in <module>
    from ...types import PeftModelConfig
  File "/data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/xinference/types.py", line 399, in <module>
    from llama_cpp import Llama
  File "/data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 88, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 77, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/llama_cpp/lib/libllama.so': /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/llama_cpp/lib/libllama.so)
(xinference) root@10-60-176-5:/data/models/xinference/tmp#
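The error means the bundled libllama.so was compiled against a newer glibc than the OS provides. This can be confirmed by comparing the two; a minimal diagnostic sketch (assuming standard Linux tooling is available — the objdump command is shown as a comment since it needs the libllama.so path from the traceback above):

```shell
# glibc version the OS itself provides. Ubuntu 20.04 ships glibc 2.31 and
# CentOS 7.9 ships 2.17 -- both older than the required GLIBC_2.32.
ldd --version | head -n1

# To list the GLIBC symbol versions a shared object actually requires,
# run (with binutils installed), e.g.:
#   objdump -T /data/program/miniconda3/envs/xinference/lib/python3.10/site-packages/llama_cpp/lib/libllama.so \
#     | grep -o 'GLIBC_2\.[0-9]*' | sort -Vu
```

If the highest version printed by objdump exceeds the version reported by ldd, the wheel cannot load on that system.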
Expected behavior
After uninstalling llama_cpp_python, Xinference starts normally, but then llama.cpp models cannot use the GPU.
Hoping for a fix!
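Note that CMAKE_ARGS has no effect when installing a prebuilt wheel, so the cu124 wheel's binaries keep their original glibc requirement. A common workaround for this class of mismatch is to build llama-cpp-python from source so libllama.so is linked against the local glibc. A sketch, assuming the CUDA toolkit and a C/C++ compiler are installed locally (the CUDA CMake flag was renamed across llama-cpp-python releases, so check the flag name for your version):

```shell
# Build from the sdist instead of installing the prebuilt cu124 wheel, so
# the native libraries are compiled against the system's own glibc.
# The CUDA flag name varies by release (-DLLAMA_CUBLAS=on, -DLLAMA_CUDA=on,
# or -DGGML_CUDA=on); adjust for the installed version.
CMAKE_ARGS="-DLLAMA_CUDA=on" pip install llama-cpp-python==0.2.82 \
  --no-binary llama-cpp-python --force-reinstall --no-cache-dir
```

Alternatively, upgrading to a distribution whose glibc is at least 2.32 (or running Xinference in a Docker image with a newer base) would let the prebuilt wheel load.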