This repository has been archived by the owner on Oct 25, 2024. It is now read-only.
I have succeeded in running ./chatglm3-6b with intel-extension-for-transformers on my laptop, but when I try to use Neural Chat to run the same model (./chatglm3-6b), it fails with:
"Process finished with exit code 137 (interrupted by signal 9: SIGKILL)"
Code:
from intel_extension_for_transformers.neural_chat import build_chatbot, PipelineConfig
from intel_extension_for_transformers.transformers import RtnConfig
config = PipelineConfig(
    model_name_or_path='./chatglm3-6b',
    optimization_config=RtnConfig(
        bits=4,
        compute_dtype="int8",
        weight_dtype="int4_fullrange"
    )
)
chatbot = build_chatbot(config)
response = chatbot.predict(query="Hi")
CPU: I7-13700H
Memory: 16G
Ubuntu 22.04
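For context on the failure: exit code 137 means the process was killed by signal 9 (137 = 128 + 9), which on Linux usually comes from the kernel OOM killer. Neural Chat likely loads the full fp16 checkpoint into RAM before applying RTN quantization, and fp16 weights for a 6B-parameter model alone take roughly 12 GB, close to this machine's 16 GB total. A back-of-the-envelope sketch (the 6B parameter count is an assumption based on the model name, and activations, KV cache, and framework overhead are not included):

```python
# Rough sketch: estimate resident memory needed just to hold model weights.
# N_PARAMS is assumed from the "chatglm3-6b" model name, not measured.

def weight_bytes(n_params: int, bits_per_weight: int) -> int:
    """Bytes needed to store n_params weights at the given precision."""
    return n_params * bits_per_weight // 8

N_PARAMS = 6_000_000_000  # assumed from "chatglm3-6b"

fp16_gb = weight_bytes(N_PARAMS, 16) / 1e9   # full-precision load
int4_gb = weight_bytes(N_PARAMS, 4) / 1e9    # after RTN 4-bit quantization

print(f"fp16 weights: ~{fp16_gb:.1f} GB")   # ~12.0 GB
print(f"int4 weights: ~{int4_gb:.1f} GB")   # ~3.0 GB
```

If peak usage during loading exceeds free RAM plus swap, the kernel sends SIGKILL, which matches the symptom above; the quantized weights themselves would fit comfortably once conversion succeeds.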
Q1: How should Neural Chat be configured so that it works with chatglm3-6b?
Q2: How can a server API be implemented on top of the Q4 quantized model bin file (ne_chatglm2_q_nf4_bestla_cfp32_g32.bin)?
Thanks a lot.