Model crashes after a while #54

Open
yankluf opened this issue Jan 15, 2025 · 0 comments

Comments

@yankluf

yankluf commented Jan 15, 2025

Model:
deepseek-coder-v2:16b-lite-instruct-q4_K_M
(From Ollama repo)

GPU:
NVIDIA RTX3060 (12GB)

Ollama version:
0.5.5

UI:
Open-WebUI 0.5.4

Description:
After a few prompts, DeepSeek Coder V2 (Lite) stops generating mid-response and the model stops running (GPU usage drops to 0%). If I ask it to continue the answer, the model re-initializes but crashes again almost instantly; sometimes it emits random text about algebra for a couple of seconds before going down.

Ollama log:

llm_load_vocab: control-looking token: 100002 '<|fim▁hole|>' was not control-type; this is probably a bug in the model. its type will be overridden
llm_load_vocab: control-looking token: 100004 '<|fim▁end|>' was not control-type; this is probably a bug in the model. its type will be overridden
llm_load_vocab: control-looking token: 100003 '<|fim▁begin|>' was not control-type; this is probably a bug in the model. its type will be overridden

Relevant info:
Other users appear to be experiencing the same or a similar issue:
https://www.reddit.com/r/SillyTavernAI/comments/1hzuzrf/deepseek_on_openrouter_stops_responding_after/
