
A warning: Using this model may not work as intended #34

Open
AGenchev opened this issue Aug 13, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@AGenchev

The new version is much better: it started working easily, with Vulkan offloading.
The local mode prints a warning:
[node-llama-cpp] Using this model ("~/.humanifyjs/models/Phi-3.1-mini-4k-instruct-Q4_K_M.gguf") to tokenize text with special tokens and then detokenize it resulted in a different text. There might be an issue with the model or the tokenizer implementation. Using this model may not work as intended

Should I ignore this warning, or is it unexpected and I should investigate what is happening?
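For context, the warning comes from a round-trip sanity check: node-llama-cpp tokenizes a probe string that contains special tokens, detokenizes the result, and compares it to the original text; a mismatch produces exactly this message. The sketch below illustrates the idea with a purely hypothetical toy tokenizer (the token ids, special-token strings, and function names are invented for illustration and are not node-llama-cpp's actual implementation):

```typescript
// Toy illustration of the round-trip check behind the warning.
// SPECIAL_TOKENS and the id scheme here are hypothetical; the real
// check runs against the tokenizer of the loaded GGUF model.

const SPECIAL_TOKENS: Record<string, number> = { "<|user|>": 1, "<|end|>": 2 };

function tokenize(text: string): number[] {
  const tokens: number[] = [];
  let i = 0;
  outer: while (i < text.length) {
    // Match special tokens first, as a real tokenizer would when
    // special-token parsing is enabled.
    for (const [tok, id] of Object.entries(SPECIAL_TOKENS)) {
      if (text.startsWith(tok, i)) {
        tokens.push(id);
        i += tok.length;
        continue outer;
      }
    }
    // Ordinary characters get ids offset past the special range.
    tokens.push(256 + text.charCodeAt(i));
    i += 1;
  }
  return tokens;
}

function detokenize(tokens: number[]): string {
  const inverse = new Map(
    Object.entries(SPECIAL_TOKENS).map(([tok, id]) => [id, tok])
  );
  // A buggy tokenizer might render a special token differently here
  // (e.g. add a stray space); that is the mismatch the warning flags.
  return tokens
    .map((t) => inverse.get(t) ?? String.fromCharCode(t - 256))
    .join("");
}

function roundTripOk(text: string): boolean {
  // This is the comparison that, when it fails, triggers the warning.
  return detokenize(tokenize(text)) === text;
}

console.log(roundTripOk("<|user|>hello<|end|>")); // true for this toy tokenizer
```

With a correct tokenizer the round trip is lossless and the check passes; when a model's special-token metadata doesn't detokenize back to the original text, the library warns because chat templates built from those tokens may misbehave.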

@jehna
Owner

jehna commented Aug 13, 2024

Yes, this warning is a known issue for now. The model should work fine, at least in my testing. I'll check later if I can find a fix for this.

@jehna
Owner

jehna commented Aug 13, 2024

I'll keep this issue open for now so I don't forget to fix it.

@jehna jehna reopened this Aug 13, 2024
@jehna jehna added the bug Something isn't working label Aug 13, 2024

2 participants