
Error with 4bit-128g model loading #439

Open
Magnaderra opened this issue Aug 19, 2023 · 4 comments
@Magnaderra
Hi all!

When I load this model (from a local directory): https://huggingface.co/OccamRazor/pygmalion-6b-gptq-4bit,
I get this error:

FileNotFoundError: [Errno 2] No such file or directory: 'C:\\KoboldAI\\models\\pygmalion-6b-gptq-4bit\\quantize_config.json'

The full bug report is here: Bug_Report.txt

I previously used KoboldAI's fork by 0cc4m, and there was no such error there.

Here is the debug dump: kobold_debug.zip

@henk717
Owner

henk717 commented Aug 19, 2023

Did you select occam gptq as the GPTQ backend? AutoGPTQ needs an extra file.
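For reference, the "extra file" AutoGPTQ expects is the `quantize_config.json` named in the error, stored next to the model weights. A minimal sketch of what that file might look like for a 4-bit, group-size-128 checkpoint like this one (field names follow AutoGPTQ's quantization config; the exact values for this particular model are assumptions):

```json
{
  "bits": 4,
  "group_size": 128,
  "desc_act": false
}
```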

@Magnaderra
Author

Did you select occam gptq as the GPTQ backend? Because for autogptq you need an extra file.

Yes, I have occam gptq selected.

@henk717
Owner

henk717 commented Aug 19, 2023

Traced this back to the newer transformers version we are using compared to occam's old branch.
We will have to find a way to handle this old model; for the time being, you have two options.
You can downgrade to the old transformers by opening the KoboldAI command prompt and typing: pip install transformers==4.28
This method will harm your model compatibility but restores this old model. Alternatively, you can use a different version of the model, such as the one in the chat model menu.
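A small sketch of how you might check whether the installed transformers version is new enough to hit this problem. The 4.28 cutoff is an assumption taken from the maintainer's suggested pin, and `needs_quantize_config` is a hypothetical helper, not part of KoboldAI:

```python
from importlib.metadata import PackageNotFoundError, version


def needs_quantize_config(ver: str) -> bool:
    """Return True if this transformers version is assumed to expect
    quantize_config.json alongside old GPTQ checkpoints (newer than 4.28)."""
    major, minor = (int(x) for x in ver.split(".")[:2])
    return (major, minor) > (4, 28)


def check_installed() -> None:
    """Print whether the installed transformers likely needs the extra file."""
    try:
        ver = version("transformers")
    except PackageNotFoundError:
        print("transformers is not installed")
        return
    if needs_quantize_config(ver):
        print(f"transformers {ver} may require quantize_config.json for old GPTQ models")
    else:
        print(f"transformers {ver} should load the old checkpoint as-is")
```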

@Magnaderra
Author

pip install transformers==4.28

Okay, so far that's worked.
