Launching privateGPT...
llama.cpp: loading model from models/ggml-vic7b-q4_0.bin
error loading model: unexpectedly reached end of file
llama_load_model_from_file: failed to load model
Traceback (most recent call last):
  File "C:\TCHT\privateGPT\privateGPT.py", line 83, in <module>
    main()
  File "C:\TCHT\privateGPT\privateGPT.py", line 36, in main
    llm = LlamaCpp(model_path=model_path, max_tokens=model_n_ctx, n_batch=model_n_batch, callbacks=callbacks, verbose=False)
  File "C:\ProgramData\miniconda3\envs\pgpt\lib\site-packages\langchain\load\serializable.py", line 74, in __init__
    super().__init__(**kwargs)
  File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for LlamaCpp
__root__
  Could not load Llama model from path: models/ggml-vic7b-q4_0.bin. Received error (type=value_error)
Exception ignored in: <function Llama.__del__ at 0x000001A4A7009B40>
Traceback (most recent call last):
  File "C:\ProgramData\miniconda3\envs\pgpt\lib\site-packages\llama_cpp\llama.py", line 1445, in __del__
    if self.ctx is not None:
AttributeError: 'Llama' object has no attribute 'ctx'
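A common cause of llama.cpp's "unexpectedly reached end of file" is an interrupted or incomplete model download, which then surfaces through LangChain as the pydantic `ValidationError` above. Before retrying, it can help to sanity-check the file on disk. A minimal sketch (the path is taken from the log; the size threshold and the `looks_complete` helper are assumptions for illustration, not part of privateGPT):

```python
import os

MODEL_PATH = "models/ggml-vic7b-q4_0.bin"  # path from the traceback above

def looks_complete(path: str, min_bytes: int = 1024 * 1024 * 1024) -> bool:
    """Heuristic check: the quantized 7B model should be several GiB on disk;
    a missing or much smaller file almost certainly means the download
    was interrupted and should be fetched again."""
    if not os.path.isfile(path):
        print(f"missing: {path}")
        return False
    size = os.path.getsize(path)
    if size < min_bytes:
        print(f"suspiciously small ({size} bytes): {path} -- re-download it")
        return False
    return True

if looks_complete(MODEL_PATH):
    print("model file looks plausible; retry launching privateGPT")
```

If the check fails, deleting the partial file and re-downloading it (then comparing the size against the one published for the model) usually resolves this particular error.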
I also received this error during the installation:

Use Conda (y/n) [Default: y]:
C:\Users\ashak\Documents\WindowsPowerShell\profile.ps1 : The term
'C:\Users\ashak\Documents\WindowsPowerShell\profile.ps1' is not recognized as the name of a cmdlet, function,
script file, or operable program. Check the spelling of the name, or if a path was included, verify that the
path is correct and try again.
At line:1 char:1