Updated llama-cpp-python (and its dependencies with env variables) #24
base: main
Conversation
@SnehalSwadhin This represents a major change to the existing dependencies. Are you certain it is necessary for your solution? If not, it will take us some time to determine whether all of these changes are compatible with solutions that worked in the previous runtime. This is a really impressive deep dive into advanced package installation with Pixi 👏
I was able to get results with structured_output in langchain pretty easily with ollama (it took about a week). Using llama.cpp alone has been a pain for more than a month now. I actually gave up at some point😅, but came back today to give it one last try. Because it works on my device, it's very difficult to debug. And I was not able to find any other fix for: `Llama.create_chat_completion() got an unexpected keyword argument 'logprobs'`. Anyway, it was fun figuring out something new🙌
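For context, the structured-output pattern mentioned above can be sketched as follows. This is an illustrative assumption, not code from the PR: the schema, model name, and prompt are made up, and the `ChatOllama` call is left as a comment since it needs a running Ollama server.

```python
from typing import TypedDict

# Hypothetical output schema for structured responses (not from the PR).
class Answer(TypedDict):
    label: str
    confidence: float

# With a running Ollama server, the schema would be bound to the chat model, e.g.:
#   from langchain_ollama import ChatOllama
#   llm = ChatOllama(model="llama3").with_structured_output(Answer)
#   result = llm.invoke("Classify this text: ...")  # dict matching Answer
# Here we only validate a sample payload to show the schema shape in isolation.
sample: Answer = {"label": "positive", "confidence": 0.9}
print(sample["label"])
```

With an older llama-cpp-python backend, the equivalent `with_structured_output` call is what triggered the `logprobs` error described below.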
At this point I just want to see one of my submissions run successfully. Though the sad part is that it probably does not score as high as some submissions on the leaderboard.
If the change would be important to your solution we can certainly consider the PR, just let me know! I also enjoyed reading your PR -- this is the first competition for which we've used Pixi to manage dependencies, and your code uses a lot of Pixi functionality that I hadn't seen before. FYI, there is a second DrivenData competition that uses the same data, but it is an unsupervised challenge that does not rely on code execution for submissions. If you're interested, you could experiment with some of these methods for that challenge.
I've got a teammate working on the second track. |
@klwetstone Although I managed to bypass the `logprobs` error, invoking any LLM with structured_output still does not return a valid result. I'd request you to try updating the library.
Updated llama-cpp-python to 0.3.1 using PyPI.
Conda's latest version was 0.2.24. With that version I was not able to run any ChatModel in langchain (I was getting the error `Llama.create_chat_completion() got an unexpected keyword argument 'logprobs'`).
I had to add a few environment variables and dependencies to update llama-cpp-python.
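As a rough illustration of what such a change might look like in a Pixi manifest (this is a hypothetical sketch, not the PR's actual diff; the CMake flag and table names are assumptions based on Pixi's manifest format):

```toml
# Hypothetical pixi.toml fragment — not the PR's actual configuration.
[pypi-dependencies]
# PyPI carries newer releases than conda-forge (which was at 0.2.24)
llama-cpp-python = "==0.3.1"

[activation.env]
# Assumed build-time variable forwarded to llama.cpp's CMake build;
# the PR's actual environment variables may differ.
CMAKE_ARGS = "-DGGML_NATIVE=OFF"
```

Pulling the package from `pypi-dependencies` instead of the conda channel is what makes the newer 0.3.x releases reachable at all, since the conda packaging lags behind PyPI.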