Cannot bypass the OPENAI_API_KEY requirement when using local server #6
Try this: edit the config with `config :langchain, :openai_key, "UNUSED"`. That should do it. My other guess is that it was trying to connect to the actual OpenAI API, and the errors requiring the key may have been coming from there.
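For reference, a minimal sketch of that override in `config/config.exs` (the `"UNUSED"` value is just a placeholder to satisfy the key check; the local server ignores it):

```elixir
# config/config.exs
import Config

# Placeholder API key: a local OpenAI-compatible server does not
# validate it, but the client library expects a value to be set.
config :langchain, :openai_key, "UNUSED"
```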
Also, make sure you're using the latest version of the library; see https://github.com/brainlid/langchain?tab=readme-ov-file#alternative-openai-compatible-apis
Adding "UNUSED" to the config.exs file still doesn't bypass it. This is so odd. The error message simply states:
There is some code that is forcing an API key to be included. In the get_api_key function, I see your comments:
Could this be the cause of rejecting a null value? By the way, I'm adding the endpoint to chat_open_ai.ex as follows:
I opted to do it this way because I don't understand which file I'm supposed to add alternate endpoints to per your updated langchain v0.15:
Ha! The "Stripe API error" message was there because I copied the initial API key management code from the Elixir Stripe client library. I updated the comment to remove the Stripe reference. From what I can tell, the error message is coming from OpenAI, meaning the langchain library is not raising the error. Instead, an API request is being made to OpenAI, and they are rejecting the call because an API key is required. Depending on how you're doing the field overrides, it probably isn't working. The demo project has been updated to use a newer version of the langchain library; ensure you are using v0.1.6 or later, since a recent fix was needed to make the override work. With that done, you can test out the override in the two following places:
In those code locations, the LLMChain is set up and the OpenAI model is configured. You'd add the endpoint override there. Hope that helps!
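As a rough sketch of what that override might look like (the field names are assumed from langchain v0.1.6's ChatOpenAI module, and the localhost URL is LM Studio's default server address, which may differ in your setup):

```elixir
alias LangChain.ChatModels.ChatOpenAI

# Point the chat model at the local LM Studio server instead of
# api.openai.com. Overriding the endpoint this way requires
# langchain v0.1.6 or later.
llm =
  ChatOpenAI.new!(%{
    endpoint: "http://localhost:1234/v1/chat/completions",
    model: "mistral-7b-instruct",
    stream: false
  })
```

This model struct is then passed into the LLMChain setup in place of the default OpenAI-backed one.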
SUCCESS! Updating to v0.1.6 and placing the overrides in those two locations resolved the error message. I'm now having conversations locally using Mistral 7B Instruct.
Awesome @18a93a664c! I'd love to hear how well it does at running functions. The physical fitness agent has three functions it uses.
Glad it's working for you!
The fitness agent doesn't record anything with this setup. I'm entering the same responses as you did in your video, but I don't receive the "recorded" responses from the agent.
Thanks for trying that out and letting me know! Functions are really hard (like impossible) to get working right when the LLM doesn't specifically support them. I'm not surprised they don't work, but I am a little bummed. I was hoping they had figured out a way to do it with a general model like Mistral. |
This is a continuation from my previous issue...
I'm using a local server setup with the LMStudio app. This dev environment does not require an OpenAI key; they instruct you to leave that field blank.
I created a .env file and didn't assign a value to OPENAI_API_KEY, but I still get an error message stating it needs to be set.
I have also commented out every line of code that references the api_key, and I still get the same error.
How do I bypass the need for an api key?