Support APIs other than OpenAI or Gemini? #30
Comments
As long as it is listed here, I can try to support it. |
@hanxiao please add DeepSeek! |
Suppose we can get Ollama; that opens up many possibilities. Adding closed-source APIs does not add value, as Gemini and "Open"AI are more than sufficient. I only mention it because there is a Community Provider in the link you listed, @hanxiao. However, I can't determine whether that would be a difficult or a simple implementation. And, of course, it becomes "use at your own risk", because who knows whether every model on Ollama would work; I assume only a handful might. |
Ollama is supported already, btw; however, we need to find some LLMs that are good at structured output (see the sketch below). |
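For context, a minimal sketch of the kind of structured-output call this implies, assuming Ollama's OpenAI-compatible endpoint; the model name, prompt, and placeholder key are illustrative, not from this thread:

```typescript
import OpenAI from "openai";

// Ollama exposes an OpenAI-compatible API; the key just has to be non-empty.
const client = new OpenAI({
  apiKey: "ollama", // placeholder, assuming no auth on the local server
  baseURL: "http://localhost:11434/v1", // Ollama's default OpenAI-compatible endpoint
});

async function main() {
  const res = await client.chat.completions.create({
    model: "qwen2.5", // illustrative; any locally pulled model
    messages: [
      { role: "system", content: 'Reply with JSON only, e.g. {"answer": "..."}' },
      { role: "user", content: "What is the capital of France?" },
    ],
    // JSON mode; weaker local models may still drift from the requested shape,
    // which is exactly the "good at structured output" concern above.
    response_format: { type: "json_object" },
  });
  console.log(JSON.parse(res.choices[0].message.content ?? "{}"));
}

main();
```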
I think the latest Devin changes broke the system... it keeps getting stuck when running in server mode :( As a joke I asked it: "When is Donald Trump going to make America great again?" and it got stuck after Step 8 and stopped processing completely. It has gotten stuck on other open-ended questions as well since the Devin changes. |
I didn't use server mode. Did you test `npm run dev $question`? |
Tried running locally with Ollama/qwen2.5 but got this error after following the README: `if (LLM_PROVIDER === 'openai' && !OPENAI_API_KEY) throw new Error("OPENAI_API_KEY not found");`. I'll test that question in another shell using Gemini ("I didn't use server mode, did you test npm run dev $question?"). |
The server seems to be functioning fine. It might have been a network issue that was not handled properly for that single instance. Everything is looking good! |
Yeah, for a local LLM you still need to set OPENAI_API_KEY, but it can be any value unless your local LLM server has auth. I added this to the README. |
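A minimal sketch of that setup, assuming the project talks to the local server through the OpenAI SDK; the env variable names and base URL here are illustrative:

```typescript
import OpenAI from "openai";

// The guard only checks that OPENAI_API_KEY exists; for a local LLM the
// value is arbitrary unless the server itself enforces auth.
const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY ?? "sk-anything", // any non-empty placeholder
  baseURL: process.env.OPENAI_BASE_URL ?? "http://localhost:11434/v1", // e.g. Ollama
});
```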
Thanks for a quick response. Excited to try it out tomorrow!
|
Like, what if I use the API from POE?