
Support APIs other than OpenAI or Gemini? #30

Open
dutry opened this issue Feb 6, 2025 · 11 comments

Comments

@dutry

dutry commented Feb 6, 2025

Like, what if I use the API from Poe?

@hanxiao
Member

hanxiao commented Feb 6, 2025

As long as it is listed here, I can try to support it.

@dutry
Author

dutry commented Feb 6, 2025

@hanxiao please add DeepSeek!

@hanxiao
Member

hanxiao commented Feb 6, 2025

You mean deepseek-r1, right? Note that r1 does not support structured output, at least not via the Vercel AI SDK.

[screenshot: Vercel AI SDK error when requesting structured output from deepseek-r1]
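
For context, here is a minimal sketch of the kind of structured-output call that r1 rejects, assuming the AI SDK's generateObject together with the @ai-sdk/deepseek provider; the provider setup, model id, and schema below are illustrative, not this repo's actual code:

```typescript
import { generateObject } from 'ai';
import { createDeepSeek } from '@ai-sdk/deepseek';
import { z } from 'zod';

// Illustrative provider setup; DEEPSEEK_API_KEY is an assumed env var name.
const deepseek = createDeepSeek({ apiKey: process.env.DEEPSEEK_API_KEY });

// generateObject asks the model to return JSON matching a zod schema.
// This is the structured-output path that deepseek-r1 ('deepseek-reasoner')
// does not support, per the error above.
const { object } = await generateObject({
  model: deepseek('deepseek-reasoner'),
  schema: z.object({
    answer: z.string(),
    confidence: z.number(),
  }),
  prompt: 'Answer the question and rate your confidence from 0 to 1.',
});

console.log(object);
```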

@paul5007
Contributor

paul5007 commented Feb 6, 2025

Suppose we can get Ollama; that opens up many possibilities. Adding closed-source APIs does not add value, as Gemini and "Open"AI are more than sufficient.

I only mention it because there is a Community Provider in the link you listed, @hanxiao. However, I can't determine whether that would be a difficult or a simple implementation. And, of course, it becomes "use at your own risk", because who knows if every model on Ollama would work or not. I assume only a handful might.

@hanxiao
Member

hanxiao commented Feb 6, 2025

Ollama is supported already, btw. However, we need to find some LLMs that are good at structured output; see the sketch below.
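
A minimal sketch of what that looks like, assuming the community ollama-ai-provider package; the model id and schema are illustrative, and whether generateObject succeeds depends entirely on how well the local model follows a JSON schema:

```typescript
import { generateObject } from 'ai';
import { createOllama } from 'ollama-ai-provider'; // community provider
import { z } from 'zod';

// Point the provider at a local Ollama server (its default API endpoint).
const ollama = createOllama({ baseURL: 'http://localhost:11434/api' });

// Structured output against a local model. Models that are weak at
// following JSON schemas will fail here or return malformed objects,
// which is the caveat about finding LLMs good at structured output.
const { object } = await generateObject({
  model: ollama('qwen2.5'),
  schema: z.object({ answer: z.string() }),
  prompt: 'What is 2 + 2? Reply as JSON.',
});

console.log(object);
```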

@paul5007
Contributor

paul5007 commented Feb 6, 2025

I think the latest Devin changes broke the system... it keeps getting stuck when running in server mode :(

As a joke I asked it: "When is Donald Trump going to make America great again?" and it got stuck after Step 8 and stopped processing completely. It has also gotten stuck on other open-ended questions since the Devin changes.

@hanxiao
Member

hanxiao commented Feb 6, 2025

I didn't use server mode. Did you test npm run dev $question?

@paul5007
Contributor

paul5007 commented Feb 6, 2025

I tried running locally with Ollama/qwen2.5 but got this error after following the README:

```
if (LLM_PROVIDER === 'openai' && !OPENAI_API_KEY) throw new Error("OPENAI_API_KEY not found");
^
Error: OPENAI_API_KEY not found
```

I'll test that question in another shell using Gemini (in reply to "did you test npm run dev $question?").

@paul5007
Contributor

paul5007 commented Feb 6, 2025

The server seems to be functioning fine. It might have been a network issue that was not handled properly for that single instance. Everything is looking good!

@hanxiao
Member

hanxiao commented Feb 7, 2025

> Error: OPENAI_API_KEY not found

Yeah, for a local LLM you still need to set OPENAI_API_KEY, but it can be any value unless your local LLM server has auth; see the sketch below.

I added this to the README section.
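
A minimal sketch of that setup, assuming the AI SDK's OpenAI provider pointed at Ollama's OpenAI-compatible endpoint; the baseURL, model id, and fallback key value are illustrative:

```typescript
import { generateText } from 'ai';
import { createOpenAI } from '@ai-sdk/openai';

// For a local server without auth, the key only has to be non-empty to
// pass the OPENAI_API_KEY check above; the server itself ignores it.
const local = createOpenAI({
  baseURL: 'http://localhost:11434/v1', // Ollama's OpenAI-compatible endpoint
  apiKey: process.env.OPENAI_API_KEY ?? 'dummy',
});

const { text } = await generateText({
  model: local('qwen2.5'),
  prompt: 'Say hello.',
});

console.log(text);
```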

@paul5007
Contributor

paul5007 commented Feb 7, 2025 via email
