
Add ability to perform OpenAI API compatible calls? #96

Open
LankyPoet opened this issue Jun 28, 2024 · 2 comments

Comments

@LankyPoet

Hi,
I would love it if, instead of relying on llama-cpp-python, we could use any backend of our choosing, for instance LM Studio serving an API at http://localhost:1234/v1/chat/completions . Could that be added as an option, instead of having to load/unload models within ComfyUI itself? See the Plush nodes for one implementation of this.

Thank you!

@gokayfem
Owner

Thanks for the suggestion, I thought about this. Before llama-cpp-python I was actually planning to do this with LM Studio, but it requires installing another program, so I chose llama-cpp-python instead. But yes, I can add it to my nodes. I actually have OpenAI nodes, and they already accept a custom URL (I use them with the DeepSeek API the same way). All I need to do is expose the custom URL in the UI. I might add vision capability to that as well.
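For reference, here is a minimal sketch of what an OpenAI-compatible call against a custom base URL looks like, using only the Python standard library. This is illustrative, not the node's actual implementation: the `build_chat_request` helper and the `local-model` name are made up for the example, and the base URL assumes an LM Studio style server listening on localhost:1234.

```python
import json
from urllib import request

def build_chat_request(base_url, messages, model="local-model", temperature=0.7):
    """Build (but do not send) an OpenAI-compatible /chat/completions request."""
    payload = {"model": model, "messages": messages, "temperature": temperature}
    return request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # LM Studio ignores the key, but OpenAI-style servers expect the header.
            "Authorization": "Bearer not-needed",
        },
        method="POST",
    )

# Point at a local server instead of api.openai.com by swapping the base URL:
req = build_chat_request(
    "http://localhost:1234/v1",
    [{"role": "user", "content": "Hello!"}],
)
print(req.full_url)  # http://localhost:1234/v1/chat/completions

# Actually sending the request requires the local server to be running:
# with request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the request shape is the same for OpenAI, DeepSeek, LM Studio, etc., only the base URL (and, for hosted services, the API key) needs to change, which is why exposing the URL in the UI is enough.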

@LankyPoet
Author

Sounds amazing, thank you!
