Hi, guys
Since you already implement vendor_multimodal_api_key and vendor_multimodal_model_name, would you please add a new parameter, vendor_multimodal_api_base?
This is very useful for those who can only access these APIs from behind a proxy.
Thank you very much!
Hi @yangyu,
Can you explain your issue in more detail? What are you trying to do, and why doesn't it work?
In which part of the project do you want us to add this? The Python client, our frontend, or our backend?
Hi Sacha,
I would like this parameter to be added to the LlamaParse constructor so that I can use my own API endpoints.
For example: I access the OpenAI, Claude, and Gemini APIs through a proxy.
When I create a LlamaParse instance, I want to pass the proxy endpoint as the api_base of the multimodal LLM API.
For now the enhancement is needed in the backend of the Python client, but of course it would be great if you could implement it everywhere!
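To illustrate the request, here is a minimal, self-contained sketch of how such a parameter could behave. This is not LlamaParse code: the `MultimodalVendorConfig` class and `chat_completions_url` method are hypothetical, invented only to show that routing every request through a configurable `vendor_multimodal_api_base` (defaulting to the vendor's public endpoint) is enough to support proxy users.

```python
from dataclasses import dataclass

# Default OpenAI endpoint; a user behind a proxy would override this.
DEFAULT_OPENAI_API_BASE = "https://api.openai.com/v1"

@dataclass
class MultimodalVendorConfig:
    """Hypothetical config mirroring the existing vendor_multimodal_* parameters."""
    vendor_multimodal_api_key: str
    vendor_multimodal_model_name: str
    # Proposed new parameter: falls back to the vendor's public endpoint.
    vendor_multimodal_api_base: str = DEFAULT_OPENAI_API_BASE

    def chat_completions_url(self) -> str:
        # Because every request URL is built relative to api_base, pointing it
        # at a proxy reroutes all traffic with no other code changes.
        return self.vendor_multimodal_api_base.rstrip("/") + "/chat/completions"

# Direct access (default base URL):
direct = MultimodalVendorConfig("sk-...", "openai-gpt4o")
print(direct.chat_completions_url())
# → https://api.openai.com/v1/chat/completions

# Behind a proxy:
proxied = MultimodalVendorConfig(
    "sk-...", "openai-gpt4o",
    vendor_multimodal_api_base="https://proxy.example.com/openai/v1",
)
print(proxied.chat_completions_url())
# → https://proxy.example.com/openai/v1/chat/completions
```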
Best regards,
Yang Yu
Sacha Bron ***@***.***> wrote on Tue, Sep 24, 2024, at 18:44: