🔖 Feature description
I suggest adding a new langgraph.openai.baseUrl parameter to the settings of the aws-genai plugin, which would look like this:
```yaml
genai:
  agents:
    general: # This matches the URL in the frontend
      description: ...
      prompt: ...
      langgraph:
        openai:
          baseUrl: ${OPENAI_API_BASE_URL}
          apiKey: ${OPENAI_API_KEY}
          modelName: ${QUERY_MODEL_NAME}
```
🎤 Context
OpenAI's APIs have become a de facto standard for many AI frameworks, and many people today also use Ollama to simplify running models as a service. Ollama provides its own API but also exposes an OpenAI-compatible interface. It would therefore make sense to allow configuring a base URL for the endpoints other than the default OpenAI endpoint (https://api.openai.com/v1), as sketched below.
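To make this concrete, here is a minimal sketch of pointing LangChain's ChatOpenAI client at Ollama purely by overriding the base URL. All concrete values (port, model name, placeholder key) are illustrative Ollama conventions, not part of the plugin:

```typescript
import { ChatOpenAI } from "@langchain/openai";

// All values here are illustrative: Ollama's OpenAI-compatible API
// conventionally listens on http://localhost:11434/v1, and the model
// name is whatever has been pulled locally (e.g. `ollama pull llama3.1`).
const model = new ChatOpenAI({
  model: "llama3.1",
  apiKey: "ollama", // Ollama ignores the key, but the client requires one
  configuration: {
    baseURL: "http://localhost:11434/v1",
  },
});

// The rest of the agent code is unchanged: only the base URL differs.
const reply = await model.invoke("Hello from the genai plugin!");
console.log(reply.content);
```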
✌️ Possible Implementation
I will submit a pull request for it (already developed and tested in my local dev environment).
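For reference, the wiring could look roughly like the following. This is a sketch only: the config path matches the YAML example above, but the helper name and surrounding structure are assumptions about the plugin's internals, not its actual code:

```typescript
import { Config } from "@backstage/config";
import { ChatOpenAI } from "@langchain/openai";

// Hypothetical helper: builds the chat model for one agent from its
// genai.agents.<name>.langgraph.openai config block.
export function createChatModel(agentConfig: Config): ChatOpenAI {
  const openai = agentConfig.getConfig("langgraph.openai");
  return new ChatOpenAI({
    model: openai.getString("modelName"),
    apiKey: openai.getString("apiKey"),
    configuration: {
      // When baseUrl is omitted, the client falls back to the default
      // OpenAI endpoint (https://api.openai.com/v1), so existing
      // configurations keep working unchanged.
      baseURL: openai.getOptionalString("baseUrl"),
    },
  });
}
```

Making baseUrl optional keeps the change backward compatible: only users who want a non-default endpoint (such as Ollama) need to set it.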