api_type = azure support for OpenAI custom models #178
Comments
Would be good to support an `api_type = azure` option for custom OpenAI models. That could work today by setting the global `openai.api_type` setting, but a global applies to every model at once. Instead, having a way to pass these options through to the OpenAI call made here would be better:

llm/llm/default_plugins/openai_models.py, lines 181 to 183 in 7744cf9

Worth reviewing to see what other options might be useful.
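For context, the pre-1.0 `openai` Python package configured Azure access through module-level globals, roughly like this (a sketch from memory, not code from this issue; the endpoint, key and deployment name are placeholders):

```python
import openai

# Legacy (openai<1.0) Azure configuration via module-level globals.
# This is the approach the issue wants to avoid: the globals apply to
# every request in the process, not to one configured model.
openai.api_type = "azure"
openai.api_base = "https://example-resource.openai.azure.com/"
openai.api_version = "2023-03-15-preview"
openai.api_key = "..."  # placeholder

response = openai.ChatCompletion.create(
    engine="gpt4",  # the Azure deployment name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response["choices"][0]["message"]["content"])
```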
Got it working!

```bash
git diff | llm -m azure-gpt4 -s 'explain this change'
```

That's with this in `extra-openai-models.yaml`:

```yaml
- model_id: azure-gpt4
  model_name: gpt4
  api_base: https://special-magic-secret-thing.openai.azure.com/
  api_key_name: azure
  api_version: 2023-03-15-preview
  api_type: azure
  api_engine: gpt4
```

And the branch I'm about to push.
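With an entry like that registered, the same model can also be driven from Python through llm's own API. A minimal sketch, assuming the `azure-gpt4` entry above exists and a key named `azure` has been stored with `llm keys set azure`:

```python
import llm

# Look up the model registered by the azure-gpt4 config entry above.
model = llm.get_model("azure-gpt4")

# Keys are resolved from llm's stored keys / environment; if your version
# resolves them differently, you can assign model.key explicitly.
response = model.prompt("Summarise what api_type: azure changes")
print(response.text())
```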
Simon
Any plan to merge this? This seems like a useful tool, but some of us are limited to Azure OpenAI, so this would be needed. Thanks.
Adrian, I think Simon could close this issue now!
Was just reading the code and noticed the same. Tested it and it works fine. Many thanks.
Just to bring some extra closure to this, I wrote up some docs on how to use this feature: #337
I believe this has broken in a recent version of the openai library. The call that constructs the client no longer works with the Azure options.
Seeing the same error. Did you figure out a way around this?
No, though I didn't look very hard. For my use case, I ended up using aider-chat.
I came here trying to figure out the same thing. There is now a separate client class for Azure OpenAI, which has a different interface. Here is an example patch that gets this working again:

```diff
diff --git a/llm/default_plugins/openai_models.py b/llm/default_plugins/openai_models.py
index ddd5676..aceb724 100644
--- a/llm/default_plugins/openai_models.py
+++ b/llm/default_plugins/openai_models.py
@@ -522,6 +522,13 @@ class _Shared:
             kwargs["http_client"] = logging_client()
         if async_:
             return openai.AsyncOpenAI(**kwargs)
+        elif self.api_type == 'azure':
+            return openai.AzureOpenAI(
+                azure_endpoint=self.api_base,
+                azure_deployment=self.model_name,
+                api_version=self.api_version,
+                api_key=kwargs.get("api_key"),
+            )
         else:
             return openai.OpenAI(**kwargs)
```

Example config in `extra-openai-models.yaml`:

```yaml
- model_id: azure-gpt4o
  model_name: gpt-4o
  api_base: https://azure-api.example.com
  api_key_name: azure
  api_version: 2024-06-01
  api_type: azure
```
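For reference, the interface change the patch works around looks roughly like this in isolation (a sketch assuming openai>=1.0; the endpoint, version and keys below are placeholders, not values from this thread):

```python
import openai

# openai>=1.0 gives Azure endpoints their own client class instead of the
# old module-level api_type / api_base globals.
azure_client = openai.AzureOpenAI(
    azure_endpoint="https://azure-api.example.com",  # placeholder endpoint
    api_version="2024-06-01",
    api_key="...",  # placeholder key
)

# The regular client only targets OpenAI-style endpoints and does not accept
# api_type or api_version keyword arguments, which is roughly why funnelling
# the Azure settings through openai.OpenAI(**kwargs) stopped working.
openai_client = openai.OpenAI(api_key="...")
```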
I do have a working plugin, llm-azure, which is used by quite a few people. It might make sense to include llm-azure in the plugin directory; there are already two PRs regarding this.
Well, this is maddening. I see that the latest release still has this problem. As someone who primarily has access to Azure models, not having this working is a real limitation. [edit: removed redundant sentence]
OK, what do I have to do to get a test account so I can try out Azure myself? |
https://twitter.com/simonw/status/1693706702519140571
Example code is here: https://learn.microsoft.com/en-us/azure/ai-services/openai/quickstart?tabs=command-line&pivots=programming-language-python
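That quickstart uses the `AzureOpenAI` client along these lines (a sketch from memory, not a verbatim quote from the page; the endpoint, key variable and deployment name are placeholders):

```python
import os
from openai import AzureOpenAI

# Minimal Azure OpenAI chat call in the style of the linked quickstart.
client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com/",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)

response = client.chat.completions.create(
    model="gpt-4o",  # in Azure this is the *deployment* name, not the model family
    messages=[{"role": "user", "content": "Hello from the quickstart sketch"}],
)
print(response.choices[0].message.content)
```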