api_type = azure support for OpenAI custom models #178

Open
simonw opened this issue Aug 21, 2023 · 15 comments
Labels
enhancement New feature or request plugins

Comments

@simonw
Owner

simonw commented Aug 21, 2023

https://twitter.com/simonw/status/1693706702519140571

Example code here says: https://learn.microsoft.com/en-us/azure/ai-services/openai/quickstart?tabs=command-line&pivots=programming-language-python

import os
import openai

openai.api_key = os.getenv("AZURE_OPENAI_KEY")
openai.api_base = os.getenv("AZURE_OPENAI_ENDPOINT")  # your endpoint should look like https://YOUR_RESOURCE_NAME.openai.azure.com/
openai.api_type = 'azure'
openai.api_version = '2023-05-15'  # this may change in the future
@simonw simonw added enhancement New feature or request plugins labels Aug 21, 2023
@simonw
Owner Author

simonw commented Aug 21, 2023

It would be good to support an llm-openai-azure plugin.

That plugin could work today by setting the global openai.api_type = 'azure' variable, but I worry that would break other plugins that also use the openai library.

Instead, it would be better to have a way to pass api_type to Chat here:

def __init__(
    self, model_id, key=None, model_name=None, api_base=None, headers=None
):
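A hypothetical sketch of what that extended constructor could look like. The api_type, api_version, and api_engine parameter names here are assumptions based on the Azure quickstart, not the actual llm implementation:

```python
# Hypothetical sketch, not the real llm code: a Chat constructor that keeps
# Azure-specific options per-instance instead of mutating the global state
# of the openai module (which could break other plugins).
class Chat:
    def __init__(
        self,
        model_id,
        key=None,
        model_name=None,
        api_base=None,
        headers=None,
        api_type=None,      # e.g. "azure" (assumed parameter name)
        api_version=None,   # e.g. "2023-05-15" (assumed parameter name)
        api_engine=None,    # Azure deployment name (assumed parameter name)
    ):
        self.model_id = model_id
        self.key = key
        self.model_name = model_name
        self.api_base = api_base
        self.headers = headers
        self.api_type = api_type
        self.api_version = api_version
        self.api_engine = api_engine
```

Keeping these as instance attributes means each registered model carries its own API configuration, which avoids the global-variable conflict described above.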

@simonw
Owner Author

simonw commented Aug 21, 2023

Worth reviewing the Azure documentation to see what other options might be useful; api_version looks like one.

@simonw
Owner Author

simonw commented Aug 21, 2023

Got it working!

git diff | llm -m azure-gpt4 -s 'explain this change'

This change in the Python file 'openai_models.py' expands the 'Chat' Model object and its associated functions to include more attributes: 'api_type', 'api_version', and 'api_engine'.

Initially, the 'Chat' Model only had the attributes 'model_id', 'key', 'model_name', 'api_base', and 'headers'. The update introduces 'api_type', 'api_version', and 'api_engine' to the initialisation function (init) and also makes necessary changes in the register function to accommodate these added attributes.

Also, additional error checks are added in the Class 'Chat'. If the api_type, api_version, and api_engine attributes are not null, it will add them to the keyword arguments.

This change likely allows for more specificity or functionality when using the 'Chat' model, perhaps allowing users to specify the type, version, and engine of the API in use.

That's with this in extra-openai-models.yml:

- model_id: azure-gpt4
  model_name: gpt4
  api_base: https://special-magic-secret-thing.openai.azure.com/
  api_key_name: azure
  api_version: 2023-03-15-preview
  api_type: azure
  api_engine: gpt4

And the branch I'm about to push.
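The optional-attribute handling described in that explanation can be sketched as follows. build_kwargs is a hypothetical helper name for illustration, not the real function:

```python
# Hypothetical sketch of the behaviour described above: attributes that are
# not None get merged into the keyword arguments passed to the openai library.
def build_kwargs(base_kwargs, api_type=None, api_version=None, api_engine=None):
    kwargs = dict(base_kwargs)
    optional = {
        "api_type": api_type,
        "api_version": api_version,
        "engine": api_engine,  # the legacy openai library called the deployment "engine"
    }
    for name, value in optional.items():
        if value is not None:
            kwargs[name] = value
    return kwargs
```

This keeps the default (non-Azure) code path untouched: when none of the three options are set, the kwargs are exactly what they were before.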

@kevinddnj

Simon,
I've implemented this: I just replaced the openai_models.py file in the latest distribution and configured the YAML file from your example. It works great, and thank you for doing the implementation. It's a big deal for us, as we deal with health data and need to run on a protected, firewalled instance like the Azure deployments. Do you intend to merge this soon?

@dozsa

dozsa commented Nov 8, 2023

Any plan to merge this? This seems like a useful tool, but some of us are limited to using Azure OpenAI, so this support is needed. Thanks.

@kevinddnj

Adrian,
I think it has been merged; at any rate, I didn't need to monkey-patch the v0.11.1 and v0.12 releases this week, and I just inspected the current source for 0.12 and the support is there. As long as you ensure the right entries are in the .yaml configuration file, you should be OK.

I think Simon could close this issue now!
Kevin

@dozsa

dozsa commented Nov 8, 2023

I was just reading the code and noticed the same. Tested it and it works fine. Many thanks.

@bnookala

bnookala commented Nov 8, 2023

Just to bring some extra closure to this, I wrote up some docs on how to use this feature. #337

@jamwil-cbre

I believe this has broken in a recent version of the openai library. The call to get_client in default_plugins/openai_models.py results in TypeError: OpenAI.__init__() got an unexpected keyword argument 'api_type'

@dozsa

dozsa commented Feb 2, 2025

I believe this has broken in a recent version of the openai library. The call to get_client in default_plugins/openai_models.py results in TypeError: OpenAI.__init__() got an unexpected keyword argument 'api_type'

Seeing the same error. Did you figure out a way around this?

@jamwil-cbre

Seeing the same error. Did you figure out a way around this?

No, though I didn't look very hard. For my use case, I ended up using aider-chat.

@JWCook

JWCook commented Feb 28, 2025

I believe this has broken in a recent version of the openai library. The call to get_client in default_plugins/openai_models.py results in TypeError: OpenAI.__init__() got an unexpected keyword argument 'api_type'

I came here trying to figure out the same thing.

There is now a separate client class for Azure OpenAI, which has a different interface. Here is an example patch that gets this working again:

diff --git a/llm/default_plugins/openai_models.py b/llm/default_plugins/openai_models.py
index ddd5676..aceb724 100644
--- a/llm/default_plugins/openai_models.py
+++ b/llm/default_plugins/openai_models.py
@@ -522,6 +522,13 @@ class _Shared:
             kwargs["http_client"] = logging_client()
         if async_:
             return openai.AsyncOpenAI(**kwargs)
+        elif self.api_type == 'azure':
+            return openai.AzureOpenAI(
+                azure_endpoint=self.api_base,
+                azure_deployment=self.model_name,
+                api_version=self.api_version,
+                api_key=kwargs.get("api_key"),
+            )
         else:
             return openai.OpenAI(**kwargs)
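The branching in that patch can be illustrated as a pure function. This is a sketch for readability; the real code returns openai.AsyncOpenAI, openai.AzureOpenAI, or openai.OpenAI client instances rather than names:

```python
# Illustrative sketch of the client-selection logic in the patch above.
# Returns the name of the client class plus the kwargs it would receive,
# instead of constructing real openai client objects.
def select_client(async_, api_type=None, api_base=None, model_name=None,
                  api_version=None, api_key=None):
    if async_:
        # Async requests are handled first, before the Azure check.
        return "AsyncOpenAI", {"api_key": api_key}
    if api_type == "azure":
        # AzureOpenAI takes endpoint/deployment/version instead of a plain base URL.
        return "AzureOpenAI", {
            "azure_endpoint": api_base,
            "azure_deployment": model_name,
            "api_version": api_version,
            "api_key": api_key,
        }
    return "OpenAI", {"api_key": api_key}
```

Note that in the patch as written, async requests still go through AsyncOpenAI even when api_type is azure, so the Azure branch only covers the synchronous path.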

Example config in ~/.config/io.datasette.llm/extra-openai-models.yaml:

- model_id: azure-gpt4o
  model_name: gpt-4o
  api_base: https://azure-api.example.com
  api_key_name: azure
  api_version: 2024-06-01
  api_type: azure

@fabge

fabge commented Mar 19, 2025

I do have a working plugin: llm-azure, which is used by quite a few people.

It might make sense to include llm-azure in the plugin directory; there are already two PRs regarding this.

@rcarmo

rcarmo commented Jun 1, 2025

Well, this is maddening. I see that the latest uv tool update llm has bumped the version of openai to 1.82.1, but it has also apparently broken llm-azure, so now neither llm-azure nor patching extra-openai-models.yaml works.

As someone who primarily has access to Azure models, not having llm support this properly since the beginning is very frustrating: #751 has lain unacknowledged for months, and I've never had llm working properly without patching it (which admittedly isn't very hard, but I can only do it so many times).

[edit: removed redundant sentence]

@simonw
Owner Author

simonw commented Jun 1, 2025

OK, what do I have to do to get a test account so I can try out Azure myself?

8 participants