feat: add azure openai support #92

Open

wants to merge 27 commits into main from thegovind:gk/azure-openai

Conversation

@thegovind thegovind commented Apr 16, 2025

Summary

Adds optional Azure OpenAI support alongside the existing OpenAI flow. When AZURE_OPENAI_ENDPOINT (plus related env vars) is set, Codex will authenticate via Entra ID and call the Responses API against your Azure OpenAI deployments. Otherwise it falls back to the standard OPENAI_API_KEY path.

Closes #11 #174


What’s Changed

  • Model & Agent code
    • model-utils.ts
      • Detects AZURE_OPENAI_ENDPOINT; if present, uses @azure/identity’s DefaultAzureCredential and the SDK’s AzureOpenAI client to list deployments via the Responses API (see the sketch after this list).
      • Falls back to OPENAI_API_KEY flow or default model list on errors.
    • agent-loop.ts
      • Updated reasoning initialization to work correctly when using Azure models.
  • Configuration
    • Exported four new env vars in config.ts:
      • AZURE_OPENAI_ENDPOINT
      • AZURE_OPENAI_API_VERSION
      • AZURE_OPENAI_DEPLOYMENT (optional)
      • AZURE_OPENAI_API_KEY (fallback)
  • CLI error handling
    • cli.tsx now errors if neither OPENAI_API_KEY nor AZURE_OPENAI_ENDPOINT is configured.
    • Enhanced error messaging to guide users through az login and Azure env-var setup.
  • README updates
    • Added a “Using Azure OpenAI” section under Configuration with bash export examples.
    • Clarified that Entra ID auth (formerly Azure AD) is supported for Azure OpenAI.
  • Dependencies
    • Added @azure/identity and bumped openai to include AzureOpenAI support.
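
For illustration, the detection and fallback behavior described above could look roughly like the sketch below. This is an assumed shape, not the exact code in model-utils.ts; the option names (endpoint, apiVersion, apiKey, azureADTokenProvider) come from the openai and @azure/identity packages.

    // Sketch only: rough shape of the client selection; the PR's model-utils.ts may differ.
    import OpenAI, { AzureOpenAI } from "openai";
    import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";

    export function createClient(): OpenAI {
      const endpoint = process.env["AZURE_OPENAI_ENDPOINT"];
      if (endpoint) {
        const apiVersion = process.env["AZURE_OPENAI_API_VERSION"] ?? "2024-02-01"; // default is illustrative
        const apiKey = process.env["AZURE_OPENAI_API_KEY"];
        if (apiKey) {
          // Key-based fallback when AZURE_OPENAI_API_KEY is set.
          return new AzureOpenAI({ endpoint, apiKey, apiVersion });
        }
        // Entra ID (token-based) auth via DefaultAzureCredential, e.g. after `az login`.
        const azureADTokenProvider = getBearerTokenProvider(
          new DefaultAzureCredential(),
          "https://cognitiveservices.azure.com/.default",
        );
        return new AzureOpenAI({ endpoint, apiVersion, azureADTokenProvider });
      }
      // Standard OpenAI path.
      return new OpenAI({ apiKey: process.env["OPENAI_API_KEY"] });
    }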

How to Test

  1. Deploy your model

  2. Set up env vars

    az login
    export AZURE_OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com"
    export AZURE_OPENAI_API_VERSION="2024-02-01"
    export AZURE_OPENAI_DEPLOYMENT="your-deployment-name" # optional
    export AZURE_OPENAI_API_KEY="your-api-key" # if not present, it'll use Entra ID
  3. Get the Codex CLI by building from your PR branch
    Clone your specific PR branch and build the CLI:

    # Clone your fork of the repository
    git clone https://github.com/thegovind/codex.git
    
    # Navigate into the cloned repository
    cd codex
    
    # Checkout your specific PR branch
    git checkout gk/azure-openai
    
    # Navigate to the CLI package directory
    cd codex-cli
    
    # Enable corepack
    corepack enable
    
    # Install dependencies and build
    pnpm install
    pnpm build
    
    # Get the usage and the options (optional)
    node ./dist/cli.js --help
    
    # After building, you can either:
    # - Run the locally-built CLI directly using `node ./dist/cli.js <args>`
    # - Link the command globally for convenience using `pnpm link` (then use `codex <args>`)
  4. Invoke Codex
    Run the locally built CLI using node from the codex-cli directory:

    node ./dist/cli.js "explain this function"

    (Alternatively, if you ran pnpm link in the previous step, you can simply use codex "explain this function" from anywhere).


Tests

  • Updated and ran all tests successfully
  • Update default API version to 2025-04-01-preview once it’s available in Azure OpenAI
    (Note: the temporary conditional logic added for Azure OpenAI will be removed in another PR)

Checklist

  • Model listing and agent loop updated for Azure path
  • Config exports for Azure env vars
  • CLI guards/error messages for missing credentials
  • README examples for Azure OpenAI
  • Added @azure/identity, bumped openai dependency
  • Write and update tests for Azure code path

cc @theabhinavdas @nikodem-wrona for SDK review.

@thegovind thegovind marked this pull request as ready for review April 16, 2025 19:54
@alioftech

That was fast! Was looking for this. Thanks


github-actions bot commented Apr 16, 2025

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.

@thegovind
Author

I have read the CLA Document and I hereby sign the CLA

@junseinagao

@thegovind

Hi,

I tried running this PR, but Codex still throws the error “Missing OpenAI API key.” even though I’ve set the following environment variables: AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_VERSION, and AZURE_OPENAI_DEPLOYMENT.

It looks like we might also need to update cli.tsx.

@theabhinavdas
Contributor

@thegovind Adding tests will also help.

@thegovind
Author

@thegovind

Hi,

I tried running this PR, but Codex still throws the error “Missing OpenAI API key.” even though I’ve set the following environment variables: AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_VERSION, and AZURE_OPENAI_DEPLOYMENT.

It looks like we might also need to update cli.tsx.

@junseinagao Thanks for testing. I’ve only added support for Entra ID, so you’ll need to run az login locally before running this and ensure the relevant Azure environment variables are set. I’ve also updated the CLI instructions as you suggested. Thank you!

@thegovind
Author

@thegovind Adding tests will also help.

Agree, working on it.

@adhishthite

@thegovind

Thanks for this! Can we make it work with non-Entra ID Azure OpenAI as well, i.e. just AZURE_OPENAI_KEY and the like? That would be much more helpful!

Thanks!

@adhishthite

@thegovind

#174

Since you're working on this, can you also take a look at this issue above - so that a TON of people will benefit? Much appreciated.

@tibo-openai
Collaborator

Would it be possible to build this in a way that doesn't depend on "@azure/identity"?

@thegovind
Author

thegovind commented Apr 17, 2025

@adhishthite @tibo-openai @junseinagao @theabhinavdas @alioftech and others following this - Thanks for your patience! I’ll be pushing more updates soon, but we need to wait for our Azure OpenAI Inference team (@YunsongB) to update the underlying APIs before everything works end-to-end.

The latest API version from our side needs to reach parity with OpenAI (with 2025-04-01-preview):
Azure OpenAI Inference Specs

If you try to use the most recent spec (2025-03-01-preview), you'll likely hit this:

[screenshot of the error]

Specifically, ReasoningItem has been updated in OpenAI’s latest inference API. This change needs to propagate into Azure OpenAI with a new API version so the reasoning summary can be set properly in the Codex loop.

You could add a conditional for AOAI here, but that’s probably unnecessary since the update is expected very shortly.
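
For reference, a minimal sketch of what such a temporary AOAI conditional could look like, assuming the reasoning parameters are assembled in agent-loop.ts (the names below are illustrative, not from this PR):

    // Illustrative only: omit the reasoning summary on Azure until the newer API version lands.
    type Reasoning = { effort: "high"; summary?: "auto" };

    function buildReasoning(model: string): Reasoning | undefined {
      if (!model.startsWith("o")) {
        return undefined;
      }
      const isAzure = Boolean(process.env["AZURE_OPENAI_ENDPOINT"]);
      // Older Azure OpenAI API versions reject the `summary` field on ReasoningItem,
      // so it is left out on the Azure path for now.
      return isAzure ? { effort: "high" } : { effort: "high", summary: "auto" };
    }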

So in short: let’s not put the cart before the horse 😄 I’ll follow up with the commits as soon as the updated APIs are live - likely next week.

@pskoett

pskoett commented Apr 17, 2025

This is great, win-win.

@moosh3

moosh3 commented Apr 18, 2025

Nice!

@APiTJLillo

Amazing, keeping an eye on this!

@adamgoslin

@thegovind Any timeline for the update to the Azure OpenAI inference API? If the release is delayed, it might be wise to add the conditional for AOAI and get this merged. You can always come back and remove it.

@thegovind
Author

Thank you for testing with Azure OpenAI and providing your feedback. I’ve escalated this internally. We are currently experiencing capacity issues due to a spike in demand - both from new users and, more importantly, from migrating existing 4o token consumers to o4-mini or o3. As we increase capacity, you should see fewer timeouts and less general degradation.

@doggy8088

@thegovind Have you implemented a retry mechanism in this pull request?

@abazabaaa

Is there something stopping this from being merged? Not sure how this process works but I also have several people waiting on this.

@thegovind
Author

Is there something stopping this from being merged? Not sure how this process works but I also have several people waiting on this.

I’ve received all the necessary approvals (@theabhinavdas - could you also officially approve the PR as indicated here?), and several of you have tested and verified it. At this point, it’s really up to OpenAI, and specifically @tibo-openai.

@arynaq

arynaq commented Apr 25, 2025

Can we get this in? OpenAI org verification wants my passport; no thank you, I will run this via Azure.

@cevatkerim

/compact didn't work for me, so I had to make some changes. Can I contribute somehow? @thegovind

@thegovind
Author

/compact didn't work for me, so I had to make some changes. Can I contribute somehow? @thegovind

Yes, please go ahead - thank you. As for conflicts, I’ve resolved merge conflicts countless times now. I’ll do it one last time once I know that OpenAI and @tibo-openai intend to support Azure OpenAI. Frankly, it’s starting to look like they don’t, which goes against the spirit of open source.

@bsormagec

/compact didn't work for me, so I had to make some changes. Can I contribute somehow? @thegovind

Yes, please go ahead - thank you. As for conflicts, I’ve resolved merge conflicts countless times now. I’ll do it one last time once I know that OpenAI and @tibo-openai intend to support Azure OpenAI. Frankly, it’s starting to look like they don’t, which goes against the spirit of open source.

Create a new fork similar to open-codex if they're unresponsive; there's no need to handle it that way. :)

@heymaaz

heymaaz commented Apr 26, 2025

Would it be possible to build this in a way that doesn't depend on "@azure/identity"?

Yes, please go ahead - thank you. As for conflicts, I’ve resolved merge conflicts countless times now. I’ll do it one last time once I know that OpenAI and @tibo-openai intend to support Azure OpenAI. Frankly, it’s starting to look like they don’t, which goes against the spirit of open source.

I think that they don't want to add the azure/identity dependency @thegovind

@thegovind
Author

thegovind commented Apr 26, 2025

Would it be possible to build this in a way that doesn't depend on "@azure/identity"?

Yes, please go ahead - thank you. As for conflicts, I’ve resolved merge conflicts countless times now. I’ll do it one last time once I know that OpenAI and @tibo-openai intend to support Azure OpenAI. Frankly, it’s starting to look like they don’t, which goes against the spirit of open source.

I think that they don't want to add the azure/identity dependency @thegovind

@heymaaz @bsormagec @tibo-openai
This approach is not consistent with OpenAI’s official language SDKs, whether in Python or in the recommended Node implementation. Many of the companies I support prohibit static API keys for security reasons and instead require token-based authentication and authorization, which the official SDKs address through the azure-identity library. I do not intend to launch another fork such as “azure-codex” - the goal is a positive-sum outcome that supports both OpenAI and Azure OpenAI so GPT models remain relevant for SWE agents. I trust OpenAI will act in that same spirit of partnership.
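
For context, the token-based pattern the official openai Node SDK documents for Azure (and which this PR follows) looks roughly like this; the snippet is a sketch for readers, not code from this PR:

    // Entra ID auth for Azure OpenAI, as documented for the openai Node SDK.
    import { AzureOpenAI } from "openai";
    import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";

    const credential = new DefaultAzureCredential(); // picks up `az login`, managed identity, etc.
    const scope = "https://cognitiveservices.azure.com/.default";
    const azureADTokenProvider = getBearerTokenProvider(credential, scope);

    const client = new AzureOpenAI({
      azureADTokenProvider,
      endpoint: process.env["AZURE_OPENAI_ENDPOINT"],
      apiVersion: process.env["AZURE_OPENAI_API_VERSION"],
    });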

@thegovind thegovind mentioned this pull request Apr 26, 2025
@jslitzkerttcu

@tibo-openai can we get a response on this? many of us are waiting for this to be implemented so that we can use the official release and not create a separate fork.

@seanabreau

seanabreau commented Apr 26, 2025

@tibo-openai can we get a response on this? many of us are waiting for this to be implemented so that we can use the official release and not create a separate fork.

In the openai-python library, azure-identity is currently listed only as a dev dependency.

Could an alternative approach work here to avoid adding it as a mandatory dependency?

For instance:

  1. Modify the client initialization to accept an Azure AD token string or a token provider function directly.
  2. Document how users can leverage azure-identity (or other methods) in their own code to generate the necessary tokens to pass to the client.

This approach seems like it could provide the needed Azure AD authentication flexibility without imposing the dependency on all users of the library. Would you be open to a solution like this?
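
A rough sketch of option 1, assuming Codex exposed a hook for an injected token provider (createCodexClient and its options are hypothetical, not an existing Codex API; azureADTokenProvider matches the openai SDK option name):

    // Hypothetical: the caller supplies a token provider (e.g. built with @azure/identity
    // in their own code), so Codex itself carries no hard dependency on that package.
    import { AzureOpenAI } from "openai";

    type TokenProvider = () => Promise<string>;

    interface CodexAzureOptions {
      endpoint: string;
      apiVersion: string;
      azureADTokenProvider?: TokenProvider; // user-supplied Azure AD token source
      apiKey?: string;                      // or a plain API key
    }

    function createCodexClient(opts: CodexAzureOptions): AzureOpenAI {
      const { endpoint, apiVersion, azureADTokenProvider, apiKey } = opts;
      return azureADTokenProvider
        ? new AzureOpenAI({ endpoint, apiVersion, azureADTokenProvider })
        : new AzureOpenAI({ endpoint, apiVersion, apiKey });
    }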

@thegovind
Author

thegovind commented Apr 27, 2025

@tibo-openai can we get a response on this? many of us are waiting for this to be implemented so that we can use the official release and not create a separate fork.

In the openai-python library, azure-identity is currently listed only as a dev dependency.

Could an alternative approach work here to avoid adding it as a mandatory dependency?

For instance:

  1. Modify the client initialization to accept an Azure AD token string or a token provider function directly.
  2. Document how users can leverage azure-identity (or other methods) in their own code to generate the necessary tokens to pass to the client.

This approach seems like it could provide the needed Azure AD authentication flexibility without imposing the dependency on all users of the library. Would you be open to a solution like this?

@seanabreau - I’m happy to refactor to highlight azure-identity in an example (similar to the official SDK), but first I need clarity on one point: will OpenAI and @tibo-openai support Azure OpenAI as a target for Codex?

Code generation with Codex and GitHub Copilot was our first proof of product-market fit (PMF) for transformer models, well before ChatGPT captured the spotlight. Building on that collaborative ethos, now extended to open-source contributions, I trust OpenAI will guide Codex so it remains valuable to developers everywhere, including those on Azure OpenAI.

Frankly, with Claude and Gemini gaining ground, our relevance and usage are no longer where they once were with the dev audience. Ensuring Codex works on Azure OpenAI would be a decisive step toward regaining that momentum.

@pablospe

2025-04-01-preview inference API in our data plane (current ETA is 4/22)

One question: this seems like quite a delay. Any idea why? It looks like they are going directly to 2025-05-01: Azure/azure-rest-api-specs#34051?

@Nepomuceno

Do we know if OpenAI will give any response about this support? Is there a way to get it working today using the API key method, or does even that not work?

@APiTJLillo

You can open the fork that was made for this pull request and download/run it from there. I was able to do it successfully; I can't remember exactly how. I think Manus told me how.

@thegovind
Author

You can open the fork that was made for this pull request and download/run it from there. I was able to do it successfully; I can't remember exactly how. I think Manus told me how.

Yes, my fork will remain available for use with Azure OpenAI. Instructions are above. You'll need to set AZURE_OPENAI_ENDPOINT and authenticate using either az login (Entra ID, token-based) or Azure OpenAI API keys. Provision/retrieve these credentials via Azure AI Foundry.

@Nepomuceno

My question was more: until this is merged, are there some environment variables I can set to be able to use it with Azure OpenAI and an API key from main?

@pablospe

Do we know if OpenAI will give any response about this support? Is there a way to get it working today using the API key method, or does even that not work?

You might want to try using the litellm proxy and set the OPENAI_BASE_URL as a workaround, rather than waiting for this PR to be merged. Here’s an example:
#26 (comment)

While we’ll likely need to be patient for the PR to be merged (it currently has conflicts with the main branch), using the litellm proxy is straightforward and I would say generally recommended. It allows you to support a wide range of models without having to update every piece of software to use Azure OpenAI—just point your applications to the proxy and you’re good to go.

@chriswl

chriswl commented Apr 29, 2025

With the changes on this branch, the AOAI settings are not applied when asking to explain the current command (press x in chat when a command needs to be approved). This uses a vanilla OpenAI instance (see the linked code).

@CetinSert

Everyone! Let's get this rolling!

@ilovetocode

Below are the steps I took to get the fork from thegovind working (these may or may not all be required, but it is what worked for me):

  • build the fork locally and point PATH to ./codex-cli/bin/cli.js
  • unset OPENAI_API_KEY, otherwise the openai provider is used
  • create the Azure OpenAI resource manually, not through IaC, due to this
  • the regional endpoint didn't work; only the *.openai.azure.com endpoint worked
  • the deployment name seemingly must match the model name

and set the env vars:

export AZURE_OPENAI_ENDPOINT="https://<your-instance-name>.openai.azure.com"
export AZURE_OPENAI_DEPLOYMENT="o4-mini"
export AZURE_OPENAI_API_KEY="<your key>"
export AZURE_OPENAI_API_VERSION="2025-04-01-preview" # previous versions do not seem to support the Responses API, and 2025-05-01-preview returns a 404

Is there any real reason this PR hasn't been merged?

@fouad-openai fouad-openai changed the title from "(feat) add azure openai support" to "feat: add azure openai support" Apr 30, 2025
@fouad-openai
Collaborator

Hey @thegovind, thanks for this contribution! We landed a PR to add support for multiple providers about a week ago, and it would be great if you could use the latest as a starting point; here's providers.ts as a reference point.

@thegovind
Author

Hey @thegovind, thanks for this contribution! We landed a PR to add support for multiple providers about a week ago, and it would be great if you could use the latest as a starting point; here's providers.ts as a reference point.

Thank you, @fouad-openai! Working on it now.

@thegovind
Author

thegovind commented May 1, 2025

Hey @thegovind, thanks for this contribution! We landed a PR to add support for multiple providers about a week ago, and it would be great if you could use the latest as a starting point; here's providers.ts as a reference point.

Hi @fouad-openai @tibo-openai I'm starting a separate PR (please leave this one open for now) - #769 - as the provider approach is quite different from what Codex was released with.

Now, I have 7 tests in main that are failing. I'm not sure if others are seeing this as well (this is on Windows, but I tested in both bash and pwsh). It would be good to resolve this first.

I've tested my changes with Azure OpenAI locally, and they are working fine.

[screenshot]

@thegovind thegovind mentioned this pull request May 1, 2025
@assinchu

assinchu commented May 1, 2025

[screenshot]

It just hangs, any idea? How do I run codex in debug mode?

@thegovind
Author

[screenshot] It just hangs, any idea? How do I run codex in debug mode?

Have you tried the #769 branch? I'm still waiting on OpenAI to merge #769.
