
Adds Azure OpenAI support #769


Open · thegovind wants to merge 6 commits into main from feat/azure-openai

Conversation


thegovind commented May 1, 2025

Summary

This PR introduces support for Azure OpenAI as a provider within the Codex CLI. Users can now configure the tool to use their Azure OpenAI deployments by specifying "azure" as the provider in config.json and setting the corresponding AZURE_OPENAI_API_KEY and AZURE_OPENAI_API_VERSION environment variables. This functionality is added alongside the existing provider options (OpenAI, OpenRouter, etc.).

Related to #92

Note: This PR is currently in Draft status because tests on the main branch are failing. It will be marked as ready for review once the main branch is stable and tests are passing.


What’s Changed

  • Configuration (config.ts, providers.ts, README.md):
    • Added "azure" to the supported providers list in providers.ts, specifying its name, default base URL structure, and environment variable key (AZURE_OPENAI_API_KEY).
    • Defined the AZURE_OPENAI_API_VERSION environment variable in config.ts with a default value (2025-03-01-preview); a sketch of these two changes follows this list.
    • Updated README.md to:
      • Include "azure" in the list of providers.
      • Add a configuration section for Azure OpenAI, detailing the required environment variables (AZURE_OPENAI_API_KEY, AZURE_OPENAI_API_VERSION) with examples.
  • Client Instantiation (terminal-chat.tsx, singlepass-cli-app.tsx, agent-loop.ts, compact-summary.ts, model-utils.ts):
    • Modified various components and utility functions where the OpenAI client is initialized.
    • Added conditional logic to check if the configured provider is "azure".
    • If the provider is Azure, the AzureOpenAI client from the openai package is instantiated, using the configured baseURL, apiKey (from AZURE_OPENAI_API_KEY), and apiVersion (from AZURE_OPENAI_API_VERSION).
    • Otherwise, the standard OpenAI client is instantiated as before.
  • Dependencies:
    • Relies on the openai package's built-in AzureOpenAI client; no new external dependencies were added for this change.
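
For illustration, the provider entry and API-version default described above might look roughly like the following sketch. The exact names and file layout in providers.ts / config.ts are assumptions inferred from the description and the config.json example further down, not a copy of the PR's code:

    // providers.ts: sketch of the added entry (shape assumed from the description above)
    export const providers = {
      // ... existing providers (openai, openrouter, ...)
      azure: {
        name: "AzureOpenAI",
        baseURL: "https://YOUR_RESOURCE_NAME.openai.azure.com",
        envKey: "AZURE_OPENAI_API_KEY",
      },
    };

    // config.ts: API version read from the environment, falling back to the documented default
    export const AZURE_OPENAI_API_VERSION =
      process.env["AZURE_OPENAI_API_VERSION"] ?? "2025-03-01-preview";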

How to Test

This has been tested locally and confirmed working with Azure OpenAI.

  1. Configure config.json:
    Ensure your ~/.codex/config.json (or project-specific config) includes Azure and sets it as the active provider:
    {
      "providers": {
        // ... other providers
        "azure": {
          "name": "AzureOpenAI",
          "baseURL": "https://YOUR_RESOURCE_NAME.openai.azure.com", // Replace with your Azure endpoint
          "envKey": "AZURE_OPENAI_API_KEY"
        }
      },
      "provider": "azure", // Set Azure as the active provider
      "model": "o4-mini" // Use your Azure deployment name here
      // ... other config settings
    }
  2. Set up Environment Variables:
    # Set the API Key for your Azure OpenAI resource
    export AZURE_OPENAI_API_KEY="your-azure-api-key-here"
    
    # Set the API Version (Optional - defaults to `2025-03-01-preview` if not set)
    # Ensure this version is supported by your Azure deployment and endpoint
    export AZURE_OPENAI_API_VERSION="2025-03-01-preview"
  3. Get the Codex CLI by building from this PR branch:
    Clone your fork, check out this branch (feat/azure-openai), navigate to codex-cli, and build:
    # cd /path/to/your/fork/codex
    git checkout feat/azure-openai # Or your branch name
    cd codex-cli
    corepack enable
    pnpm install
    pnpm build
  4. Invoke Codex:
    Run the locally built CLI using node from the codex-cli directory:
    node ./dist/cli.js "Explain the purpose of this PR"
    (Alternatively, if you ran pnpm link after building, you can use codex "Explain the purpose of this PR" from anywhere).
  5. Verify: Confirm that the command executes successfully and interacts with your configured Azure OpenAI deployment.

Tests

  • Tested locally against an Azure OpenAI deployment using API Key authentication. Basic commands and interactions confirmed working.

Checklist

  • Added Azure provider details to configuration files (providers.ts, config.ts).
  • Implemented conditional AzureOpenAI client initialization based on provider setting.
  • Ensured apiVersion is passed correctly to the Azure client.
  • Updated README.md with Azure OpenAI setup instructions.
  • Manually tested core functionality against a live Azure OpenAI endpoint.
  • Add/update automated tests for the Azure code path (pending main stability).

cc @theabhinavdas @nikodem-wrona @fouad-openai @tibo-openai (adjust as needed)


I have read the CLA Document and I hereby sign the CLA

Adds support for Azure OpenAI by allowing users to specify 'azure' as the provider.

This change introduces the AzureOpenAI client and configures it with the appropriate API version.
The code now checks the provider and instantiates either the OpenAI or AzureOpenAI client accordingly.

thegovind mentioned this pull request on May 1, 2025

@doggy8088

@thegovind Are you planning to use this PR to replace #92?

@jslitzkerttcu

@thegovind Are you planning to use this PR to replace #92?

I believe so. Curious how this is coming along and if there’s anything the community can do to support your efforts here. Have been following this closely and am anxious to get a hold of this to compare performance with Claude Code.

@doggy8088

@jslitzkerttcu I ran some tests. Claude Code is miles ahead; Codex didn't even come close to catching its tail lights!

@tibo-openai
Collaborator

Nice, thank you. Now that it is getting a bit more complicated, could you, as part of this change, refactor the client instantiation so that we instantiate it the same way across all the different call sites?

Concretely, extract all the

    if (config.provider?.toLowerCase() === "azure") {
      oai = new AzureOpenAI({
        apiKey: getApiKey(config.provider),
        baseURL: getBaseUrl(config.provider),
        apiVersion: AZURE_OPENAI_API_VERSION,
      });
    } else {
      oai = new OpenAI({
        apiKey: getApiKey(config.provider),
        baseURL: getBaseUrl(config.provider),
      });
    }

to a common file / function and use that one everywhere.

tibo-openai marked this pull request as ready for review on May 2, 2025, 16:28
@thegovind
Author

Nice, thank you. Now that it is getting a bit more complicated, could you, as part of this change, refactor the client instantiation so that we instantiate it the same way across all the different call sites?

Concretely, extract all the

    if (config.provider?.toLowerCase() === "azure") {
      oai = new AzureOpenAI({
        apiKey: getApiKey(config.provider),
        baseURL: getBaseUrl(config.provider),
        apiVersion: AZURE_OPENAI_API_VERSION,
      });
    } else {
      oai = new OpenAI({
        apiKey: getApiKey(config.provider),
        baseURL: getBaseUrl(config.provider),
      });
    }

to a common file / function and use that one everywhere.

Sounds good. I'll work on this later tonight/tomorrow and update this PR.

thegovind added 2 commits May 3, 2025 19:13
Refactors the creation of OpenAI and AzureOpenAI clients into a single `createOpenAIClient` function.

This change promotes code reuse and simplifies configuration across different components.
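
A rough sketch of what such a shared factory might look like, assuming the getApiKey / getBaseUrl helpers and the AZURE_OPENAI_API_VERSION constant from the snippet quoted above; the module path and exact signature here are illustrative, not necessarily what the PR implements:

    // create-client.ts (hypothetical module): a single place to construct the client
    import OpenAI, { AzureOpenAI } from "openai";

    // Assumed import locations for the existing helpers and config type.
    import { getApiKey, getBaseUrl, AZURE_OPENAI_API_VERSION } from "./config";
    import type { AppConfig } from "./config";

    export function createOpenAIClient(config: AppConfig): OpenAI {
      if (config.provider?.toLowerCase() === "azure") {
        // AzureOpenAI comes from the same openai package and acts as a drop-in replacement.
        return new AzureOpenAI({
          apiKey: getApiKey(config.provider),
          baseURL: getBaseUrl(config.provider),
          apiVersion: AZURE_OPENAI_API_VERSION,
        });
      }
      return new OpenAI({
        apiKey: getApiKey(config.provider),
        baseURL: getBaseUrl(config.provider),
      });
    }

Call sites (terminal-chat.tsx, agent-loop.ts, compact-summary.ts, etc.) would then replace their inline if/else with a single createOpenAIClient(config) call.
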
@thegovind
Author

Nice, thank you. Now that it is getting a bit more complicated, could you, as part of this change, refactor the client instantiation so that we instantiate it the same way across all the different call sites?

Concretely, extract all the

    if (config.provider?.toLowerCase() === "azure") {
      oai = new AzureOpenAI({
        apiKey: getApiKey(config.provider),
        baseURL: getBaseUrl(config.provider),
        apiVersion: AZURE_OPENAI_API_VERSION,
      });
    } else {
      oai = new OpenAI({
        apiKey: getApiKey(config.provider),
        baseURL: getBaseUrl(config.provider),
      });
    }

to a common file / function and use that one everywhere.

Alright, @tibo-openai - updated this PR accordingly. Please confirm and merge.

cc @theabhinavdas @nikodem-wrona @fouad-openai

@AdithyanI

@thegovind Thank you for pushing this! Many of us truly appreciate it. Looking forward to this being in main.


evmin commented May 5, 2025

@parasgoyal12 Hope you will approve and merge this soon.
Thank you for reviewing.


evmin commented May 6, 2025

@tibo-openai - is this something you could possibly help with? Would greatly appreciate it.
