Codex

Recommended: use cc-switch for setup: https://github.com/farion1231/cc-switch/releases

Note that cc-switch still edits Codex's official config file and environment variables under the hood; it automates the same steps described below.

Use cc-switch first. The manual config below is only the fallback path.

Prerequisites

The Codex CLI is distributed through npm, so install these first:

  • Node.js
  • npm (normally included with Node.js)

Use the official Node.js download page: https://nodejs.org/en/download

  • macOS / Windows: download and install the LTS release from that page
  • Linux: use the distro-specific method or binary package linked from the same page

On macOS, if you already use Homebrew, you can also run:

```bash
brew install node
```

This installs both Node.js and npm.

After installation, npm should be on your PATH automatically. Verify both tools:

```bash
node -v
npm -v
```

Once npm is available, install Codex.

Install

```bash
npm i -g @openai/codex
```
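
After the install completes, you can sanity-check that the binary landed on your PATH. This is a minimal sketch; the exact global bin location depends on your npm setup:

```bash
# Check that the globally installed codex binary is reachable
if command -v codex >/dev/null 2>&1; then
  echo "codex installed"
else
  echo "codex not found; check 'npm config get prefix' and your PATH"
fi
```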

VS Code Extension

As of March 31, 2026, OpenAI officially provides a Codex IDE extension for VS Code and states that it is compatible with most VS Code forks.

If you want to use Codex inside VS Code:

  1. Install the official Codex extension from the VS Code marketplace.
  2. Open VS Code settings and search for Codex to find extension-specific UI settings.
  3. The settings that actually control model choice, approvals, and sandbox behavior still live in the shared ~/.codex/config.toml.

For this gateway, then, the VS Code extension needs no separate custom config path. Reuse the same ~/.codex/config.toml and API key setup shown below.
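
If you prefer the command line, marketplace extensions can also be installed with VS Code's `code` CLI. The extension ID below is an assumption, not confirmed by this guide; check the exact ID on the extension's marketplace page before running it:

```bash
# Hypothetical extension ID -- verify it on the marketplace page first
if command -v code >/dev/null 2>&1; then
  code --install-extension openai.chatgpt
else
  echo "VS Code 'code' CLI not on PATH; install from the marketplace UI instead"
fi
```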

API Key

When Codex uses API key auth mode, it usually stores the local credential in ~/.codex/auth.json.

If you only want to inspect the local file, use this as reference:

```json
{
  "auth_mode": "apikey",
  "OPENAI_API_KEY": "codex_your_api_key"
}
```

OPENAI_API_KEY is your codex_... key. In most cases auth.json does not need manual editing; it is mainly useful to verify that Codex is using apikey mode locally.
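
To see what that verification looks like without touching your real ~/.codex/auth.json, here is a sketch run against a throwaway copy with a placeholder key:

```bash
# Write a sample auth.json with a placeholder key, then check its auth mode
tmp=$(mktemp -d)
cat > "$tmp/auth.json" <<'EOF'
{
  "auth_mode": "apikey",
  "OPENAI_API_KEY": "codex_your_api_key"
}
EOF
# The same grep against ~/.codex/auth.json verifies your real setup
if grep -q '"auth_mode": "apikey"' "$tmp/auth.json"; then
  echo "apikey mode active"
fi
rm -rf "$tmp"
```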

For a temporary shell-only test, you can also export:

```bash
export OPENAI_API_KEY="codex_your_api_key"
```

Manual Config Reference

If you do not use cc-switch, or only want to verify the local config file, check ~/.codex/config.toml:

```toml
model_provider = "custom"
model = "gpt-5.4"
disable_response_storage = true
model_reasoning_effort = "high"

[model_providers.custom]
name = "custom"
base_url = "https://api.gemiaude.com/v1"
requires_openai_auth = true
wire_api = "responses"

[notice.model_migrations]
"gpt-5.2-codex" = "gpt-5.4"
```

Use a root endpoint such as https://api.gemiaude.com/v1 in base_url, not the full /v1/responses path.

This setup uses Codex API key auth with a custom model provider and the responses wire API. The API key comes from ~/.codex/auth.json or the current shell's OPENAI_API_KEY.
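
The fields above can be sanity-checked mechanically. The sketch below runs against an inline sample; pointing the same greps at your real ~/.codex/config.toml performs the actual verification:

```bash
# Check the provider fields and that base_url stops at the /v1 root
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
model_provider = "custom"
base_url = "https://api.gemiaude.com/v1"
wire_api = "responses"
EOF
grep -qF 'wire_api = "responses"' "$tmp" && echo "wire_api ok"
case "$(grep '^base_url' "$tmp")" in
  *'/v1"') echo "base_url is the /v1 root (good)" ;;
  *)       echo "base_url should end at /v1, not /v1/responses" ;;
esac
rm -f "$tmp"
```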

Verify

```bash
codex
```

If Codex starts normally and resolves the model to gpt-5.4, the setup is working.

If you use the VS Code extension, successfully opening the Codex sidebar and starting a session also confirms that the shared config is being picked up.

OpenAI-compatible gateway integration docs