Codex CLI / Codex SDK
Same shape as the Claude SDK path, but for the OpenAI wire format. EntorinOpenAITransport intercepts both /v1/chat/completions (the older Chat Completions API) and /v1/responses (the newer Responses API Codex uses by default). The MCP and sandbox helpers are SDK-agnostic; they're re-exported under adapters.codex:
import httpx

from adapters.codex import (
    EntorinOpenAITransport, WrappedMCPSession, run_sandbox,
)
from adapters.codex.transport import CAPABILITY_KIND  # "model.openai.key"

transport = EntorinOpenAITransport(
    ctx=ctx, bus=bus, ledger=ledger, gate=gate,
)
codex_http_client = httpx.AsyncClient(
    transport=transport, base_url="https://api.openai.com",
)
# Hand `codex_http_client` to the SDK / CLI in place of its default client.
Usage parsing tolerates both API shapes (usage.input_tokens / usage.output_tokens for Responses; usage.prompt_tokens / usage.completion_tokens for Chat Completions), so you don't have to pick one at integration time.
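A shape-tolerant parser can be sketched as follows; `extract_usage` is a hypothetical helper for illustration, not a function the adapter exports:

```python
# Hypothetical sketch of shape-tolerant usage parsing: probe for the
# Responses API field names first, then fall back to Chat Completions.
def extract_usage(payload: dict) -> tuple[int, int]:
    """Return (input_tokens, output_tokens) from either API shape."""
    usage = payload.get("usage") or {}
    if "input_tokens" in usage:      # Responses API shape
        return usage["input_tokens"], usage["output_tokens"]
    if "prompt_tokens" in usage:     # Chat Completions shape
        return usage["prompt_tokens"], usage["completion_tokens"]
    return 0, 0                      # no usage block present
```

Keying on which field names are present, rather than on the endpoint, is what lets the same metering path serve both APIs.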