CodexClient

The main entry point for the Codex API.

codex_open_client.CodexClient(*, headless: bool = False, no_browser: bool = False, token_path: str | Path = DEFAULT_TOKEN_PATH, login_handler: Callable[[str], str] | None = None, max_retries: int = 2, timeout: float = 120.0)

Python client for the Codex API.

Handles authentication, token refresh, and provides typed access to the Codex responses and models endpoints.

Usage::

import codex_open_client

client = codex_open_client.CodexClient()
response = client.responses.create(
    model="gpt-5.1-codex-mini",
    instructions="You are helpful.",
    input="Hello!",
)
print(response.output_text)

token: str property

The current access token.

account_id: str | None property

The ChatGPT account ID extracted from the current token.

login() -> None

Re-authenticate, replacing the current token.

Responses

codex_open_client._responses.Responses(client: CodexClient)

Access the Codex responses endpoint.

Usage::

response = client.responses.create(
    model="gpt-5.1-codex-mini",
    instructions="You are helpful.",
    input="Hello!",
)
print(response.output_text)

create(*, model: str, instructions: str, input: str | list[_InputItem | dict[str, Any]], stream: bool = False, tools: list[FunctionTool | dict[str, Any]] | None = None, tool_choice: Literal['auto', 'none', 'required'] | None = None, parallel_tool_calls: bool | None = None, reasoning: Reasoning | None = None, text: TextConfig | None = None, service_tier: Literal['auto', 'flex', 'priority'] | None = None, include: list[str] | None = None, previous_response_id: str | None = None, timeout: float | None = None) -> Response | ResponseStream

create(
    *,
    model: str,
    instructions: str,
    input: str | list[_InputItem | dict[str, Any]],
    stream: Literal[False] = False,
    tools: list[FunctionTool | dict[str, Any]] | None = None,
    tool_choice: Literal["auto", "none", "required"] | None = None,
    parallel_tool_calls: bool | None = None,
    reasoning: Reasoning | None = None,
    text: TextConfig | None = None,
    service_tier: Literal["auto", "flex", "priority"] | None = None,
    include: list[str] | None = None,
    previous_response_id: str | None = None,
    timeout: float | None = None,
) -> Response
create(
    *,
    model: str,
    instructions: str,
    input: str | list[_InputItem | dict[str, Any]],
    stream: Literal[True],
    tools: list[FunctionTool | dict[str, Any]] | None = None,
    tool_choice: Literal["auto", "none", "required"] | None = None,
    parallel_tool_calls: bool | None = None,
    reasoning: Reasoning | None = None,
    text: TextConfig | None = None,
    service_tier: Literal["auto", "flex", "priority"] | None = None,
    include: list[str] | None = None,
    previous_response_id: str | None = None,
    timeout: float | None = None,
) -> ResponseStream
create(
    *,
    model: str,
    instructions: str,
    input: str | list[_InputItem | dict[str, Any]],
    stream: bool,
    tools: list[FunctionTool | dict[str, Any]] | None = None,
    tool_choice: Literal["auto", "none", "required"] | None = None,
    parallel_tool_calls: bool | None = None,
    reasoning: Reasoning | None = None,
    text: TextConfig | None = None,
    service_tier: Literal["auto", "flex", "priority"] | None = None,
    include: list[str] | None = None,
    previous_response_id: str | None = None,
    timeout: float | None = None,
) -> Response | ResponseStream

Create a response from the Codex API.

The request always streams from the API internally (stream: true in the request body). When stream=False (the default), the stream is collected and the final Response is returned. When stream=True, a ResponseStream is returned that yields events as they arrive.
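The collect-then-return behavior for stream=False can be pictured with a small stdlib-only sketch that accumulates output-text deltas from Responses-API-style SSE events. The event names used here ("response.output_text.delta", "response.completed") are assumptions about the wire format for illustration, not a description of the client's internals::

```python
import json

def collect_final_text(sse_lines):
    """Accumulate SSE events into the final output text (illustrative only).

    Assumes Responses-API-style event names such as
    'response.output_text.delta'; the real client's internals may differ.
    """
    text_parts = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip comments, blank keep-alives, etc.
        event = json.loads(line[len("data: "):])
        if event.get("type") == "response.output_text.delta":
            text_parts.append(event["delta"])
    return "".join(text_parts)

# A hand-written SSE transcript standing in for a real stream:
sample = [
    'data: {"type": "response.output_text.delta", "delta": "Hel"}',
    'data: {"type": "response.output_text.delta", "delta": "lo!"}',
    'data: {"type": "response.completed"}',
]
print(collect_final_text(sample))  # -> Hello!
```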

ResponseStream

codex_open_client._stream.ResponseStream(httpx_response: _HasText)

Iterable of ResponseStreamEvent from an SSE response.

Iterate over it to process events one by one, or call get_final_response() to consume the stream and return the final Response.

__iter__() -> Iterator[ResponseStreamEvent]

get_final_response() -> Response

Consume the entire stream and return the final Response.

close() -> None

Close the underlying HTTP response.

__enter__() -> ResponseStream

__exit__(*args: object) -> None

Models

codex_open_client._models.Models(client: CodexClient)

Access the Codex models endpoint.

Usage::

models = client.models.list()
for m in models:
    print(m.slug, m.context_window)

list(*, force_refresh: bool = False) -> list[Model]

List models available to the authenticated user.

Results are cached for 5 minutes. Pass force_refresh=True to bypass.
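The described caching behavior (a 5-minute TTL with a force_refresh escape hatch) can be sketched with a minimal stdlib-only cache; this is an illustration of the contract, not the library's actual implementation::

```python
import time

class TTLCache:
    """Minimal TTL cache mirroring the list() caching contract (a sketch)."""

    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self._value = None
        self._fetched_at = None

    def get(self, fetch, *, force_refresh: bool = False):
        now = time.monotonic()
        stale = self._fetched_at is None or now - self._fetched_at >= self.ttl
        if force_refresh or stale:
            self._value = fetch()   # hit the endpoint
            self._fetched_at = now
        return self._value          # otherwise serve the cached value

fetch_count = []
cache = TTLCache()
cache.get(lambda: fetch_count.append(1))                      # fetches
cache.get(lambda: fetch_count.append(1))                      # cached, no fetch
cache.get(lambda: fetch_count.append(1), force_refresh=True)  # fetches again
print(len(fetch_count))  # -> 2
```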

Authentication Functions

login

codex_open_client.login(*, headless: bool = False, no_browser: bool = False, token_path: Path = DEFAULT_TOKEN_PATH) -> TokenData

Run the full OAuth PKCE login flow.

Parameters:

Name Type Description Default
headless bool

If True, print the auth URL and prompt the user to paste the callback URL back. No local server is started.

False
no_browser bool

If True, print the auth URL instead of opening a browser, but still start the local callback server to catch the redirect.

False
token_path Path

Where to store the resulting tokens.

DEFAULT_TOKEN_PATH

Returns:

Type Description
TokenData

The obtained token data.
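For context on what the PKCE part of the flow involves, here is how a code_verifier / S256 code_challenge pair is derived per RFC 7636. This is general OAuth mechanics, not this library's internal parameter naming::

```python
import base64
import hashlib
import secrets

def make_pkce_pair():
    """Generate a PKCE code_verifier and S256 code_challenge (RFC 7636)."""
    # 32 random bytes, base64url-encoded without padding -> 43 chars
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # challenge = BASE64URL(SHA256(verifier)), also without padding
    digest = hashlib.sha256(verifier.encode()).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
```

The challenge is sent in the authorization URL; the verifier is held back and sent only in the token exchange, so an intercepted redirect alone cannot be redeemed.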

get_token

codex_open_client.get_token(*, headless: bool = False, no_browser: bool = False, token_path: Path = DEFAULT_TOKEN_PATH) -> str

Get a valid access token, handling cache, refresh, and login automatically.

Parameters:

Name Type Description Default
headless bool

If True and login is needed, use headless mode.

False
no_browser bool

If True and login is needed, print URL instead of opening browser.

False
token_path Path

Path to the token storage file.

DEFAULT_TOKEN_PATH

Returns:

Type Description
str

A valid access token string.
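The cache/refresh decision behind get_token() hinges on whether the stored token is still valid. One common way to check that is to read the JWT's exp claim without verifying the signature; the sketch below illustrates that kind of check and is not the library's actual logic::

```python
import base64
import json
import time

def jwt_expires_soon(token: str, *, leeway: float = 60.0) -> bool:
    """Read a JWT's exp claim (no signature check) and compare to now."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    payload = json.loads(base64.urlsafe_b64decode(payload_b64))
    return payload["exp"] <= time.time() + leeway

def fake_jwt(claims: dict) -> str:
    """Build an unsigned JWT-shaped string for demonstration."""
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).rstrip(b"=").decode()
    return f"header.{body}.signature"

print(jwt_expires_soon(fake_jwt({"exp": time.time() + 3600})))  # -> False
```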

refresh

codex_open_client.refresh(refresh_token: str, token_path: Path = DEFAULT_TOKEN_PATH) -> TokenData

Use a refresh token to obtain a new access token.

start_login

codex_open_client.start_login() -> PendingLogin

Begin a two-step login flow.

Returns a PendingLogin with a .url attribute containing the OAuth URL to present to the user. Pass the result to finish_login() along with the callback URL after the user authenticates.

Example::

auth = codex_open_client.start_login()
# ... show auth.url to the user, let them authenticate ...
tokens = codex_open_client.finish_login(auth, callback_url="http://localhost:1455/...")
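If you are driving this flow from your own UI, the callback URL your user hands back is an ordinary OAuth redirect whose query string carries the code (and usually a state) parameter. A stdlib sketch of pulling those out, with an illustrative callback path::

```python
from urllib.parse import urlparse, parse_qs

def parse_callback(callback_url: str):
    """Extract the code and state query parameters from an OAuth redirect URL."""
    query = parse_qs(urlparse(callback_url).query)
    return query["code"][0], query.get("state", [None])[0]

# The path below is illustrative; use whatever redirect URL the flow produced.
code, state = parse_callback("http://localhost:1455/auth/callback?code=abc123&state=xyz")
print(code, state)  # -> abc123 xyz
```

finish_login() accepts the full URL, so you normally do not need to parse it yourself; this only shows what the URL must contain.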

finish_login

codex_open_client.finish_login(pending: PendingLogin, *, callback_url: str, token_path: Path = DEFAULT_TOKEN_PATH) -> TokenData

Complete a two-step login flow.

Parameters:

Name Type Description Default
pending PendingLogin

The PendingLogin returned by start_login().

required
callback_url str

The full redirect URL (including the code parameter) captured after the user has authenticated in their browser.

required
token_path Path

Where to store the resulting tokens.

DEFAULT_TOKEN_PATH

Returns:

Type Description
TokenData

The obtained token data.

PendingLogin

codex_open_client.PendingLogin(url: str, _verifier: str, _state: str) dataclass

Opaque state for a two-step login flow.

Returned by start_login(), passed to finish_login().

Helper Functions

build_headers

codex_open_client.build_headers(token: str) -> dict[str, str]

Build the headers required for Codex API requests.

get_account_id

codex_open_client.get_account_id(token: str) -> str | None

Extract the ChatGPT account ID from a Codex OAuth JWT.