Add OpenAI Codex provider runtime and responses integration (without .agent/PLANS.md)

This commit is contained in:
George Pickett 2026-02-25 18:20:38 -08:00
parent e3cb957a10
commit 609b19b630
19 changed files with 1713 additions and 145 deletions


@@ -2,10 +2,14 @@
# Copy this file to .env and fill in your API keys
# =============================================================================
# LLM PROVIDER (OpenRouter)
# LLM PROVIDER
# =============================================================================
# OpenRouter provides access to many models through one API
# All LLM calls go through OpenRouter - no direct provider keys needed
# Provider selection override: auto | openrouter | nous | openai-codex
# If unset, Hermes auto-detects from auth/config.
# HERMES_INFERENCE_PROVIDER=auto
# OpenRouter key (required when using OpenRouter directly, and still used by
# some tools even when your primary chat provider is Nous/Codex/custom).
# Get your key at: https://openrouter.ai/keys
OPENROUTER_API_KEY=
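The `HERMES_INFERENCE_PROVIDER` comments above describe an override that falls back to auto-detection. A minimal sketch of that resolution logic, assuming the fallback order (Codex auth state first, then an OpenRouter key) — the function name and ordering are illustrative, not taken from the commit:

```python
import os

def resolve_provider(env=os.environ):
    """Pick an inference provider, honoring HERMES_INFERENCE_PROVIDER.

    Hypothetical "auto" order, inferred from the .env comments:
    explicit override > Codex auth.json > OPENROUTER_API_KEY.
    """
    choice = env.get("HERMES_INFERENCE_PROVIDER", "auto").lower()
    if choice in {"openrouter", "nous", "openai-codex"}:
        return choice
    # "auto": detect from available auth/config state.
    codex_home = os.path.expanduser(env.get("CODEX_HOME", "~/.codex"))
    if os.path.exists(os.path.join(codex_home, "auth.json")):
        return "openai-codex"
    if env.get("OPENROUTER_API_KEY"):
        return "openrouter"
    raise RuntimeError("no provider credentials found")
```

An explicit override always wins, so `HERMES_INFERENCE_PROVIDER=openrouter` would be respected even when Codex auth is present.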
@@ -13,6 +17,11 @@ OPENROUTER_API_KEY=
# Examples: anthropic/claude-opus-4.6, openai/gpt-4o, google/gemini-2.0-flash, zhipuai/glm-4-plus
LLM_MODEL=anthropic/claude-opus-4.6
# OpenAI Codex provider uses Codex CLI auth state:
# hermes login --provider openai-codex
# (reads CODEX_HOME/auth.json, default: ~/.codex/auth.json)
# CODEX_HOME=~/.codex
# =============================================================================
# TOOL API KEYS
# =============================================================================
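The Codex comments above say auth state is read from `CODEX_HOME/auth.json`, defaulting to `~/.codex/auth.json`. A sketch of that lookup, assuming `auth.json` is plain JSON; the helper names are hypothetical:

```python
import json
import os

def codex_auth_path(env=os.environ):
    # Default matches the comment above: ~/.codex/auth.json,
    # overridable via CODEX_HOME.
    home = env.get("CODEX_HOME", "~/.codex")
    return os.path.join(os.path.expanduser(home), "auth.json")

def load_codex_auth(env=os.environ):
    """Read Codex CLI auth state; None means not logged in."""
    try:
        with open(codex_auth_path(env)) as f:
            return json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        return None
```

Returning `None` rather than raising lets a caller fall back to another provider when `hermes login --provider openai-codex` has not been run.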