fix: respect model.default from config.yaml for openai-codex provider (#1896)

When config.yaml had a non-default model (e.g. gpt-5.3-codex) and the
provider was openai-codex, _normalize_model_for_provider() would replace
it with the latest available codex model because _model_is_default only
checked the CLI argument, not the config value.

Now _model_is_default is False when config.yaml has a model that differs
from the global fallback (anthropic/claude-opus-4.6), so the user's
explicit config choice is preserved.
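
For reference, the new condition (names as in cli.py; the gpt-5.3-codex value
is the example from the report) evaluates like this for the reported case:

    # config.yaml: model.default = "gpt-5.3-codex"; no --model flag given
    model, _config_model = None, "gpt-5.3-codex"
    _model_is_default = not model and (
        not _config_model or _config_model == "anthropic/claude-opus-4.6"
    )  # -> False, so _normalize_model_for_provider() leaves the model alone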

Fixes #1887

Co-authored-by: Test <test@test.com>

cli.py

@@ -1044,11 +1044,17 @@ class HermesCLI:
         # env vars would stomp each other.
         _model_config = CLI_CONFIG.get("model", {})
         _config_model = _model_config.get("default", "") if isinstance(_model_config, dict) else (_model_config or "")
-        self.model = model or _config_model or "anthropic/claude-opus-4.6"
+        _FALLBACK_MODEL = "anthropic/claude-opus-4.6"
+        self.model = model or _config_model or _FALLBACK_MODEL
         # Track whether model was explicitly chosen by the user or fell back
         # to the global default. Provider-specific normalisation may override
         # the default silently but should warn when overriding an explicit choice.
-        self._model_is_default = not model
+        # A config model that matches the global fallback is NOT considered an
+        # explicit choice — the user just never changed it. But a config model
+        # like "gpt-5.3-codex" IS explicit and must be preserved.
+        self._model_is_default = not model and (
+            not _config_model or _config_model == _FALLBACK_MODEL
+        )
         self._explicit_api_key = api_key
         self._explicit_base_url = base_url
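
A standalone sketch of the same predicate, useful for checking the intended
truth table (this helper is illustrative only, not part of cli.py):

    def _is_default(model, config_model, fallback="anthropic/claude-opus-4.6"):
        # Mirrors the new _model_is_default logic from HermesCLI.__init__.
        return not model and (not config_model or config_model == fallback)

    assert _is_default(None, "") is True                            # nothing set: fall back
    assert _is_default(None, "anthropic/claude-opus-4.6") is True   # config repeats fallback: still default
    assert _is_default(None, "gpt-5.3-codex") is False              # explicit config choice: preserved
    assert _is_default("gpt-5.3-codex", "") is False                # explicit CLI flag: preserved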