fix: allow Anthropic API URLs as custom OpenAI-compatible endpoints
Removed the hard block on base_url containing 'api.anthropic.com'. Anthropic now offers an OpenAI-compatible /chat/completions endpoint, so blocking their URL prevents legitimate use. If the endpoint isn't compatible, the API call will fail with a proper error anyway.

Removed from: run_agent.py, mini_swe_runner.py. Updated test to verify Anthropic URLs are accepted.
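The guard being removed amounted to a case-insensitive substring match on the configured endpoint. A minimal standalone sketch of that check (the helper name is hypothetical; in the codebase the logic is inline in the constructor, as the diff below shows):

```python
def is_anthropic_url(base_url):
    """Case-insensitive substring check, mirroring the removed guard."""
    return bool(base_url) and "api.anthropic.com" in base_url.strip().lower()

# Previously, any base_url matching this check raised a ValueError at
# construction time. After this commit such URLs pass through, and a
# genuinely incompatible endpoint surfaces as an API error at call time.
print(is_anthropic_url("https://API.Anthropic.com/v1/"))  # True
print(is_anthropic_url("https://openrouter.ai/api/v1"))   # False
```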
This commit is contained in:
parent
76545ab365
commit
19b6f81ee7
3 changed files with 12 additions and 25 deletions
@ -253,13 +253,7 @@ class AIAgent:
             self.provider = "openai-codex"
         else:
             self.api_mode = "chat_completions"
-            if base_url and "api.anthropic.com" in base_url.strip().lower():
-                raise ValueError(
-                    "Anthropic's native /v1/messages API is not supported yet (planned for a future release). "
-                    "Hermes currently requires OpenAI-compatible /chat/completions endpoints. "
-                    "To use Claude models now, route through OpenRouter (OPENROUTER_API_KEY) "
-                    "or any OpenAI-compatible proxy that wraps the Anthropic API."
-                )

         self.tool_progress_callback = tool_progress_callback
         self.clarify_callback = clarify_callback
         self.step_callback = step_callback