fix(anthropic): complete third-party Anthropic-compatible provider support (#12846)

Third-party gateways that speak the native Anthropic protocol (MiniMax,
Zhipu GLM, Alibaba DashScope, Kimi, LiteLLM proxies) now work end-to-end
with the same feature set as direct api.anthropic.com callers.  Synthesizes
eight stale community PRs into one consolidated change.

Six fixes:

- URL detection: consolidate three inline `endswith("/anthropic")`
  checks in runtime_provider.py into the shared _detect_api_mode_for_url
  helper.  Third-party /anthropic endpoints now auto-resolve to
  api_mode=anthropic_messages via one code path instead of three.
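
  For illustration, the consolidated helper behaves roughly like this
  sketch (the helper name is real; the body and exact signature here are
  assumptions, not the code in runtime_provider.py):

  ```python
  from urllib.parse import urlparse

  def _detect_api_mode_for_url(base_url: str) -> str | None:
      """Map a base URL to an api_mode (illustrative sketch)."""
      # A path ending in /anthropic marks a native-protocol gateway
      # (MiniMax, Zhipu GLM, DashScope, Kimi, LiteLLM proxies).
      path = urlparse(base_url).path.rstrip("/")
      if path.endswith("/anthropic"):
          return "anthropic_messages"
      return None
  ```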

- OAuth leak-guard: all five sites that assign `_is_anthropic_oauth`
  (__init__, switch_model, _try_refresh_anthropic_client_credentials,
  _swap_credential, _try_activate_fallback) now gate on
  `provider == "anthropic"` so a stale ANTHROPIC_TOKEN never trips
  Claude-Code identity injection on third-party endpoints.  Previously
  only 2 of 5 sites were guarded.
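
  The guard condition at each of the five sites reduces to something like
  this sketch (function name and parameters are hypothetical; only the
  `provider == "anthropic"` gate is taken from the change):

  ```python
  def _compute_is_anthropic_oauth(provider: str, oauth_token: str | None) -> bool:
      # Only the first-party provider may trigger Claude-Code identity
      # injection; a stale ANTHROPIC_TOKEN must never trip it on a
      # third-party endpoint.
      return provider == "anthropic" and bool(oauth_token)
  ```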

- Prompt caching: new method `_anthropic_prompt_cache_policy()` returns
  `(should_cache, use_native_layout)` per endpoint.  Replaces three
  inline conditions and the `native_anthropic=(api_mode=='anthropic_messages')`
  call-site flag.  Native Anthropic and third-party Anthropic gateways
  both get the native cache_control layout; OpenRouter gets envelope
  layout.  Layout is persisted in `_primary_runtime` so fallback
  restoration preserves the per-endpoint choice.
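
  A minimal sketch of the policy (the method name is real, but it is shown
  here as a free function with assumed parameters; the real method reads
  instance state):

  ```python
  def _anthropic_prompt_cache_policy(api_mode: str, provider: str) -> tuple[bool, bool]:
      """Return (should_cache, use_native_layout) per endpoint."""
      if api_mode == "anthropic_messages":
          # Native Anthropic and third-party Anthropic gateways both
          # speak the native cache_control layout.
          return True, True
      if provider == "openrouter":
          # OpenRouter caches, but expects the envelope layout.
          return True, False
      return False, False
  ```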

- Auxiliary client: `_try_custom_endpoint` honors
  `api_mode=anthropic_messages` and builds `AnthropicAuxiliaryClient`
  instead of silently downgrading to an OpenAI-wire client.  Degrades
  gracefully to OpenAI-wire when the anthropic SDK isn't installed.
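
  The selection-and-degradation logic follows this shape (the client
  classes below are empty placeholders standing in for the real ones, and
  the builder name is hypothetical):

  ```python
  class AnthropicAuxiliaryClient:
      """Placeholder for the real Anthropic-wire auxiliary client."""

  class OpenAIWireAuxiliaryClient:
      """Placeholder for the real OpenAI-wire auxiliary client."""

  def _build_aux_client(api_mode: str):
      if api_mode == "anthropic_messages":
          try:
              import anthropic  # optional dependency  # noqa: F401
          except ImportError:
              # Degrade gracefully instead of failing the endpoint.
              return OpenAIWireAuxiliaryClient()
          return AnthropicAuxiliaryClient()
      return OpenAIWireAuxiliaryClient()
  ```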

- Config hygiene: `_update_config_for_provider` (hermes_cli/auth.py)
  clears stale `api_key`/`api_mode` when switching to a built-in
  provider, so a previous MiniMax custom endpoint's credentials can't
  leak into a later OpenRouter session.
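
  The clearing step amounts to this sketch (helper name is hypothetical;
  the real logic lives inside `_update_config_for_provider`):

  ```python
  def _clear_stale_custom_fields(model_cfg: dict) -> None:
      # Drop credentials/transport persisted by a previous custom
      # provider so they cannot leak into the new built-in provider.
      for key in ("api_key", "api_mode"):
          model_cfg.pop(key, None)
  ```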

- Truncation continuation: length-continuation and tool-call-truncation
  retry now cover `anthropic_messages` in addition to `chat_completions`
  and `bedrock_converse`.  Reuses the existing `_build_assistant_message`
  path via `normalize_anthropic_response()` so the interim message
  shape is byte-identical to the non-truncated path.
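
  The retry gate reduces to a set-membership check like this sketch (set
  and function names are assumptions; only the three api_mode values come
  from the change):

  ```python
  CONTINUATION_API_MODES = frozenset(
      {"chat_completions", "bedrock_converse", "anthropic_messages"}
  )

  def _supports_truncation_continuation(api_mode: str) -> bool:
      # For anthropic_messages the retry reuses _build_assistant_message
      # via normalize_anthropic_response(), keeping the interim message
      # shape identical to the non-truncated path.
      return api_mode in CONTINUATION_API_MODES
  ```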

Tests: 6 new files, 42 test cases.  Targeted run + tests/run_agent,
tests/agent, tests/hermes_cli all pass (4554 passed).

Synthesized from (credits preserved via Co-authored-by trailers):
  #7410  @nocoo           — URL detection helper
  #7393  @keyuyuan        — OAuth 5-site guard
  #7367  @n-WN            — OAuth guard (narrower cousin, kept comment)
  #8636  @sgaofen         — caching helper + native-vs-proxy layout split
  #10954 @Only-Code-A     — caching on anthropic_messages+Claude
  #7648  @zhongyueming1121 — aux client anthropic_messages branch
  #6096  @hansnow         — /model switch clears stale api_mode
  #9691  @TroyMitchell911 — anthropic_messages truncation continuation

Closes: #7366, #8294 (third-party Anthropic identity + caching).
Supersedes: #7410, #7367, #7393, #8636, #10954, #7648, #6096, #9691.
Rejects:    #9621 (OpenAI-wire caching with incomplete blocklist — risky),
            #7242 (superseded by #9691, stale branch),
            #8321 (targets smart_model_routing which was removed in #12732).

Co-authored-by: nocoo <nocoo@users.noreply.github.com>
Co-authored-by: Keyu Yuan <leoyuan0099@gmail.com>
Co-authored-by: Zoee <30841158+n-WN@users.noreply.github.com>
Co-authored-by: sgaofen <135070653+sgaofen@users.noreply.github.com>
Co-authored-by: Only-Code-A <bxzt2006@163.com>
Co-authored-by: zhongyueming <mygamez@163.com>
Co-authored-by: Xiaohan Li <hansnow@users.noreply.github.com>
Co-authored-by: Troy Mitchell <i@troy-y.org>
Author: Teknium
Date:   2026-04-19 22:43:09 -07:00 (committed by GitHub)
commit 65a31ee0d5 (parent 491cf25eef)
11 changed files with 911 additions and 58 deletions


@@ -0,0 +1,84 @@
"""Tests for hermes_cli.auth._update_config_for_provider clearing stale fields.

When the user switches from a custom provider (e.g. MiniMax with
``api_mode: anthropic_messages``, ``api_key: mxp-...``) to a built-in
provider (e.g. OpenRouter), the stale ``api_key`` and ``api_mode`` would
otherwise override the new provider's credentials and transport choice.
Built-in providers that legitimately need a specific ``api_mode`` (copilot,
xai) compute it at request-resolution time in
``_copilot_runtime_api_mode`` / ``_detect_api_mode_for_url``, so removing
the persisted value here is safe.
"""

from __future__ import annotations

import yaml

from hermes_cli.auth import _update_config_for_provider
from hermes_cli.config import get_config_path


def _read_model_cfg() -> dict:
    path = get_config_path()
    if not path.exists():
        return {}
    data = yaml.safe_load(path.read_text()) or {}
    model = data.get("model", {})
    return model if isinstance(model, dict) else {}


def _seed_custom_provider_config(api_mode: str = "anthropic_messages") -> None:
    """Write a config.yaml mimicking a user on a MiniMax-style custom provider."""
    path = get_config_path()
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(
        yaml.safe_dump(
            {
                "model": {
                    "provider": "custom",
                    "base_url": "https://api.minimax.io/anthropic",
                    "api_key": "mxp-stale-key",
                    "api_mode": api_mode,
                    "default": "claude-sonnet-4-6",
                }
            },
            sort_keys=False,
        )
    )


class TestUpdateConfigForProviderClearsStaleCustomFields:
    def test_switching_to_openrouter_clears_api_key_and_api_mode(self):
        _seed_custom_provider_config()
        _update_config_for_provider(
            "openrouter",
            "https://openrouter.ai/api/v1",
            default_model="anthropic/claude-sonnet-4.6",
        )
        model_cfg = _read_model_cfg()
        assert model_cfg.get("provider") == "openrouter"
        assert model_cfg.get("base_url") == "https://openrouter.ai/api/v1"
        assert "api_key" not in model_cfg, (
            "Stale custom api_key would leak into OpenRouter requests — must be cleared"
        )
        assert "api_mode" not in model_cfg, (
            "Stale api_mode=anthropic_messages from MiniMax would mis-route "
            "OpenRouter requests to the Anthropic SDK — must be cleared"
        )

    def test_switching_to_nous_clears_stale_api_mode(self):
        _seed_custom_provider_config()
        _update_config_for_provider("nous", "https://inference-api.nousresearch.com/v1")
        model_cfg = _read_model_cfg()
        assert model_cfg.get("provider") == "nous"
        assert "api_mode" not in model_cfg
        assert "api_key" not in model_cfg

    def test_switching_clears_codex_responses_api_mode(self):
        """Also covers codex_responses, not just anthropic_messages."""
        _seed_custom_provider_config(api_mode="codex_responses")
        _update_config_for_provider("openrouter", "https://openrouter.ai/api/v1")
        assert "api_mode" not in _read_model_cfg()