docs: add FAQ section for common issues

Cover three frequently reported problems:
- usage.input_tokens error from misconfigured ANTHROPIC_BASE_URL
- Cannot find package 'bundle' from outdated Bun version
- How to use non-Anthropic models (OpenAI/DeepSeek/Ollama)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: 程序员阿江(Relakkes)
Date: 2026-04-03 01:27:27 +08:00
parent 6b72f3357c
commit 62bb0169bb
2 changed files with 72 additions and 0 deletions
@@ -252,6 +252,42 @@ src/
---
## FAQ
### Q: `undefined is not an object (evaluating 'usage.input_tokens')`
**Cause**: `ANTHROPIC_BASE_URL` is misconfigured. The API endpoint is returning HTML or another non-JSON format instead of a valid Anthropic protocol response.
This project uses the **Anthropic Messages API protocol**. `ANTHROPIC_BASE_URL` must point to an endpoint compatible with Anthropic's `/v1/messages` interface. The Anthropic SDK automatically appends `/v1/messages` to the base URL, so:
- MiniMax: `ANTHROPIC_BASE_URL=https://api.minimaxi.com/anthropic`
- OpenRouter: `ANTHROPIC_BASE_URL=https://openrouter.ai/api`
- OpenRouter (wrong): `ANTHROPIC_BASE_URL=https://openrouter.ai/anthropic` ❌ (returns HTML)
If your model provider only supports the OpenAI protocol, you need a proxy like LiteLLM for protocol translation. See the [Third-Party Models Guide](docs/third-party-models.en.md).
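The failure mode is easier to see from how the final request URL is composed. A minimal sketch of that composition (assuming, as described above, that the SDK simply appends `/v1/messages` to the base URL; `messagesUrl` is a hypothetical helper for illustration, not part of the SDK):

```typescript
// Hypothetical helper mirroring how the final endpoint is composed:
// the Anthropic SDK appends /v1/messages to ANTHROPIC_BASE_URL.
function messagesUrl(baseUrl: string): string {
  // Strip any trailing slashes, then append the Messages API path.
  return baseUrl.replace(/\/+$/, "") + "/v1/messages";
}

// Correct base URLs resolve to a Messages-compatible endpoint:
console.log(messagesUrl("https://api.minimaxi.com/anthropic"));
// https://api.minimaxi.com/anthropic/v1/messages
console.log(messagesUrl("https://openrouter.ai/api"));
// https://openrouter.ai/api/v1/messages

// The wrong OpenRouter base resolves to a path that serves HTML,
// which the SDK cannot parse — hence the usage.input_tokens error.
console.log(messagesUrl("https://openrouter.ai/anthropic"));
// https://openrouter.ai/anthropic/v1/messages
```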
### Q: `Cannot find package 'bundle'`
```
error: Cannot find package 'bundle' from '.../claude-code-haha/src/entrypoints/cli.tsx'
```
**Cause**: Your Bun version is too old and doesn't support the required `bun:bundle` built-in module.
**Fix**: Upgrade Bun to the latest version:
```bash
bun upgrade
```
### Q: How to use OpenAI / DeepSeek / Ollama or other non-Anthropic models?
This project only supports the Anthropic protocol. If your model provider doesn't natively support the Anthropic protocol, you need a proxy like [LiteLLM](https://github.com/BerriAI/litellm) for protocol translation (OpenAI → Anthropic).
See the [Third-Party Models Guide](docs/third-party-models.en.md) for detailed setup instructions.
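As a rough sketch of what that setup can look like (assumptions: LiteLLM is installed via pip, your LiteLLM version exposes an Anthropic-compatible `/v1/messages` route, and `ollama/llama3` is a model you have pulled locally; consult the LiteLLM docs for your version):

```shell
# Install and start a local LiteLLM proxy that translates
# Anthropic-protocol requests to your provider's protocol.
pip install 'litellm[proxy]'
litellm --model ollama/llama3 --port 4000

# Point this project at the proxy instead of the Anthropic API.
export ANTHROPIC_BASE_URL=http://localhost:4000
export ANTHROPIC_API_KEY=sk-placeholder  # placeholder; many local proxies don't validate the key
```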
---
## Disclaimer
This repository is based on the Claude Code source leaked from the Anthropic npm registry on 2026-03-31. All original source code copyrights belong to [Anthropic](https://www.anthropic.com). It is provided for learning and research purposes only.
@@ -252,6 +252,42 @@ src/
---
## FAQ
### Q: `undefined is not an object (evaluating 'usage.input_tokens')`
**Cause**: `ANTHROPIC_BASE_URL` is misconfigured. The API endpoint is returning an HTML page or another non-JSON format instead of a valid Anthropic-protocol response.
This project uses the **Anthropic Messages API protocol**. `ANTHROPIC_BASE_URL` must point to an endpoint compatible with Anthropic's `/v1/messages` interface. The Anthropic SDK automatically appends `/v1/messages` to the base URL, so:
- MiniMax: `ANTHROPIC_BASE_URL=https://api.minimaxi.com/anthropic`
- OpenRouter: `ANTHROPIC_BASE_URL=https://openrouter.ai/api`
- OpenRouter (wrong): `ANTHROPIC_BASE_URL=https://openrouter.ai/anthropic` ❌ (returns HTML)
If your model provider only supports the OpenAI protocol, you need a proxy such as LiteLLM for protocol translation. See the [Third-Party Models Guide](docs/third-party-models.md).
### Q: `Cannot find package 'bundle'`
```
error: Cannot find package 'bundle' from '.../claude-code-haha/src/entrypoints/cli.tsx'
```
**Cause**: Your Bun version is too old and doesn't support built-in modules the project requires, such as `bun:bundle`.
**Fix**: Upgrade Bun to the latest version:
```bash
bun upgrade
```
### Q: How to use OpenAI / DeepSeek / Ollama or other non-Anthropic models?
This project only supports the Anthropic protocol. If your model provider doesn't natively support it, you need a proxy such as [LiteLLM](https://github.com/BerriAI/litellm) for protocol translation (OpenAI → Anthropic).
See the [Third-Party Models Guide](docs/third-party-models.md) for detailed setup instructions.
---
## Disclaimer
This repository is based on the Claude Code source leaked from the Anthropic npm registry on 2026-03-31. All original source code copyrights belong to [Anthropic](https://www.anthropic.com). It is provided for learning and research purposes only.