
opencode needs no introduction from me: it's a very handy multi-provider TUI coding agent with native LSP and ASP support, well suited to fellow forum users who juggle a pile of relay providers.

But when pairing it with codex, the following problems come up frequently:

  1. Items not found
all 10 attempts failed: HTTP 404: 
{
    "error": {
        "code": null,
        "message": "{\n \"error\": {\n \"message\": \"Item with id 'rs_jgs9diuh934hu9shag09phq9raf0' not found. Items are not persisted when `store` is set to false. Try again with `store` set to true, or remove this item from your input.\",\n \"type\": \"invalid_request_error\",\n \"param\": \"input\",\n \"code\": null\n  }\n}(traceid: 824gj32q80-94gj-08)",
        "param": null,
        "type": "invalid_request_error"
    }
}

The core issue is that items field. codex uses the stateful Responses API throughout, so by default opencode does not send the full context to the API; instead it sends only an item_id, a more elegant approach. But the forum's (L 站) relays obviously can't support this, so we need to add one key parameter:

"store": false 

With this set, the full context is sent to the relay on every request.
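Placement-wise, the parameter goes inside each model's `options` block in opencode.jsonc. A minimal fragment (the provider and model keys here are illustrative placeholders):

```json
{
  "provider": {
    "example": {
      "models": {
        "gpt-5-2-high": {
          "options": {
            "store": false
          }
        }
      }
    }
  }
}
```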

  2. Cache misses
    To save precious tokens, the major relays now support prompt caching too, but it also needs to be enabled in the config file:
 "setCacheKey": true 

Putting it all together, here is a fuller opencode.jsonc config for everyone; customize the mcp section however you like.

{
  "$schema": "https://opencode.ai/config.json",
  "model": "foxcode/google/gemma-3n-e4b",
  "provider": {
    "example": {
      "npm": "@ai-sdk/openai",
      "name": "example",
      "options": {
        "baseURL": "https://new.example.com/codex/v1",
        "setCacheKey": true
      },
      "models": {
        "gpt-5-2-high": {
          "id": "gpt-5.2",
          "name": "GPT 5.2 High",
          "options": {
            "reasoningEffort": "high",
            "textVerbosity": "low",
            "reasoningSummary": "auto",
            "store": false
          }
        },
        "gpt-5-2-medium": {
          "id": "gpt-5.2",
          "name": "GPT 5.2 Medium",
          "options": {
            "reasoningEffort": "medium",
            "textVerbosity": "low",
            "reasoningSummary": "auto",
            "store": false
          }
        },
        "gpt-5-2-xhigh": {
          "id": "gpt-5.2",
          "name": "GPT 5.2 XHigh",
          "options": {
            "reasoningEffort": "xhigh",
            "textVerbosity": "low",
            "reasoningSummary": "auto",
            "store": false
          }
        }
      }
    }
  },
  "mcp": {
    "auggie-mcp": {
      "type": "local",
      "command": [
        "auggie",
        "--mcp"
      ],
      "enabled": true
    },
    "chrome-devtools-mcp": {
      "type": "local",
      "command": [
        "npx",
        "-y",
        "chrome-devtools-mcp@latest"
      ],
      "enabled": false
    },
    "deepwiki": {
      "type": "remote",
      "url": "https://mcp.deepwiki.com/mcp",
      "enabled": true
    },
    "sequential-thinking": {
      "type": "local",
      "command": [
        "npx",
        "-y",
        "@modelcontextprotocol/server-sequential-thinking"
      ],
      "enabled": true
    }
  }
}
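Since opencode.jsonc permits `//` comments, plain `json.loads` will reject it. If you want to sanity-check the file programmatically before launching opencode, here is a minimal Python sketch; the `load_jsonc` helper is my own (not part of opencode) and handles only full-line comments, not inline comments or `//` inside string values:

```python
import json
import re

def load_jsonc(text: str) -> dict:
    # Strip full-line // comments, then parse as ordinary JSON.
    # Naive by design: inline comments and // inside strings are not handled.
    stripped = re.sub(r"^\s*//.*$", "", text, flags=re.M)
    return json.loads(stripped)

sample = """{
  // per-model options (illustrative fragment)
  "store": false,
  "setCacheKey": true
}"""

cfg = load_jsonc(sample)
print(cfg["store"], cfg["setCacheKey"])  # -> False True
```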

📌 Repost info
Reposted:
2026/1/12 10:37:54