Free opencode setup with the nvidia/minimax-m2.1 model
A while back, another forum member posted about using NVIDIA's nvidia/minimax-m2.1 model for free.
I've been trying out opencode lately anyway, so I gave it a shot and am posting my config file for reference: ~/.config/opencode/opencode.json
```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "augment-context-engine": {
      "type": "local",
      "command": ["auggie", "--mcp"],
      "enabled": true
    },
    "sequential": {
      "type": "local",
      "command": ["npx", "@modelcontextprotocol/server-sequential-thinking"],
      "enabled": true
    },
    "playwright": {
      "type": "local",
      "command": ["npx", "@playwright/mcp@latest"],
      "enabled": true
    },
    "context7": {
      "type": "local",
      "command": ["npx", "@upstash/context7-mcp@latest"],
      "enabled": true
    }
  },
  "provider": {
    "nvidia": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://integrate.api.nvidia.com/v1",
        "apiKey": "your API key"
      },
      "models": {
        "minimax-m2.1": {
          "id": "minimaxai/minimax-m2.1"
        }
      }
    }
  },
  "model": "nvidia/minimax-m2.1"
}
```

Once it's configured, restart opencode and you should see the model take effect.
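If opencode doesn't pick the model up, a common culprit is a JSON typo or a mismatch between the top-level `"model"` value and the provider section (opencode resolves it as `<provider>/<model-key>`). Here's a minimal sanity-check sketch with the provider portion of the config above inlined as a string; the placeholder `"your API key"` check is just an illustration:

```python
import json

# Provider/model portion of opencode.json, inlined for a quick check
config = json.loads("""
{
  "provider": {
    "nvidia": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://integrate.api.nvidia.com/v1",
        "apiKey": "your API key"
      },
      "models": {
        "minimax-m2.1": { "id": "minimaxai/minimax-m2.1" }
      }
    }
  },
  "model": "nvidia/minimax-m2.1"
}
""")

# The top-level "model" should resolve as "<provider>/<model-key>"
provider_name, _, model_key = config["model"].partition("/")
provider = config["provider"][provider_name]

print(model_key in provider["models"])      # the model key resolves
print(provider["models"][model_key]["id"])  # upstream model id sent to the API

# Remind yourself to swap in a real key before restarting opencode
if provider["options"]["apiKey"] == "your API key":
    print("warning: placeholder API key still in place")
```

Running this against your real file (e.g. with `json.load(open(...))` instead of the inlined string) catches malformed JSON before you restart opencode.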
Kilo can run it too, and the setup there is even simpler, so I won't post that config here.


