pi-opencode-provider
A pi provider extension that adds OpenCode Zen & OpenCode Go support.
Package details
Install pi-opencode-provider from npm and Pi will load the resources declared by the package manifest.
$ pi install npm:pi-opencode-provider

- Package: pi-opencode-provider
- Version: 0.7.0
- Published: Apr 28, 2026
- Downloads: not available
- Author: mdsitton
- License: MIT
- Types: extension
- Size: 24.2 KB
- Dependencies: 0 dependencies · 2 peers
Pi manifest JSON
{
"extensions": [
"./src/index.ts"
]
}
Security note
Pi packages can execute code and influence agent behavior. Review the source before installing third-party packages.
README
pi-opencode-provider
Warning: I built this entire extension before realizing pi already has built-in OpenCode support. I am apparently blind. This extension is not strictly required.
So why does this exist?
The built-in OpenCode models are statically generated at pi build time from models.dev. When OpenCode adds a new model, you have to wait for a pi release to see it.
This extension does runtime model discovery instead:
- Fetches OpenCode's official /models endpoints directly at startup
- Merges metadata from models.dev (context windows, pricing, reasoning support)
- Registers the freshest model list with pi
New models show up without waiting for a pi release. Even if models.dev hasn't been updated yet, the extension fetches directly from OpenCode's API — new models are available immediately with best-effort default parameters (128k context, 16k max output).
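The merge-with-defaults step described above can be sketched roughly as follows. The type shapes and field names (`ZenModel`, `contextWindow`, `maxOutputTokens`) are assumptions for illustration, not the extension's actual types; only the default values (128k context, 16k max output) come from the README.

```typescript
// Hypothetical shapes -- the real /models response and models.dev
// metadata fields are assumptions, not the extension's actual schema.
interface ZenModel {
  id: string;
  name?: string;
}

interface ModelsDevMeta {
  contextWindow?: number;
  maxOutputTokens?: number;
  supportsReasoning?: boolean;
}

interface ResolvedModel {
  id: string;
  name: string;
  contextWindow: number;
  maxOutputTokens: number;
  supportsReasoning: boolean;
}

// Merge a live /models entry with models.dev metadata, falling back to
// the conservative defaults mentioned above (128k context, 16k output).
function resolveModel(model: ZenModel, meta?: ModelsDevMeta): ResolvedModel {
  return {
    id: model.id,
    name: model.name ?? model.id,
    contextWindow: meta?.contextWindow ?? 128_000,
    maxOutputTokens: meta?.maxOutputTokens ?? 16_000,
    supportsReasoning: meta?.supportsReasoning ?? false,
  };
}
```

A brand-new model with no models.dev entry still resolves, just with the best-effort defaults.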
Providers
This extension registers two providers that replace the built-in ones:
- opencode — replaces the built-in opencode (OpenCode Zen)
- opencode-go — replaces the built-in opencode-go (OpenCode Go)
Installation
pi install pi-opencode-provider
Configure pi
Run /login, choose Use a subscription, select OpenCode Zen or OpenCode Go, and paste your API key when prompted. Then run /model to pick a model.
Migrating from the built-in providers
If you previously used OpenCode with pi's built-in support (via OPENCODE_API_KEY env var or auth.json), you still need to run /login at least once. The extension registers an OAuth-based provider that rewrites per-model base URLs for Anthropic models — this only takes effect once your API key is stored through the /login flow.
Provider behavior
OpenCode Zen
Zen models are mapped automatically to the correct backend API:
- OpenAI Chat Completions
- OpenAI Responses
- Anthropic Messages
- Google Generative AI
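As an illustration only, such a mapping might key off the model id. The prefixes below are guesses; the real extension presumably routes on metadata from /models or models.dev rather than string matching:

```typescript
type BackendApi =
  | "openai-chat"
  | "openai-responses"
  | "anthropic-messages"
  | "google-generative-ai";

// Illustrative heuristic only -- infer the backend API from the model id
// prefix. Not the extension's actual mapping logic.
function guessBackend(modelId: string): BackendApi {
  if (modelId.startsWith("claude-")) return "anthropic-messages";
  if (modelId.startsWith("gemini-")) return "google-generative-ai";
  // Everything else falls back to OpenAI-style chat completions.
  return "openai-chat";
}
```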
OpenCode Go
Go models are exposed through the OpenAI-compatible chat completions API.
Model discovery
On startup, the extension:
- Fetches the official model list from OpenCode's /models endpoint
- Merges in metadata from models.dev
- Registers the resolved models with pi, replacing the built-in ones
If the OpenCode model endpoint is unavailable, the extension falls back to models.dev. If metadata is still unavailable, conservative defaults (128k context, 16k max tokens) are used.
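The fallback order could be sketched like this; the fetcher signatures and `ModelInfo` shape are invented for the example:

```typescript
interface ModelInfo {
  id: string;
  contextWindow: number;
  maxOutputTokens: number;
}

type Fetcher = () => Promise<ModelInfo[]>;

// Try the OpenCode /models endpoint first; if it is unreachable, fall
// back to models.dev. (Per-field defaults would apply after this step.)
async function discoverModels(
  fetchOpenCode: Fetcher,
  fetchModelsDev: Fetcher,
): Promise<ModelInfo[]> {
  try {
    return await fetchOpenCode();
  } catch {
    return await fetchModelsDev();
  }
}
```

Injecting the fetchers keeps the fallback chain testable without network access.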
Development
npm install
npm run typecheck
License
MIT. See LICENSE.