@0xkobold/pi-ollama
Ollama extension for pi-coding-agent. Unified local + cloud Ollama support with model management
Package details
Install @0xkobold/pi-ollama from npm and Pi will load the resources declared by the package manifest.
$ pi install npm:@0xkobold/pi-ollama

- Package: @0xkobold/pi-ollama
- Version: 0.4.1
- Published: Apr 8, 2026
- Downloads: 1,585/mo · 187/wk
- Author: moikapy
- License: MIT
- Type: extension
- Size: 57.4 KB
- Dependencies: 0 dependencies · 3 peers
Pi manifest JSON
{
"extensions": [
"./dist/index.js"
]
}
Security note
Pi packages can execute code and influence agent behavior. Review the source before installing third-party packages.
README
Pi Ollama Extension
Ollama integration for pi-coding-agent with accurate model details from /api/show.
Changelog
v0.4.1
- Fix: Cloud models now correctly use the `/v1` endpoint. Previously, `ollama-cloud` was registered with `baseUrl: "https://ollama.com"`, causing pi to hit `https://ollama.com/chat/completions` (the HTML homepage) instead of `https://ollama.com/v1/chat/completions`. This was already fixed for the local provider but was missed when the cloud provider was introduced.
- Fix: Trailing slashes in the `cloudUrl` config are now properly stripped before appending `/v1`.
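The trailing-slash fix described above can be sketched as follows. This is a hypothetical helper, not the extension's actual code; `normalizeBaseUrl` is an illustrative name:

```typescript
// Sketch of the cloudUrl normalization described in the changelog.
// Strips any trailing slashes before appending /v1, so both
// "https://ollama.com" and "https://ollama.com/" resolve to the same base.
function normalizeBaseUrl(cloudUrl: string): string {
  const trimmed = cloudUrl.replace(/\/+$/, "");
  return `${trimmed}/v1`;
}
```

With this in place, chat requests land on `<base>/v1/chat/completions` rather than the HTML homepage.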
Installation
# Via pi CLI
pi install npm:@0xkobold/pi-ollama
# Or in pi-config.ts
{
extensions: [
'npm:@0xkobold/pi-ollama'
]
}
# Or temporary (testing)
pi -e npm:@0xkobold/pi-ollama
Features
- 🦙 Local Ollama - Connect to localhost:11434
- ☁️ Ollama Cloud - Use ollama.com with an API key
- 📏 Accurate Details - Uses `/api/show` for the real context length
- 👁️ Vision Detection - Detects vision support from the capabilities array
- 🧠 Reasoning Models - Auto-detects thought-capable models
- 📊 Model Info - Query specific model parameters
Quick Start
# Check connection
/ollama-status
# List all models (with accurate context length)
/ollama-models
# Get detailed info for specific model
/ollama-info gemma3
/ollama-info llama3.1:70b
Commands
| Command | Description |
|---|---|
| `/ollama-status` | Check connection status |
| `/ollama-models` | List models with context length |
| `/ollama-info MODEL` | Show model details from `/api/show` |
How It Works
The extension uses Ollama's /api/show endpoint to get accurate model information:
curl http://localhost:11434/api/show -d '{
"model": "gemma3",
"verbose": true
}'
Response includes:
- `model_info.context_length` - Accurate context window
- `capabilities` - `["completion", "vision"]`
- `details.parameter_size` - "4.3B", "70B", etc.
- `details.family` - "gemma3", "llama", etc.
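A minimal sketch of reading these fields from the `/api/show` response. The field names follow the response shape above; the type and helper names are illustrative, not the extension's exported API:

```typescript
// Illustrative shape of the /api/show response fields used by the extension.
interface ShowResponse {
  model_info?: Record<string, unknown>;
  capabilities?: string[];
  details?: { parameter_size?: string; family?: string };
}

// context_length is keyed per model family (e.g. "gemma3.context_length"),
// so scan model_info for any key ending in "context_length".
function extractContextLength(
  info: Record<string, unknown> | undefined,
): number | undefined {
  if (!info) return undefined;
  for (const [key, value] of Object.entries(info)) {
    if (key.endsWith("context_length") && typeof value === "number") return value;
  }
  return undefined;
}
```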
Model Display
Models are displayed with accurate metadata:
📍 Local:
👁️ gemma3 (4.3B) (131,072 ctx)
🧠 codellama:70b (70B) (16,384 ctx)
llama3.1 (8B) (128,000 ctx)
Badges:
- ☁️ Cloud model
- 👁️ Vision-capable
- 🧠 Reasoning-capable
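A display line like the ones above could be produced by a formatter along these lines (a hypothetical sketch; `formatModelLine` is not part of the extension's API):

```typescript
// Hypothetical formatter producing lines like "gemma3 (4.3B) (131,072 ctx)".
// The badge prefix would be prepended separately based on capabilities.
function formatModelLine(name: string, paramSize: string, ctx: number): string {
  // toLocaleString inserts the thousands separators shown in the listing.
  return `${name} (${paramSize}) (${ctx.toLocaleString("en-US")} ctx)`;
}
```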
Configuration
Configuration is loaded with the following precedence (highest to lowest):
1. Environment variables (override everything)
2. `pi.settings` (runtime API, when available)
3. `.pi/settings.json` (project-local settings)
4. `~/.pi/agent/settings.json` (global user settings)
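This precedence amounts to a shallow merge where higher-precedence sources override lower ones. A sketch under stated assumptions (names are illustrative, and the `pi.settings` layer is omitted for brevity):

```typescript
// Illustrative config merge: env vars beat project settings beat global settings.
interface OllamaConfig {
  baseUrl?: string;
  cloudUrl?: string;
  apiKey?: string;
}

function resolveConfig(
  globalSettings: OllamaConfig,
  projectSettings: OllamaConfig,
  env: Record<string, string | undefined>,
): OllamaConfig {
  const fromEnv: OllamaConfig = {
    baseUrl: env.OLLAMA_HOST,
    cloudUrl: env.OLLAMA_HOST_CLOUD,
    apiKey: env.OLLAMA_API_KEY,
  };
  // Drop undefined entries so they don't clobber lower-precedence values.
  const defined = (c: OllamaConfig) =>
    Object.fromEntries(Object.entries(c).filter(([, v]) => v !== undefined));
  return { ...defined(globalSettings), ...defined(projectSettings), ...defined(fromEnv) };
}
```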
Settings File
Add to your global settings (~/.pi/agent/settings.json):
{
"ollama": {
"baseUrl": "http://localhost:11434",
"cloudUrl": "https://ollama.com",
"apiKey": "your-ollama-cloud-api-key"
}
}
Or create project-specific settings (.pi/settings.json in your project root):
{
"ollama": {
"baseUrl": "http://custom:11434",
"apiKey": "project-specific-key"
}
}
Note: Project settings override global settings.
Environment Variables
export OLLAMA_HOST="http://localhost:11434"
export OLLAMA_HOST_CLOUD="https://ollama.com"
export OLLAMA_API_KEY="your-api-key"
Local Development
git clone https://github.com/0xKobold/pi-ollama
cd pi-ollama
npm install
npm run build
pi install ./
API Functions
import { fetchModelDetails, getContextLength, hasVisionCapability } from '@0xkobold/pi-ollama';
// Get model details
const details = await fetchModelDetails('gemma3', 'http://localhost:11434');
// Extract context length
const ctx = getContextLength(details?.model_info); // 131072
// Check vision support
const hasVision = hasVisionCapability(details); // true
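Internally, a fetch against `/api/show` could look roughly like this. This is a hypothetical sketch of the request shape shown in the curl example earlier, not the actual `fetchModelDetails` implementation:

```typescript
// Sketch: POST the model name to /api/show and parse the JSON response.
// Assumes a fetch-capable runtime (Node 18+); names are illustrative.
interface ModelDetails {
  model_info?: Record<string, unknown>;
  capabilities?: string[];
}

async function fetchDetails(
  model: string,
  baseUrl: string,
): Promise<ModelDetails | undefined> {
  const res = await fetch(`${baseUrl}/api/show`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, verbose: true }),
  });
  if (!res.ok) return undefined; // model not found or server unreachable
  return (await res.json()) as ModelDetails;
}
```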
Supported Capabilities
The extension detects:
- Vision: from the `capabilities` array or `model_info` keys
- Reasoning: from the model name (coder, r1, deepseek, think, reason)
- Context Length: from `model_info.*.context_length`
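The detection heuristics above can be sketched as simple checks (function names are hypothetical, not the extension's exports):

```typescript
// Name fragments that mark a model as reasoning-capable, per the list above.
const REASONING_HINTS = ["coder", "r1", "deepseek", "think", "reason"];

// Vision: look for "vision" in the capabilities array from /api/show.
function detectsVision(details: { capabilities?: string[] }): boolean {
  return details.capabilities?.includes("vision") ?? false;
}

// Reasoning: substring match against the lowercased model name.
function detectsReasoning(modelName: string): boolean {
  const name = modelName.toLowerCase();
  return REASONING_HINTS.some((hint) => name.includes(hint));
}
```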
License
MIT © 0xKobold