pi-edgee-proxy

Route Pi coding agent through Edgee AI Gateway for lossless token compression — any provider, any model

Package details

extension

Install pi-edgee-proxy from npm and Pi will load the resources declared by the package manifest.

$ pi install npm:pi-edgee-proxy
Package
pi-edgee-proxy
Version
2.1.3
Published
Apr 29, 2026
Downloads
not available
Author
ngsoftware
License
MIT
Types
extension
Size
19.7 KB
Dependencies
0 dependencies · 1 peer
Pi manifest JSON
{
  "extensions": [
    "./extensions"
  ]
}

Security note

Pi packages can execute code and influence agent behavior. Review the source before installing third-party packages.

README

pi-edgee-proxy

Route the Pi coding agent through Edgee AI Gateway for lossless token compression — any provider, any model.

Why

Edgee's token compression losslessly compresses tool_result messages — file reads, grep output, bash stdout — before they reach the LLM. For coding agents like Pi that read lots of files, this can reduce token costs by up to 30% without any quality loss.

This extension bridges Pi with Edgee, similar to edgee launch claude for Claude Code, but it works with all Edgee-supported providers: Anthropic, OpenAI, Google, Z.ai, DeepSeek, Mistral, xAI, and more.
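"Lossless" means the original tool_result text is exactly recoverable after compression. A minimal sketch of that round-trip property, using Python's zlib purely as a stand-in (Edgee's actual compression scheme is not documented here):

```python
import zlib

def compress_tool_result(text: str) -> bytes:
    """Losslessly compress a tool_result payload (zlib stand-in, not Edgee's scheme)."""
    return zlib.compress(text.encode("utf-8"))

def decompress_tool_result(blob: bytes) -> str:
    """Recover the exact original text."""
    return zlib.decompress(blob).decode("utf-8")

# Repetitive agent output (file reads, grep hits) compresses well.
tool_result = "src/main.rs:12: fn main()\n" * 200
blob = compress_tool_result(tool_result)

assert decompress_tool_result(blob) == tool_result  # lossless round-trip
print(f"original {len(tool_result)} chars, compressed {len(blob)} bytes")
```

The round-trip assertion is the whole point: the provider sees content equivalent to the original, so model quality is unaffected.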

Install

pi install npm:pi-edgee-proxy

Requirements

  • An Edgee account with an API key
  • Your API key stored in the OS keychain (recommended):
    /edgee-auth store sk-edgee-...
    
    or in the EDGEE_API_KEY environment variable (fallback)
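The resolution order above — keychain first, environment variable as fallback — can be sketched as follows; the keychain lookup is a stub, since the extension's real storage mechanism isn't shown here:

```python
import os
from typing import Optional

def keychain_lookup(service: str) -> Optional[str]:
    """Stub for the OS keychain read behind /edgee-auth (assumption, not the real call)."""
    return None  # pretend no key has been stored yet

def resolve_api_key() -> Optional[str]:
    # 1. Prefer the key stored via /edgee-auth store.
    key = keychain_lookup("pi-edgee-proxy")
    if key:
        return key
    # 2. Fall back to the EDGEE_API_KEY environment variable.
    return os.environ.get("EDGEE_API_KEY")

os.environ["EDGEE_API_KEY"] = "sk-edgee-example"  # illustration only
print(resolve_api_key())
```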

Usage

After installing, Edgee models appear in Pi's model picker (Ctrl+L) under the edgee provider:

edgee/anthropic/claude-sonnet-4
edgee/openai/gpt-5.2
edgee/zai/glm-5-turbo
edgee/google/gemini-2.5-flash
...

Or launch directly:

pi --provider edgee --model "zai/glm-5-turbo"

Configuring providers

By default, all Edgee providers are enabled. To filter which providers appear:

/edgee-setup

This lists the available providers. Pass a comma-separated list to enable only specific ones:

/edgee-setup zai,anthropic

This creates .pi/edgee.json:

{
  "providers": ["zai", "anthropic"]
}

Only models from those providers will appear in the model picker. Run /reload to apply.

You can also edit .pi/edgee.json manually (project-local) or ~/.pi/agent/edgee.json (global).
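The filtering step itself is straightforward: keep only catalog models whose provider prefix appears in the config. A sketch, assuming model IDs use the provider/model shape shown above (the helper name is illustrative, not the extension's actual code):

```python
import json

# A few entries in the style of Edgee's model catalog.
catalog = [
    "anthropic/claude-sonnet-4",
    "openai/gpt-5.2",
    "zai/glm-5-turbo",
    "google/gemini-2.5-flash",
]

# Contents of .pi/edgee.json after running /edgee-setup zai,anthropic.
config = json.loads('{"providers": ["zai", "anthropic"]}')

def filter_models(models: list[str], providers: list[str]) -> list[str]:
    """Keep models whose provider prefix is enabled in the config."""
    allowed = set(providers)
    return [m for m in models if m.split("/", 1)[0] in allowed]

print(filter_models(catalog, config["providers"]))
# ['anthropic/claude-sonnet-4', 'zai/glm-5-turbo']
```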

How it works

Pi → Edgee (lossless tool_result compression) → LLM Provider

Pi sends standard OpenAI Chat Completions requests to Edgee's gateway. Edgee compresses tool_result content losslessly, then forwards the request to the upstream provider. The model produces identical output; you pay for fewer tokens.

The extension fetches Edgee's model catalog dynamically at startup, so new models appear automatically.
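Because the gateway speaks the OpenAI Chat Completions protocol, a request through it looks like any other chat completion. A sketch that only assembles the request (the gateway URL is a placeholder, not a documented endpoint):

```python
import json

# Placeholder — the real Edgee gateway endpoint is not documented here.
GATEWAY_URL = "https://gateway.example.invalid/v1/chat/completions"

def build_request(api_key: str, model: str, messages: list[dict]) -> dict:
    """Assemble a standard OpenAI-style Chat Completions request."""
    return {
        "url": GATEWAY_URL,
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"model": model, "messages": messages}),
    }

req = build_request(
    api_key="sk-edgee-...",
    model="zai/glm-5-turbo",
    messages=[{"role": "user", "content": "Summarize src/main.rs"}],
)
print(json.loads(req["body"])["model"])
```

The compression happens entirely on Edgee's side, so nothing in the request itself changes; only the token count billed upstream does.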

Disable agentic (lossy) compression

Edgee has two compression modes. Make sure Agentic Token Compression is OFF in your Edgee Console to avoid lossy semantic rewriting of your conversation history. The lossless tool_result compression works automatically.

Environment variables

Variable       Required       Description
EDGEE_API_KEY  No (fallback)  Your Edgee API key (sk-edgee-...). Used if the OS keychain has no key.

Commands

Command                           Description
/edgee-auth store sk-edgee-...    Store API key in OS keychain
/edgee-auth remove                Remove API key from OS keychain
/edgee-auth status                Show where the key is stored
/edgee-setup                      Show available providers and configure filters
/edgee-setup provider1,provider2  Enable specific providers and save to config

License

MIT