@samfp/pi-telegram-bot
Telegram bot exposing pi as a conversational coding agent. Chat with pi in Telegram with streaming responses, tool execution, threaded sessions, and model switching.
Package details
Install @samfp/pi-telegram-bot from npm and Pi will load the resources declared by the package manifest.
$ pi install npm:@samfp/pi-telegram-bot

- Package: @samfp/pi-telegram-bot
- Version: 0.1.0
- Published: Apr 19, 2026
- Downloads: 85/mo · 22/wk
- Author: samfp
- License: MIT
- Types: package
- Size: 98.8 KB
- Dependencies: 3 dependencies · 0 peers
Security note
Pi packages can execute code and influence agent behavior. Review the source before installing third-party packages.
README
pi-telegram-bot
Telegram bot exposing pi as a personal coding agent. Chat with pi in Telegram with streaming responses, tool execution, and model switching.
Setup
1. Create a Telegram bot
   - Message @BotFather on Telegram
   - Send /newbot and follow the prompts
   - Copy the bot token
2. Get your Telegram user ID
   - Message @userinfobot on Telegram
   - It will reply with your user ID (a number like 123456789)
3. Configure environment
cp .env.example .env
# Edit .env with your TELEGRAM_BOT_TOKEN and TELEGRAM_USER_ID
4. Install and run
npm install
npm start
# or
./start.sh
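Using the variables documented under "Environment Variables" below, a filled-in .env might look like this (the token and user ID are placeholders; every optional line can be dropped to fall back to its default):

```
TELEGRAM_BOT_TOKEN=123456789:AA...your-bot-token
TELEGRAM_USER_ID=123456789
PROVIDER=anthropic
MODEL=claude-sonnet-4-5
THINKING_LEVEL=off
```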
Environment Variables
| Variable | Required | Default | Description |
|---|---|---|---|
| `TELEGRAM_BOT_TOKEN` | Yes | — | Bot token from BotFather |
| `TELEGRAM_USER_ID` | Yes | — | Your Telegram user ID (security: bot only responds to you) |
| `PROVIDER` | No | `anthropic` | LLM provider |
| `MODEL` | No | `claude-sonnet-4-5` | Model ID |
| `THINKING_LEVEL` | No | `off` | Thinking level (off/minimal/low/medium/high/xhigh) |
| `MAX_SESSIONS` | No | `10` | Maximum concurrent sessions |
| `SESSION_IDLE_TIMEOUT` | No | `3600` | Seconds before idle sessions are reaped |
| `SESSION_DIR` | No | `~/.pi/agent/sessions` | Session storage directory |
| `STREAM_THROTTLE_MS` | No | `1000` | Minimum ms between message edits (Telegram rate limit) |
| `TELEGRAM_MSG_LIMIT` | No | `4000` | Max message length before splitting |
| `ACK_REACTION` | No | 🦞 | Emoji reaction on received messages (set empty to disable) |
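The table above implies a parse-with-fallback pattern for config.ts. A minimal sketch of how such parsing could look (hypothetical helper names, not the package's actual code):

```typescript
// Hypothetical sketch of environment parsing with defaults; the real
// config.ts in this package may differ.
function envStr(name: string, fallback: string): string {
  const v = process.env[name];
  return v === undefined || v === "" ? fallback : v;
}

function envInt(name: string, fallback: number): number {
  const v = process.env[name];
  const n = v === undefined ? NaN : Number.parseInt(v, 10);
  return Number.isNaN(n) ? fallback : n;
}

export const config = {
  botToken: envStr("TELEGRAM_BOT_TOKEN", ""), // required; empty means misconfigured
  userId: envInt("TELEGRAM_USER_ID", 0),      // required; 0 means misconfigured
  provider: envStr("PROVIDER", "anthropic"),
  model: envStr("MODEL", "claude-sonnet-4-5"),
  thinkingLevel: envStr("THINKING_LEVEL", "off"),
  maxSessions: envInt("MAX_SESSIONS", 10),
  sessionIdleTimeout: envInt("SESSION_IDLE_TIMEOUT", 3600),
  streamThrottleMs: envInt("STREAM_THROTTLE_MS", 1000),
  msgLimit: envInt("TELEGRAM_MSG_LIMIT", 4000),
};
```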
Commands
| Command | Description |
|---|---|
| `/help` | Show available commands |
| `/new` | Start a fresh session (clears history) |
| `/cancel` | Abort the current stream |
| `/status` | Show session info (model, tokens, cwd) |
| `/model <name>` | Switch model (no args = list available) |
| `/thinking <level>` | Set thinking level |
| `/sessions` | List all active sessions |
| `/cwd <path>` | Change working directory |
| `/reload` | Reload extensions and prompt templates |
| `/diff` | Show git diff of uncommitted changes |
| `/compact` | Compact conversation to free context |
| `/context` | Show context window usage |
Unknown /commands are forwarded to pi as extension commands.
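The handle-known, forward-unknown routing described above can be sketched as follows (hypothetical names; the package's real routing lives in telegram.ts and commands.ts):

```typescript
// Hypothetical sketch of slash-command routing: known commands are
// handled locally, anything else is forwarded to pi as an extension
// command.
type Handler = (args: string) => string;

const handlers: Record<string, Handler> = {
  help: () => "Available commands: /help /new /cancel /status ...",
  model: (args) => (args ? `Switching model to ${args}` : "Listing models"),
};

function route(text: string): string {
  // Match "/name", an optional "@botname" suffix, and the rest as args.
  const match = /^\/(\w+)(?:@\w+)?\s*(.*)$/s.exec(text);
  if (!match) return "not a command";
  const [, name, args] = match;
  const handler = handlers[name];
  return handler ? handler(args) : `forwarded to pi: /${name} ${args}`.trim();
}
```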
Features
- Streaming responses — Messages update in-place as tokens arrive, respecting Telegram's rate limit
- Ack reactions — Reacts to your message immediately so you know it was received
- Persistent sessions — Sessions survive bot restarts via on-disk registry
- Session management — Multiple concurrent sessions with idle timeout and auto-reaping
- File handling — Send photos/documents and they're saved to the session's working directory
- Tool activity — See what tools pi is using in real-time during streaming
- Context tracking — Warnings at 80% and 90% context usage
- Auto-diff — Automatic diff posting when pi modifies files
- Security — Only responds to your configured Telegram user ID
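Telegram caps message length, so replies longer than TELEGRAM_MSG_LIMIT must be split. A sketch of how such chunking might work (hypothetical helper; in this package, streaming-updater.ts handles chunking together with edit throttling):

```typescript
// Hypothetical sketch: split a long reply into chunks no longer than
// `limit`, preferring to break on a newline so paragraphs stay readable.
function splitMessage(text: string, limit = 4000): string[] {
  const chunks: string[] = [];
  let rest = text;
  while (rest.length > limit) {
    const slice = rest.slice(0, limit);
    const cut = slice.lastIndexOf("\n");
    const at = cut > 0 ? cut : limit; // no newline found: hard-break at limit
    chunks.push(rest.slice(0, at));
    rest = rest.slice(at).replace(/^\n/, ""); // drop the break's newline
  }
  chunks.push(rest);
  return chunks;
}
```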
Architecture
index.ts → Entry point, dotenv, signal handlers
telegram.ts → grammY bot setup, message/command routing
thread-session.ts → Wraps pi AgentSession with streaming
session-manager.ts → Session lifecycle, limits, idle reaping
streaming-updater.ts → Throttled message editing with chunking
commands.ts → Telegram slash command handlers
config.ts → Environment variable parsing
session-registry.ts → Persistent session state across restarts
formatter.ts → Markdown + tool call formatting
file-handling.ts → Download Telegram files, vision extraction
diff-reviewer.ts → Git diff generation and posting
Requirements
- Node.js >= 20
- A pi-coding-agent installation (@mariozechner/pi-coding-agent)
- Anthropic API key (or other LLM provider configured in pi)