@ryan_nookpi/pi-extension-codex-fast-mode
Codex fast mode extension for pi.
Package details
Install @ryan_nookpi/pi-extension-codex-fast-mode from npm and Pi will load the resources declared by the package manifest.
$ pi install npm:@ryan_nookpi/pi-extension-codex-fast-mode

- Package: @ryan_nookpi/pi-extension-codex-fast-mode
- Version: 0.1.3
- Published: Apr 24, 2026
- Downloads: 524/mo · 181/wk
- Author: ryan_nookpi
- License: MIT
- Type: extension
- Size: 5.8 KB
- Dependencies: 0 dependencies · 2 peers
Pi manifest JSON
{
  "extensions": [
    "./index.ts"
  ]
}

Security note
Pi packages can execute code and influence agent behavior. Review the source before installing third-party packages.
README
@ryan_nookpi/pi-extension-codex-fast-mode
This extension helps pi use OpenAI Codex in a faster, lower-verbosity mode.
It is mainly intended for openai-codex with gpt-5.4 or gpt-5.5, where you want quick execution and shorter responses.
Install
pi install npm:@ryan_nookpi/pi-extension-codex-fast-mode
Great for
- prioritizing speed over long explanations
- keeping Codex responses concise
- toggling a faster Codex setup per session
Usage
/codex-fast on
/codex-fast off
/codex-fast status
Notes
- Target models: openai-codex / gpt-5.4 and openai-codex / gpt-5.5.
- It always applies text.verbosity=low.
- When fast mode is enabled, it also injects service_tier=priority.
- The setting is stored locally and persists across sessions.
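The parameter behavior described in the notes can be sketched as a small function. This is illustrative only: `buildCodexParams` is a hypothetical helper, not part of the extension's actual source, and it simply models the override rules stated above.

```typescript
// Illustrative sketch of the overrides described in the Notes
// (hypothetical helper, not the extension's real API).
function buildCodexParams(fastMode: boolean): Record<string, string> {
  // text.verbosity=low is always applied for the target Codex models.
  const params: Record<string, string> = { "text.verbosity": "low" };
  // service_tier=priority is injected only while fast mode is on.
  if (fastMode) {
    params["service_tier"] = "priority";
  }
  return params;
}

console.log(buildCodexParams(true));  // fast mode on
console.log(buildCodexParams(false)); // fast mode off
```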