@ryan_nookpi/pi-extension-codex-large-context
Codex large context window extension for pi.
Package details
Install @ryan_nookpi/pi-extension-codex-large-context from npm and Pi will load the resources declared by the package manifest.
$ pi install npm:@ryan_nookpi/pi-extension-codex-large-context

- Package: @ryan_nookpi/pi-extension-codex-large-context
- Version: 0.1.1
- Published: Apr 25, 2026
- Downloads: 246/mo · 246/wk
- Author: ryan_nookpi
- License: MIT
- Types: extension
- Size: 5.8 KB
- Dependencies: 0 dependencies · 1 peer
Pi manifest JSON
{
"extensions": [
"./index.ts"
]
}

Security note
Pi packages can execute code and influence agent behavior. Review the source before installing third-party packages.
README
@ryan_nookpi/pi-extension-codex-large-context
This extension raises pi's local context-window metadata for newer OpenAI Codex models.
It is intended for `gpt-5.4` and `gpt-5.5` model IDs in cases where pi reports a smaller context window than the model can actually handle.
Install
pi install npm:@ryan_nookpi/pi-extension-codex-large-context
Usage
Large context mode is ON by default.
/codex-large-context on
/codex-large-context off
/codex-large-context status
Running /codex-large-context with no argument shows the current status.
What it does
- Watches `session_start` and `model_select` events while enabled.
- If the active model ID starts with `gpt-5.4` or `gpt-5.5`, sets its context window to `922000` tokens.
- Stores the on/off setting locally so it persists across sessions.
- Shows a small notification in the interactive UI when the context window is updated.
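The adjustment rule described above can be sketched in plain TypeScript. This is a minimal illustration of the guard logic only; the `ModelInfo` type and `maybeRaiseContextWindow` helper are hypothetical names, not pi's actual extension API:

```typescript
// Hypothetical shape for pi's model metadata; the real types may differ.
interface ModelInfo {
  id: string;
  contextWindow: number; // in tokens
}

const LARGE_CONTEXT = 922_000;
const MATCHING_PREFIXES = ["gpt-5.4", "gpt-5.5"];

// Returns the new context window to apply, or null if the model
// should be left unchanged.
function maybeRaiseContextWindow(model: ModelInfo): number | null {
  // Only Codex model IDs with a matching prefix are adjusted.
  if (!MATCHING_PREFIXES.some((p) => model.id.startsWith(p))) return null;
  // If pi already reports 922000 tokens or more, leave the model alone.
  if (model.contextWindow >= LARGE_CONTEXT) return null;
  return LARGE_CONTEXT;
}
```

The same check runs on both watched events, so switching models mid-session picks up the adjustment too.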
Notes
- This only changes pi's local model metadata for the active session/model selection.
- It does not change API limits on the provider side.
- If pi already reports a context window of `922000` tokens or more, it leaves the model unchanged.
- Turning the feature off prevents future context-window adjustments; it does not restore a model that was already adjusted earlier in the session.