- Model: zai/glm-5.1
- Provider: vercel-ai-gateway
- API: anthropic-messages
- Base URL: https://ai-gateway.vercel.sh
- Input: text
- Reasoning: Yes
- Context window: 202,800
- Max tokens: 64,000
- Cost / million input: $1.40
- Cost / million output: $4.40
- Cost / million cache read: $0.26
- Cost / million cache write: $0
Model config JSON
```json
{
  "providers": {
    "vercel-ai-gateway": {
      "apiKey": "YOUR_API_KEY",
      "models": [
        {
          "id": "zai/glm-5.1",
          "name": "GLM 5.1",
          "reasoning": true,
          "input": ["text"],
          "contextWindow": 202800,
          "maxTokens": 64000,
          "cost": {
            "input": 1.4,
            "output": 4.4,
            "cacheRead": 0.26,
            "cacheWrite": 0
          }
        }
      ],
      "api": "anthropic-messages",
      "baseUrl": "https://ai-gateway.vercel.sh"
    }
  }
}
```
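The `cost` rates in the config are USD per million tokens, so the price of a request is each token count times its rate, divided by 1,000,000. A minimal sketch of that arithmetic (the `estimate_cost` helper is hypothetical, not part of any tool's API; the rate keys mirror the `cost` object above):

```python
# Rates copied from the model config above (USD per million tokens).
GLM_COST = {"input": 1.4, "output": 4.4, "cacheRead": 0.26, "cacheWrite": 0}

def estimate_cost(cost, input_tokens, output_tokens,
                  cache_read_tokens=0, cache_write_tokens=0):
    """Estimate the USD cost of one request from per-million-token rates."""
    return (
        input_tokens * cost["input"]
        + output_tokens * cost["output"]
        + cache_read_tokens * cost["cacheRead"]
        + cache_write_tokens * cost["cacheWrite"]
    ) / 1_000_000

# Example: 10,000 input tokens + 2,000 output tokens
# = (10,000 * 1.4 + 2,000 * 4.4) / 1e6 = $0.0228
print(f"${estimate_cost(GLM_COST, 10_000, 2_000):.4f}")
```

Cache reads are billed at roughly a fifth of the input rate, so prompts that reuse a large cached prefix cost noticeably less than the headline input price suggests.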