| Field | Value |
| --- | --- |
| Model | `zai/glm-4.7` |
| Provider | `vercel-ai-gateway` |
| API | `anthropic-messages` |
| Base URL | https://ai-gateway.vercel.sh |
| Input | text |
| Reasoning | Yes |
| Context window | 131,000 |
| Max tokens | 40,000 |
| Cost / million input | $2.25 |
| Cost / million output | $2.75 |
| Cost / million cache read | $2.25 |
| Cost / million cache write | $0 |
Model config JSON
```json
{
  "providers": {
    "vercel-ai-gateway": {
      "apiKey": "YOUR_API_KEY",
      "models": [
        {
          "id": "zai/glm-4.7",
          "name": "GLM 4.7",
          "reasoning": true,
          "input": [
            "text"
          ],
          "contextWindow": 131000,
          "maxTokens": 40000,
          "cost": {
            "input": 2.25,
            "output": 2.75,
            "cacheRead": 2.25,
            "cacheWrite": 0
          }
        }
      ],
      "api": "anthropic-messages",
      "baseUrl": "https://ai-gateway.vercel.sh"
    }
  }
}
```
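As a rough sketch of how the per-million-token prices in the config combine into a request cost, here is a small Python helper. The `estimate_cost` function and the trimmed-down config it parses are illustrative assumptions, not part of any official SDK.

```python
import json

# Trimmed copy of the config above, keeping only what cost estimation needs.
CONFIG = json.loads("""
{
  "providers": {
    "vercel-ai-gateway": {
      "models": [
        {
          "id": "zai/glm-4.7",
          "cost": {"input": 2.25, "output": 2.75, "cacheRead": 2.25, "cacheWrite": 0}
        }
      ]
    }
  }
}
""")


def estimate_cost(model_id, input_tokens, output_tokens, cache_read_tokens=0):
    """Estimate request cost in USD from per-million-token prices.

    This helper is hypothetical; it simply looks up the model's `cost`
    block and scales each price by the corresponding token count.
    """
    models = CONFIG["providers"]["vercel-ai-gateway"]["models"]
    cost = next(m["cost"] for m in models if m["id"] == model_id)
    return (
        input_tokens * cost["input"]
        + output_tokens * cost["output"]
        + cache_read_tokens * cost["cacheRead"]
    ) / 1_000_000


# A 10k-input / 2k-output request against zai/glm-4.7:
print(estimate_cost("zai/glm-4.7", 10_000, 2_000))  # → 0.028
```

Note that cache writes are free ($0) on this model, so they are omitted from the estimate.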