| Field | Value |
| --- | --- |
| Model | `meta/llama-3.3-70b` |
| Provider | `vercel-ai-gateway` |
| API | `anthropic-messages` |
| Base URL | `https://ai-gateway.vercel.sh` |
| Input | text |
| Reasoning | No |
| Context window | 128,000 |
| Max tokens | 8,192 |
| Cost / million input | $0.72 |
| Cost / million output | $0.72 |
| Cost / million cache read | $0 |
| Cost / million cache write | $0 |
Model config JSON:

```json
{
  "providers": {
    "vercel-ai-gateway": {
      "apiKey": "YOUR_API_KEY",
      "models": [
        {
          "id": "meta/llama-3.3-70b",
          "name": "Llama 3.3 70B Instruct",
          "reasoning": false,
          "input": [
            "text"
          ],
          "contextWindow": 128000,
          "maxTokens": 8192,
          "cost": {
            "input": 0.72,
            "output": 0.72,
            "cacheRead": 0,
            "cacheWrite": 0
          }
        }
      ],
      "api": "anthropic-messages",
      "baseUrl": "https://ai-gateway.vercel.sh"
    }
  }
}
```
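The `cost` fields are priced per million tokens, so a request's cost is each token count divided by 1,000,000 times the matching rate. A minimal sketch of reading this config and estimating per-request cost; the `request_cost` helper is hypothetical, not part of any gateway or tool API:

```python
import json

# The model config from this page, embedded so the example is self-contained.
CONFIG_JSON = """
{
  "providers": {
    "vercel-ai-gateway": {
      "apiKey": "YOUR_API_KEY",
      "models": [
        {
          "id": "meta/llama-3.3-70b",
          "name": "Llama 3.3 70B Instruct",
          "reasoning": false,
          "input": ["text"],
          "contextWindow": 128000,
          "maxTokens": 8192,
          "cost": {"input": 0.72, "output": 0.72, "cacheRead": 0, "cacheWrite": 0}
        }
      ],
      "api": "anthropic-messages",
      "baseUrl": "https://ai-gateway.vercel.sh"
    }
  }
}
"""

def request_cost(model: dict, input_tokens: int, output_tokens: int) -> float:
    """Estimate USD cost of one request from per-million-token prices.

    Hypothetical helper for illustration; ignores cache reads/writes,
    which are priced at $0 for this model anyway.
    """
    cost = model["cost"]
    return (input_tokens * cost["input"] + output_tokens * cost["output"]) / 1_000_000

config = json.loads(CONFIG_JSON)
model = config["providers"]["vercel-ai-gateway"]["models"][0]

# Worst case for this model: full 128k context in, max 8,192 tokens out.
print(round(request_cost(model, 128_000, 8_192), 4))  # → 0.0981
```

At these symmetric $0.72 rates, even a maximal request costs under a dime, which is why the cache-read and cache-write prices of $0 matter little here.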