T3 Pro Plan Context Window Size

According to the T3 FAQ, the Pro plan provides 1500 standard message credits per month, plus 100 premium credits for Claude. I'm wondering about the context window/token limits. Specifically, what is the maximum number of tokens I can send per message for each model available through T3? Is there a fixed token limit that applies to all models, or does it vary?
Solution
fancyant · 15h ago
I received an official response regarding the context window limits for the T3 Chat models and thought it might be helpful to share. Most models on T3 Chat support their full context window: there is no per-message limit other than the model's overall context window. The exception is the Anthropic (Claude) models, which are capped at 32K tokens. Here's the breakdown for each model:

- GPT-4o-mini — 128,000 tokens
- GPT-4o — 128,000 tokens
- o3-mini — 200,000 tokens
- Claude 3.5 Sonnet — 32,000 tokens
- Claude 3.7 Sonnet — 32,000 tokens
- Claude 3.7 Sonnet (Reasoning) — 32,000 tokens
- DeepSeek v3 (Fireworks) — 128,000 tokens
- DeepSeek v3 (0324) — 64,000 tokens
- DeepSeek R1 (OpenRouter) — 128,000 tokens
- DeepSeek R1 (Llama Distilled) — 32,000 tokens
- DeepSeek R1 (Qwen Distilled) — 16,000 tokens
- Gemini 2.0 Flash — 1,000,000 tokens
- Gemini 2.0 Flash Lite — 1,000,000 tokens
- Gemini 2.5 Pro — 1,000,000 tokens
- Llama 3.3 70b — 128,000 tokens
- Qwen 2.5 32b — 8,000 tokens
- Qwen qwq-32b — 128,000 tokens
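If you want to check prompts against these limits programmatically, the list above can be kept as a simple lookup table. The sketch below is illustrative only: the model keys, the `fits_in_context` helper, and the rough 4-characters-per-token estimate are my own assumptions, not part of T3 Chat's API; for accurate counts, use the provider's actual tokenizer.

```python
# Context window limits (in tokens) per the breakdown above.
# Keys are informal slugs, not official T3 Chat model IDs.
CONTEXT_LIMITS = {
    "gpt-4o-mini": 128_000,
    "gpt-4o": 128_000,
    "o3-mini": 200_000,
    "claude-3.5-sonnet": 32_000,
    "claude-3.7-sonnet": 32_000,
    "claude-3.7-sonnet-reasoning": 32_000,
    "deepseek-v3-fireworks": 128_000,
    "deepseek-v3-0324": 64_000,
    "deepseek-r1-openrouter": 128_000,
    "deepseek-r1-llama-distilled": 32_000,
    "deepseek-r1-qwen-distilled": 16_000,
    "gemini-2.0-flash": 1_000_000,
    "gemini-2.0-flash-lite": 1_000_000,
    "gemini-2.5-pro": 1_000_000,
    "llama-3.3-70b": 128_000,
    "qwen-2.5-32b": 8_000,
    "qwen-qwq-32b": 128_000,
}


def fits_in_context(model: str, text: str, chars_per_token: int = 4) -> bool:
    """Rough check of whether `text` fits in `model`'s context window.

    Uses a crude chars-per-token heuristic (an assumption here);
    real token counts depend on the provider's tokenizer.
    """
    limit = CONTEXT_LIMITS[model]
    estimated_tokens = len(text) / chars_per_token
    return estimated_tokens <= limit
```

For example, a 200,000-character prompt (~50,000 estimated tokens) would fit in Gemini 2.5 Pro's 1M window but not in Claude's 32K cap.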