We are launching two new plans designed to make AI coding faster and more accessible: Cerebras Code Pro ($50/month) and Code Max ($200/month). Both plans give you access to Qwen3-Coder, the world's leading open-weight coding model, running at speeds of up to 2,000 tokens per second, with a 131k-token context window, no proprietary IDE lock-in, and no weekly limits!

Cerebras Makes Code Generation Instant

Even with the best frontier models, you still end up waiting around for completions. And as coding workflows get more agentic, the latency adds up fast. You're not just waiting once; you wait on every LLM call across multi-step edits, tool use, retries, and planning.

At 2,000 tokens per second, code generation becomes instant. And starting at $50/month, anyone can use Cerebras Code and enjoy fast code generation that keeps you in flow.

Powered by a Frontier Model

Qwen3-Coder is Alibaba's flagship coding agent model. The 480B-parameter model delivers performance comparable to Claude Sonnet 4 and GPT-4.1 on coding and agentic tasks, achieving leading results on benchmarks such as Agentic Coding, Agentic Browser-Use, and BFCL.

Bring your own AI IDE

If your code editor or tool supports OpenAI-compatible inference endpoints, you can use it with Cerebras Code. Plug Cerebras Code into anything: Cursor, Continue.dev, Cline, RooCode, or whatever else you're using. No extra setup. Just instant, high-quality code generation inside your own workflow (see the sketch at the end of this post).

Available now

Cerebras Code Pro ($50/month)
- Qwen3-Coder access with fast, high-context completions.
- Send up to 1,000 messages per day, enough for 3-4 hours of uninterrupted vibe coding.
- Ideal for indie devs, simple agentic workflows, and weekend projects.

Cerebras Code Max ($200/month)
- Qwen3-Coder access for heavy coding workflows.
- Send up to 5,000 messages per day.
- Ideal for full-time development, IDE integrations, code refactoring, and multi-agent systems.

Cerebras Code Pro and Code Max are available today, no waitlist. Sign up, ...
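
Here is a minimal sketch of what "bring your own tool" looks like from code, using the standard OpenAI Python SDK pointed at a Cerebras endpoint. The base URL and model identifier below are assumptions for illustration; check your Cerebras account and documentation for the exact values for your plan.

```python
# Minimal sketch: using an OpenAI-compatible client with Cerebras Code.
# The base_url and model name are assumptions; confirm them in your
# Cerebras dashboard and docs before use.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.cerebras.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_CEREBRAS_API_KEY",        # key from your Cerebras account
)

response = client.chat.completions.create(
    model="qwen-3-coder-480b",              # assumed Qwen3-Coder model identifier
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a linked list."},
    ],
)

print(response.choices[0].message.content)
```

The same base URL, API key, and model name can be dropped into the OpenAI-compatible settings of editors like Cursor, Continue.dev, Cline, or RooCode.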