Over the last year, anyone who cares about open weight language models has been watching Chinese labs. Qwen, DeepSeek, and others now define a lot of what "state of the art open MoE" looks like. In the United States, most of the action has centered on polishing other people's checkpoints.

At Arcee AI we want to add something that has been missing from that picture: a serious open weight model family trained end to end in America, by an American company, with weights that businesses and developers can actually own.

That family is Trinity. Trinity Nano and Trinity Mini are available now. Trinity Large is currently training on 2048 B300 GPUs and will arrive in January 2026.

Trinity Mini is our fully post-trained reasoning model. Trinity Nano Preview is something different: a personality-forward chat model that pushes the limits of sparsity, with only 800M non-embedding parameters active per token across 56 layers and 128 experts. It's charming, it's fun to talk to, and it may be unstable in edge cases. This is an experimental release, not a thinking model. Nano Preview is available to download from Hugging Face but won't be hosted on our API.

This is the story of why we decided to go all in on pretraining, how Nano and Mini came to life, and where Trinity is headed next.

Why we decided to own pretraining

For a while, our strategy looked like everyone else's: take a strong open base, post-train it hard, wire it into tools and RAG, and ship. That approach carried us very far. You can get impressive behavior with a good base, careful data, and an instruction stack that matches the product.

At the same time, a few pressures kept building:

- Ceilings on certain workloads: On some high-stakes use cases, we kept iterating on post-training and could see clear diminishing returns. Failure patterns pointed back to missing capabilities in the foundation, not to tuning mistakes.
- Jurisdictional safety: Enterprise buyers are increasingly asking where the base model came from, what data went into it, an...
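Since Nano Preview ships as open weights on Hugging Face, the quickest way to kick the tires is to load it locally with the `transformers` library. The snippet below is a minimal sketch, not an official quick-start: the repo id is an assumption for illustration (check the Arcee AI organization on Hugging Face for the exact name), and `device_map="auto"` assumes you have `accelerate` installed.

```python
# Minimal sketch: load Trinity Nano Preview from Hugging Face and run one chat turn.
# The repo id below is a hypothetical placeholder, not a confirmed model name.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "arcee-ai/Trinity-Nano-Preview"  # assumption; verify the actual repo name

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # spread layers across available GPU(s) or fall back to CPU
)

messages = [{"role": "user", "content": "Introduce yourself in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

One practical note on sparsity: even though only about 800M non-embedding parameters fire per token, all 128 experts still have to sit in memory, so for a MoE like this the binding constraint on local hardware is usually memory rather than compute.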