TL;DR - We should train future large AI models in space to take advantage of abundant solar energy, passive cooling, and near-unlimited room to scale.

Hey all, we're building data centers in space. We're launching our first satellite next year, carrying the most powerful GPUs ever put in space by ~100x. We will launch a larger iteration each year until we reach gigawatt scale.

❌ The Problem

Future hyperscale data centers will put a huge strain on electricity grids, freshwater distribution, and the Western world's permitting systems. It will simply not be possible to deploy multi-gigawatt data centers rapidly the way we build data centers today.

✨ Our Solution

We take advantage of falling launch costs to use inexpensive solar energy in space and low-cost passive radiative cooling, rapidly scaling up orbital data centers almost indefinitely without the physical or permitting constraints faced on Earth. This will ensure we can continue training ever larger models without destroying the environment.

⚙️ Read our white paper

Check out our white paper for more on why space data centers are the future and how we're going about making this happen. See a short video of the design here.

🏆 What we've achieved so far

- Booked our first launch (May 2025) and our second launch (H2 2026)
- Set up our payload manufacturing facility in Redmond, WA
- Designed and started building and testing our first spacecraft, with the fastest GPUs ever launched to space by ~100x
- Created concept designs for our micro data center (2026 launch) and our Hypercluster data center (launching when Starship-class launch vehicles enter commercial service)
- Secured high-value LOIs for H100 compute time in space

🔜 What's next

- Launch and complete our demonstrator mission, which will train the first LLM in space!
- Prototype our micro data center design
- Secure contracts with incumbent hyperscalers

👥 Our Team

Philip, CEO, is a second-time founder who has worked at McKinsey & Co. working on ...
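The solar-power and radiative-cooling claims above can be sanity-checked with a rough back-of-envelope calculation. The sketch below is purely illustrative: the 1 MW load, 300 K radiator temperature, 0.9 emissivity, and 30% cell efficiency are our own assumptions for this example, not figures from the white paper, and it ignores absorbed sunlight, Earth IR, and eclipse periods.

```python
# Back-of-envelope sizing for an orbital data center (illustrative only).

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
SOLAR_FLUX = 1361.0      # solar constant above the atmosphere, W/m^2

def radiator_area_m2(heat_w, temp_k=300.0, emissivity=0.9):
    """Area needed to reject heat_w watts by thermal radiation alone,
    assuming an ideal deep-space view (P = emissivity * sigma * A * T^4)."""
    return heat_w / (emissivity * SIGMA * temp_k**4)

def solar_array_area_m2(power_w, efficiency=0.3):
    """Array area needed to generate power_w watts in full sunlight."""
    return power_w / (efficiency * SOLAR_FLUX)

if __name__ == "__main__":
    load = 1_000_000.0  # a hypothetical 1 MW compute cluster
    print(f"Radiator:    {radiator_area_m2(load):,.0f} m^2")     # ≈ 2,400 m^2
    print(f"Solar array: {solar_array_area_m2(load):,.0f} m^2")  # ≈ 2,400 m^2
```

Under these assumptions, both the radiators and the solar arrays for a 1 MW cluster come out to roughly a few thousand square meters, which is why scaling to gigawatt class is tied to Starship-class launch capacity.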