OpenAI on Tuesday rolled out its o3-pro model for ChatGPT Pro and Team subscribers, slashed o3 pricing by 80 percent, and dropped a blog post from CEO Sam Altman teasing "intelligence too cheap to meter."

"The average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency light bulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon," Altman wrote in the post.

This is in line with prior outside estimates. Epoch AI published a similar figure in February, writing that "a GPT-4o query consumes around 0.3 watt-hours for a typical text-based question, though this increases substantially to 2.5 to 40 watt-hours for queries with very long inputs."

But looking at AI energy usage on a per-query basis grossly oversimplifies concerns about the technology's environmental impact, given the massive number of queries users are entering - over a billion a day as of last December, according to the company.

When MIT Technology Review explored AI energy usage recently, its conclusion did not align with Altman's claim that "intelligence too cheap to meter is well within grasp." Rather, the publication cited research from the Lawrence Berkeley National Laboratory estimating that AI-specific workloads in data centers will consume between 165 and 326 terawatt-hours of energy in 2028 - enough to power 22 percent of all US households.

OpenAI's o3 model isn't too cheap to meter, but thanks to an optimized inference stack, it's 80 percent less expensive than it used to be: $2 per million input tokens and $8 per million output tokens. There are still many cheaper models, however.

Overall, Altman's musings skew toward techno-optimism - surprise! He posits a flood of wondrous discoveries a decade hence arising from AI superintelligence, whatever that is.
"Maybe we will go from solving high-energy physics one year to beginning space colonization the next year; or from a major materials science breakthrough..."
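To see why the per-query framing understates the aggregate footprint, the article's own figures can be scaled up. The sketch below multiplies Altman's per-query numbers (0.34 watt-hours, 0.000085 gallons of water) by the roughly one billion daily queries OpenAI reported; the billion-query figure is taken at face value from the article, and the result is a back-of-envelope estimate, not a measured total.

```python
# Back-of-envelope scaling of the per-query figures from Altman's post
# to the ~1 billion daily queries OpenAI reported. All inputs are the
# article's figures, not independently measured values.

WH_PER_QUERY = 0.34            # watt-hours per average query (Altman's figure)
GALLONS_PER_QUERY = 0.000085   # gallons of water per query (Altman's figure)
QUERIES_PER_DAY = 1e9          # ~1 billion queries/day (per OpenAI)

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
daily_mwh = daily_wh / 1e6                # 1 MWh = 1,000,000 Wh
annual_gwh = daily_mwh * 365 / 1e3        # 1 GWh = 1,000 MWh

daily_gallons = GALLONS_PER_QUERY * QUERIES_PER_DAY

print(f"{daily_mwh:.0f} MWh per day")            # 340 MWh per day
print(f"{annual_gwh:.1f} GWh per year")          # 124.1 GWh per year
print(f"{daily_gallons:,.0f} gallons of water per day")
```

Even at roughly 124 gigawatt-hours a year, this is still far below the Lawrence Berkeley National Laboratory's 165-326 terawatt-hour projection for all AI-specific data center use in 2028, which covers the industry as a whole rather than one provider's chatbot queries.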