OLMo – a fully open LLM outperforming GPT-4o mini

https://news.ycombinator.com/rss Hits: 2
Summary

OLMo 2 is a family of fully-open language models, developed start-to-finish with open and accessible training data, open-source training code, reproducible training recipes, transparent evaluations, intermediate checkpoints, and more. Details are in our technical report.
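Because the weights and training code are openly released, the models can be tried with standard tooling. Below is a minimal sketch using the Hugging Face transformers library; the model identifier allenai/OLMo-2-1124-7B is an assumption here and should be checked against the official release listing.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face model identifier for an OLMo 2 checkpoint;
# verify the exact name on the Allen Institute for AI model page.
model_id = "allenai/OLMo-2-1124-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation from a prompt.
inputs = tokenizer("Language modeling is ", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```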

First seen: 2025-07-09 01:32

Last seen: 2025-07-09 02:33