Endless AI-Generated Wikipedia

https://news.ycombinator.com/rss Hits: 1
Summary

Edit: I temporarily disabled new page generation because of automated traffic, but I'm re-enabling it with a rate limit.

I built an infinite, AI-generated wiki. You can try it out at endlesswiki.com!

Why build an AI-generated wiki?

Large language models are like Borges' infinite library. They contain a huge array of possible texts, waiting to be elicited by the right prompt, including some version of Wikipedia. What if you could explore a model by interacting with it as a wiki?

The idea here is to build a version of Wikipedia where all the content is AI-generated. You only have to generate a single page to get started: when a user clicks any link on that page, the page for that link is generated on the fly, and that page in turn contains links of its own. By browsing the wiki, users can dig deeper into the stored knowledge of the language model.

This works because wikis connect topics very broadly. If you follow enough links, you can get from any topic to any other topic. In fact, people already play a game where they try to race from one page to a totally unrelated page just by following links. It's fun to try to figure out the most likely chain of conceptual relationships between two completely different things.

In a sense, EndlessWiki is a collaborative attempt to mine the depths of a language model. Once a page is generated, all users will be able to search for it or share the link with their friends.

Architecture

The basic design is very simple: a MySQL database with a pages table, and a Golang server. When the server gets a wiki/some-slug request, it looks up some-slug in the database. If it exists, it serves the page directly; if not, it generates the page with an LLM and saves it to the database before serving it. A sketch of this lookup-or-generate flow is below.

I'm using Kimi K2 for the model. I chose a large model because larger models contain more facts about the world (which is good for a wiki), and Kimi specifically because, in my experience, Groq is faster and more reliable than other model inference providers.

Speed...
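Here is a minimal sketch of the lookup-or-generate flow described under Architecture, assuming a pages table keyed by slug. The table and column names and the generatePage helper are assumptions for illustration; only the overall shape (MySQL lookup, LLM generation on a miss, save, then serve) comes from the post.

```go
package main

import (
	"context"
	"database/sql"
	"errors"
	"fmt"
	"net/http"
	"strings"

	_ "github.com/go-sql-driver/mysql"
)

var db *sql.DB

// generatePage stands in for the LLM call (the post uses Kimi K2 served by Groq).
func generatePage(ctx context.Context, slug string) (string, error) {
	// ... call the inference provider and return the page body ...
	return "", errors.New("not implemented")
}

func wikiHandler(w http.ResponseWriter, r *http.Request) {
	slug := strings.TrimPrefix(r.URL.Path, "/wiki/")

	// 1. Look the slug up in the pages table.
	var body string
	err := db.QueryRowContext(r.Context(),
		"SELECT body FROM pages WHERE slug = ?", slug).Scan(&body)

	switch {
	case err == nil:
		// 2a. Hit: serve the stored page directly.
	case errors.Is(err, sql.ErrNoRows):
		// 2b. Miss: generate the page, persist it, then serve it.
		body, err = generatePage(r.Context(), slug)
		if err != nil {
			http.Error(w, "generation failed", http.StatusBadGateway)
			return
		}
		if _, err := db.ExecContext(r.Context(),
			"INSERT INTO pages (slug, body) VALUES (?, ?)", slug, body); err != nil {
			http.Error(w, "save failed", http.StatusInternalServerError)
			return
		}
	default:
		http.Error(w, "database error", http.StatusInternalServerError)
		return
	}

	fmt.Fprint(w, body)
}

func main() {
	var err error
	db, err = sql.Open("mysql", "user:pass@tcp(localhost:3306)/endlesswiki")
	if err != nil {
		panic(err)
	}
	http.HandleFunc("/wiki/", wikiHandler)
	http.ListenAndServe(":8080", nil)
}
```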
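The edit note only says a rate limit was added to new page generation; one plausible shape for it is a token bucket guarding the miss path of the handler above. The limiter settings and the maybeGenerate helper here are assumptions, not the author's implementation.

```go
package wiki

import (
	"net/http"

	"golang.org/x/time/rate"
)

// Allow roughly one new page generation per second with a small burst,
// shared across all clients. Per-IP limiters would be a natural refinement.
var genLimiter = rate.NewLimiter(rate.Limit(1), 5)

// maybeGenerate gates the "cache miss" branch; cached pages are unaffected.
func maybeGenerate(w http.ResponseWriter, r *http.Request, slug string) bool {
	if !genLimiter.Allow() {
		http.Error(w, "too many new pages requested, try again shortly",
			http.StatusTooManyRequests)
		return false
	}
	// ... fall through to the LLM call and INSERT from the handler sketch ...
	return true
}
```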

First seen: 2025-09-27 03:21

Last seen: 2025-09-27 03:21