We find that a highly simplified transformer neural network can compute Conway's Game of Life perfectly, purely from being trained on examples of the game. The model's simplicity lets us inspect its structure and confirm that it really is computing the Game of Life — it is not a statistical model that merely predicts the most likely next state from the examples it has been trained on. We observe that it learns to use its attention mechanism to compute 3x3 convolutions. A 3x3 convolution is a common way to implement the Game of Life, since it can count the neighbours of each cell, and that count decides whether the cell lives or dies.

We refer to the model as SingleAttentionNet, because it consists of a single attention block with single-head attention. The model represents a Life grid as a set of tokens, with one token per grid cell.

The following figure shows a Life game computed by a SingleAttentionNet model:

The following figure shows examples of the SingleAttentionNet model's attention matrix over the course of training. It shows the model learning to compute a 3x3 average pool via its attention mechanism, with the middle cell excluded from the average:

Details

The full code is made available, here. The problem is modeled as:

model(life_grid) = next_life_grid

where gradient descent is used to minimize the loss:

loss = cross_entropy(true_next_life_grid, predicted_next_life_grid)

Life grids are generated randomly, providing a limitless source of training pairs (life_grid, next_life_grid). Some examples:

Model diagram

The model in the diagram processes 2-by-2 Life grids, which means 4 tokens in total per grid. Blue text indicates parameters that are learned via gradient descent. The arrays are labelled with their shape, with the batch dimension omitted.
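For reference, the Game of Life update rule the model must learn can be written directly as a neighbour count followed by a threshold. This is a minimal NumPy sketch, assuming toroidal (wrap-around) grid boundaries; the original code's boundary handling may differ:

```python
import numpy as np

def life_step(grid):
    """One Game of Life update via 8-neighbour counting.

    The neighbour sum (centre cell excluded) is equivalent to a 3x3
    convolution with a kernel of ones and a zero centre -- the same
    operation the model's attention matrix learns to approximate.
    """
    neighbours = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1)
        for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # A cell is alive next step iff it has exactly 3 neighbours,
    # or it has exactly 2 neighbours and is currently alive.
    return ((neighbours == 3) | ((neighbours == 2) & (grid == 1))).astype(grid.dtype)
```

Applying `life_step` twice to a "blinker" (three cells in a row) returns it to its starting state, a standard sanity check for Life implementations.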
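To make the "attention as a 3x3 average pool" observation concrete, here is a sketch of the fixed attention matrix that the trained weights appear to converge towards: each of the n_cells query tokens attends uniformly to its 8 grid neighbours (centre excluded). The toroidal indexing is an illustrative assumption:

```python
import numpy as np

def neighbour_attention(size):
    """Attention matrix averaging each cell's 8 neighbours on a size x size grid.

    Row i gives the attention weights for the token of cell i
    (cells flattened in row-major order); each row sums to 1.
    """
    n = size * size
    A = np.zeros((n, n))
    for i in range(n):
        r, c = divmod(i, size)
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if (dr, dc) == (0, 0):
                    continue  # middle cell excluded from the average
                j = ((r + dr) % size) * size + (c + dc) % size
                A[i, j] = 1.0 / 8.0
    return A
```

Multiplying this matrix by a flattened 0/1 grid gives each cell's neighbour average, so the neighbour count needed by the Life rule is just that average times 8.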
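The random training-pair generation described above can be sketched as follows. The grid size and live-cell density here are illustrative assumptions, not values taken from the original code:

```python
import numpy as np

def _life_step(grid):
    # Toroidal Game of Life update via 8-neighbour counting.
    neighbours = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)
    )
    return ((neighbours == 3) | ((neighbours == 2) & (grid == 1))).astype(grid.dtype)

def make_training_pair(size=8, p_alive=0.5, rng=None):
    """Return one (life_grid, next_life_grid) training pair of 0/1 cells."""
    rng = np.random.default_rng(rng)
    life_grid = (rng.random((size, size)) < p_alive).astype(np.int64)
    return life_grid, _life_step(life_grid)
```

Because grids are sampled fresh each step, the model cannot memorise a finite training set; it only sees each configuration once with overwhelming probability.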
Training

On a GPU, training the model takes anywhere from a couple of minutes to 10 minutes, or fails to converge, depending on the seed and other training hyperparameters.
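The training objective above (cross-entropy between the predicted and true next grid) can be sketched as a single PyTorch training step. The function and variable names are illustrative assumptions, not taken from the original code; `model` is assumed to map a batch of flattened grids of token ids to per-cell logits over {dead, alive}:

```python
import torch
import torch.nn.functional as F

def train_step(model, optimizer, life_grid, next_life_grid):
    """One gradient-descent step on a batch of Life transitions.

    life_grid, next_life_grid: (batch, n_cells) int tensors of 0/1 cells.
    model(life_grid) is assumed to return (batch, n_cells, 2) logits.
    """
    logits = model(life_grid)
    # Flatten so every cell is one classification example: dead vs alive.
    loss = F.cross_entropy(logits.reshape(-1, 2), next_life_grid.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```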