Recently, collaborators and I proposed a new type of neural network called Kolmogorov-Arnold Networks (KANs), which are somewhat similar to, but mostly different from, Multi-Layer Perceptrons (MLPs). The technical differences between MLPs and KANs can be found in our paper and in many discussions over the internet. This blog post does not delve into technicalities; instead, I want to lay out some quick philosophical thoughts, open to discussion. I will attempt to answer the following questions:

Q1: Are KANs and MLPs the same?
Q2: What’s the philosophical difference between KANs and MLPs?
Q3: Which is more aligned with science, KANs or MLPs?

Q1: Are KANs and MLPs the same?

The argument that “KANs and MLPs are the same because they have the same expressive power” is similar to the argument “a human being and a cup are the same because they are both made up of atoms.” It is well accepted in physics that each (energy) level has its own physical theory, so even if two systems share the same theory at the most microscopic level, they are not necessarily the same at higher levels because of emergent phenomena. Back to the MLPs vs. KANs case: even though MLPs and KANs have the same expressive power (a foundational aspect), other aspects that emerge from it might be quite different. These emergent properties include optimization, generalization, interpretability, etc.

Q2: What’s the philosophical difference between KANs and MLPs?

Reductionism vs. Holism. While MLPs are more aligned with holism, KANs are more aligned with reductionism. The design principle of MLPs is “more is different”. In an MLP, each neuron is simple because it has a fixed activation function. However, what matters is the complicated pattern of connections among neurons. The magical ability of an MLP to perform a task is an emergent behavior, attributed to the collective contribution of all neurons. By contrast, in a KAN, each activation function is complicated because it is itself a learnable function (see the sketch below). By sparsification and pr...
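
To make the architectural contrast above concrete, here is a minimal sketch of the two designs side by side: an MLP layer with learnable weights but a fixed activation, and a KAN-style layer with a learnable 1D function on every edge. This is my own illustration, not the official implementation; it assumes PyTorch, uses Gaussian radial basis functions in place of the B-splines from the paper to keep the code short, and the class names `MLPLayer`/`KANLayer` are hypothetical.

```python
# Minimal sketch (not the official KAN code): fixed vs. learnable activations.
import torch
import torch.nn as nn


class MLPLayer(nn.Module):
    """Learnable linear weights, fixed (non-learnable) activation on each neuron."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.act = nn.SiLU()  # the same fixed nonlinearity shared by every neuron

    def forward(self, x):
        return self.act(self.linear(x))


class KANLayer(nn.Module):
    """A learnable 1D function on every edge; outputs are sums of edge functions.

    Gaussian RBFs stand in for the paper's B-splines, purely for brevity.
    """

    def __init__(self, in_dim, out_dim, num_basis=8, grid_range=(-2.0, 2.0)):
        super().__init__()
        centers = torch.linspace(*grid_range, num_basis)
        self.register_buffer("centers", centers)  # fixed basis-function centers
        # one coefficient per (output, input, basis): parameterizes phi_{ij}
        self.coef = nn.Parameter(torch.randn(out_dim, in_dim, num_basis) * 0.1)

    def forward(self, x):
        # x: (batch, in_dim) -> evaluate each basis at each scalar input
        basis = torch.exp(-((x.unsqueeze(-1) - self.centers) ** 2))  # (batch, in, basis)
        # phi_{ij}(x_i), summed over inputs i, for every output j
        return torch.einsum("bik,oik->bo", basis, self.coef)


x = torch.randn(4, 3)
print(MLPLayer(3, 2)(x).shape)  # torch.Size([4, 2])
print(KANLayer(3, 2)(x).shape)  # torch.Size([4, 2])
```

The point of the sketch is only to show where the learnable pieces live: in the MLP layer the nonlinearity is fixed and the weights carry all the learning, whereas in the KAN-style layer every edge carries its own learnable function.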