The Universe Only Does Four Things
The Reason There Are Four Statistical Shapes of Reality
Arithmetic is usually treated as a tool. A convenience. A way to solve problems or describe them more cleanly. It is not typically considered metaphysical, although thinkers like David Deutsch have gestured in that direction. But there are reasons to believe arithmetic does more than describe the world. It may be part of what gives the world structure in the first place.
The four basic arithmetic operations — addition, subtraction, multiplication, and division — do not just yield answers. When applied across large numerical domains, they produce distinctive statistical distributions. These distributions are not contingent on real-world data or assumptions about cause and effect. They are intrinsic to the operations themselves. This is the starting point for a deeper proposal: that arithmetic operations are not just mathematical functions, but ontological generators.
What does that mean? It means each operation reflects a fundamental mode of interaction. A way in which quantities, domains, or systems can relate to one another. Not in the sense of metaphor — not "subtraction is like entropy" — but in the stronger sense that the distribution produced by subtraction resembles the signature of entropy because they share an underlying structural logic.
This post explores the possibility that the arithmetic operations are not passive descriptions of the world but active mechanisms that define the kinds of patterns that can exist in it. I am not a professional philosopher of mathematics, and some of the terminology below — such as "constraint profile" and "interaction domain" — was suggested to me by ChatGPT in the course of working through this idea. But the patterns themselves should be real and replicable, if maybe a little counterintuitive.
Start with subtraction. If you subtract one randomly chosen number from another, and discard any negative results, a long-tail distribution emerges. Most of the outcomes are small and few are very large. This is not noise but rather a signature of loss: partial removal across unbounded domains. The pattern produced resembles what statisticians (apparently) call a power-law or heavy-tailed distribution. Such shapes appear in things like extinction events, income inequality, blackouts, bushfires, and so on. Subtraction does not cause these phenomena, but it shares their structure. The process of narrowing — of taking something away without bounds — naturally generates a world in which small losses are common and vast losses are rare but real.
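Here is a minimal sketch of that experiment in Python. The sampling range, sample size, and bin count are my assumptions; the description above doesn't pin them down.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 1_000_000
a = rng.uniform(0, 1e6, n)  # assumed range; any wide positive domain will do
b = rng.uniform(0, 1e6, n)

diff = a - b
diff = diff[diff >= 0]  # discard negative results, as described above

counts, _ = np.histogram(diff, bins=50)
print(counts)  # bin frequencies fall steadily as the difference grows:
               # small losses are common, large ones increasingly rare
```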
Division, by contrast, produces a spike near one. Divide two large positive numbers, and the result is typically close to one. But when the denominator approaches zero, the output explodes. The resulting distribution resembles a Poisson spike followed by steep decay. This is not a gradual drift but a sharp discontinuity: segmentation, or categorisation. The imposition of internal structure on a continuous field. When applied cognitively, this resembles the carving of discrete concepts out of the seamless flow of experience. The mind does this constantly, and division may model the statistical shape of that process. The large spike represents a standardised expectation, and the tapering tail represents the rarity of anomalies. When structure is imposed on continuity, this is the shape that appears.
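The same kind of sketch for division, again with ranges of my choosing; the denominator is kept strictly positive so the near-zero cases stay finite.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 1_000_000
num = rng.uniform(0, 1e6, n)
den = rng.uniform(1e-6, 1e6, n)  # strictly positive, assumed range

ratio = num / den
counts, _ = np.histogram(ratio, bins=50, range=(0, 10))
print(counts)  # mass concentrates in the lowest bins and decays steeply;
               # near-zero denominators throw rare huge values past the cutoff
```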
Addition is more placid. When you add together randomly selected numbers from a wide range — including both positive and negative values — the result is flat. The extremes cancel. The highs and lows average out. What emerges is a uniform distribution. This is not the silence of nothingness but the stillness of balance. No peaks. No spikes. Just flatness. This is the logic of equilibrium: balance achieved not through homogeneity but through inclusion. The uniform distribution describes systems where extremes are not eliminated but integrated. It is the shape of decentralisation, fairness, and homeostasis. Addition is not exciting. But it is foundational to systems that endure (and seeing as this is a Buddhist-themed Substack, I must wave my hand in the direction of the noble truth known as nirodha).
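A sketch of the addition experiment, with the same caveat that the symmetric range and sample size are assumptions of mine:

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 1_000_000
a = rng.uniform(-1e6, 1e6, n)  # symmetric range: negatives and positives included
b = rng.uniform(-1e6, 1e6, n)

total = a + b
counts, _ = np.histogram(total, bins=50)
print(counts)  # bin frequencies for the pairwise sums over the symmetric range
```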
Multiplication brings us back to complexity. Multiply randomly chosen numbers from a wide domain, and a bell curve appears. Most results cluster around zero, and while extremes are possible, they are increasingly rare. As more variables are introduced, the curve becomes tighter: the centre thickens and the outliers thin. This is the Central Limit Theorem in action (strictly, the theorem applies to sums, but a product is a sum in log space, which is why the bell shape emerges), and its implications go deeper. Multiplication produces the statistical signature of compounding effects, of what happens when everything affects everything else: a probabilistic centre forms. The bell curve is not just a statistical idealisation. It may be the natural result of structure acting on structure. When systems interact without external constraint, this is the shape that tends to appear.
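And a sketch for multiplication. The number of factors per product, k, is my choice; raising it is what tightens the curve.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n, k = 1_000_000, 5  # k factors per product; k is an assumption
factors = rng.uniform(-1.0, 1.0, (n, k))

products = factors.prod(axis=1)
counts, _ = np.histogram(products, bins=50)
print(counts)  # results cluster tightly around zero, with thin outliers;
               # increasing k concentrates the centre further
```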
These four distributions — long-tail, Poisson spike, flatness, and bell curve — are not inventions. They emerge natively from the arithmetic operations when applied across unbounded ranges. This invites a more radical hypothesis: perhaps these operations are not just descriptions, but the modes of interaction that reality allows. That is, perhaps there are only four fundamental ways things can work on each other — and arithmetic has been quietly modelling them all along.
One way to think about this is in terms of ‘constraint presence’. When performing any operation, the result depends on what is included or excluded:
Subtraction: The bounded part (a circle, a local value) is missing. What remains is the gap — asymmetrical, stretched, unpredictable.
Division: The unbounded part (a field, a continuum) is segmented. The result is discontinuity. A spike that decays.
Addition: Both ends cancel. Nothing is missing or imposed. The result is balance.
Multiplication: Nothing is removed. Everything acts on everything. The result is compounding. A probabilistic centre.
This produces a 2×2 interaction matrix. Not of "one" and "infinity" as poetic stand-ins, but of presence and absence, of boundedness and unboundedness. And there are only four logical configurations. That may explain why there are four operations in basic arithmetic: they model the four minimal ways interaction can be structured.
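To make the matrix concrete, here is my own tabulation of the four configurations, nothing more than a restatement of the list above in code:

```python
# The four constraint profiles and the statistical signatures
# the text above associates with them.
CONFIGURATIONS = {
    "subtraction":    ("bounded part removed",     "long tail"),
    "division":       ("unbounded part segmented", "spike, then decay"),
    "addition":       ("extremes cancel out",      "flat / uniform"),
    "multiplication": ("everything acts on all",   "bell curve"),
}

for op, (constraint, shape) in CONFIGURATIONS.items():
    print(f"{op:>14}: {constraint:<24} -> {shape}")
```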
This four-fold framing may resonate with earlier intellectual traditions. Aristotle posited four causes — material, formal, efficient, and final — as necessary for understanding why things happen. The Buddha taught four noble truths — suffering, origin, cessation, and the path — as a framework for grasping lived experience. Immanuel Kant argued that the categories of quantity, quality, relation, and modality are not just labels but the conditions for coherent experience. Contemporary cognitive scientist John Vervaeke has suggested that intelligence involves multiple modes of knowing — propositional, procedural, perspectival, and participatory — each of which frames relevance in a different structural way. And David Sumpter, in his book Four Ways of Thinking, offered a framework — statistical, emergent, optimisation-based, and probabilistic — that not only echoes the four operations but was itself influenced by the work of Stephen Wolfram, who has long argued that simple rules can give rise to irreducible complexity.
In this framing, subtraction corresponds to the logic of entropy. Division models emergence and categorisation. Addition represents inclusion and flattening. Multiplication captures compounding and evolution. Each of these operations produces a statistical distribution that matches patterns we see in physical systems, biological processes, and cognitive experiences. The match is not exact, but it is consistent. And it suggests that structure is not imposed on the world after the fact — it may emerge directly from these fundamental modes of interaction.
I realise this pushes close to what philosophers of mathematics might call a ‘foundational ontology’. I can’t defend that claim in formal terms. But the alignment is striking: four operations, four shapes, four interaction modes. These are not abstract typologies. They are patterns you can generate on a computer with no real-world data. They appear, reliably, across large domains of random numbers.
That makes this less like numerology and more like physics. You are not mapping data to theory. You are watching structure emerge from operation. The arithmetic operations may be not just tools but the deep grammar of transformation.
So this is the proposal. Not that arithmetic describes the world, but that it generates the conditions under which the world can be described. Not that the four distributions are imposed by models, but that they are the statistical constraints produced by the only four ways things can relate. In this view, arithmetic becomes ontological: a theory not of quantity, but of possibility. A structural theory of interaction.
If the proposal holds, it may help explain why the same four patterns show up in thermodynamics, cognition, social systems, and biological evolution. It would also offer a new bridge between mathematical practice and philosophical inquiry. Not by adding mystical overtones to arithmetic, but by stripping it back to its raw operational forms.
I don’t claim this idea is final, or fully justified, or even that it is new. Indeed, I have mentioned some thinkers, including the Buddha himself, who have used their own vocabularies to track much the same idea. I only offer it as my own little view of this mysterious four-fold truth operating beneath the surface of the universe. If true, the four operations aren’t just how we do arithmetic. They may be how reality does itself.
Noble indeed.


