At each level, the waste product does two things simultaneously: it collapses its own level's search space, and it constitutes the typed initial conditions for the next.
Which of these functions is performed is a matter of perspective. These are the same product seen from two directions. The boundary is a phase change.
Without the constraint, the computation at that level does not converge. Autocatalytic networks in a racemic mixture cannot find stable polymer attractors — the search space is exponentially large. Endosymbiotic hosts without the ratchet of ATP dependency would revert to surface-limited energetics whenever the symbiont was metabolically costly. Neural crest cells receiving unambiguous fate signals would stay put, differentiate in place, and the vertebrate head would not exist. In each case, the waste that closes paths for descendants is the same constraint that made this level's computation finishable.
The cascade works because each level's waste makes its own problem solvable, and the solution happens to be the feedstock the next level's problem requires. The solutions at each phase are never optimal - ATP isn't the most efficient battery, just the most plentiful - but each is the one that was satisfactory enough for sufficient waste product to accumulate that another problem could be solved. Each satisficed solution enables the next until the system is complex enough to describe itself; the output is the following table.
| Tier | Process | Fixed point | Satisficing | Waste/Constraint | Output |
|---|---|---|---|---|---|
| Chemistry | Vent proton flux | ATP synthesis | Available first, good enough | Monopolized activation landscape | Universal coupling agent |
| Chirality | Autocatalytic networks | Homochiral polymers | Whichever hand was locally excess | Exponential search collapsed | Foldable biochemistry |
| Boundary | Amphiphilic self-assembly | Membrane architecture | Two solutions, neither optimal | Binary commitment | Bacteria/archaea fork |
| Oxygenation | Photosynthetic chemistry | Aerobic metabolism | Awful thermodynamics, unlimited supply | Anaerobic extinction | High-yield respiration |
| Complexity | Endosymbiotic integration | Eukaryotic cell | Not fatal enough to stop | Energy dependency | Genome expansion |
| Multicellularity | Adhesion networks | Tissue organization | Already present, co-opted | Commitment cost | Division of labor |
| Development | Regulatory networks | Body plan grammar | Positional markers reinterpreted | Morphospace limits | Modular elements |
| Anatomy | Boundary specification | Neural crest | Migratory because they didn't fit | Migration constraints | Vertebrate head |
| Representation | Neural surplus | Self-model | More capacity than survival requires | Narrative compulsion | You are reading this |
Each row's waste constrains its own computation and feeds the next row's search.
Three structural requirements for computation:
| Component | Role |
|---|---|
| Gradient | Power. What drives flow through the system. |
| Grammar | Constraints. What shapes and limits the possible paths. |
| Routing | Solution space. What determines which path flow takes. |
The components themselves compose a machine, a static thing.
Flow is what runs on the machine. The computation is the machine in motion, not at rest.
When flow moves through structure under gradient, the output is structure. The waste product of computation is the paths worn into the landscape.
The output constrains the next phase of the input; the system refines itself through use.
Structure enables computation. Computation produces structure. The cycle continues.
Define a product vector x whose components are the structured outputs available at any point in the cascade:
$ x = \begin{bmatrix} p_1 \\ p_2 \\ p_3 \\ p_4 \\ p_5 \end{bmatrix} = \begin{bmatrix} \text{ATP / phosphate-transfer capacity} \\ \text{global O}_2\text{ field} \\ \text{mitochondrial energy surplus} \\ \text{developmental regulatory toolkit} \\ \text{neural crest–type migratory cells} \end{bmatrix} $
Each level of the cascade acts on this vector as a linear operator $M_i$. Most entries of each $M_i$ are zero. Diagonal entries carry existing products forward. Off-diagonal entries route one product into a new coordinate. The oxygenation level, for instance, has a nonzero entry $M_{21} = \alpha$: ATP-powered cyanobacterial metabolism routes into the O₂ coordinate. Before this entry exists, $p_2 = 0$ by construction—the O₂ coordinate does not exist in the accessible space. After the entry is written, it does.
The composed cascade is:
$ x^{(\text{out})} = M_{\text{crest}}\, M_{\text{dev}}\, M_{\text{endo}}\, M_{\text{oxy}}\, x^{(\text{initial})} $
where $x^{(\text{initial})} = [p_1^{(0)},\ 0,\ 0,\ 0,\ 0]^\top$—ATP exists, nothing downstream does yet.
The address space at level $L_i$ is the column space of $M_1 \cdots M_{i-1}$ applied to the initial state: the set of configurations the cascade can actually reach from where it started. Each new off-diagonal entry expands that column space by adding a dimension that was previously unreachable. This is what "creative term" means in the denial log. It is not that possibilities are unlocked from some pre-existing menu. It is that new coordinates are instantiated in the product space by the specific chemistry of the waste stream at each level.
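The composed cascade can be sketched numerically. This is a toy sketch, not the paper's model: the `level_operator` helper, the routing weights (`0.3`, `0.5`, ...), and the order of application are all illustrative assumptions.

```python
import numpy as np

n = 5
I = np.eye(n)

def level_operator(routes):
    """Identity (carry existing products forward) plus off-diagonal routing entries."""
    M = I.copy()
    for (dst, src), weight in routes.items():
        M[dst, src] = weight
    return M

# Each level writes one new off-diagonal entry: waste routed into a coordinate
# that was identically zero before. Weights are placeholders, not measurements.
M_oxy   = level_operator({(1, 0): 0.3})  # ATP-powered metabolism -> O2 field
M_endo  = level_operator({(2, 1): 0.5})  # O2 field -> mitochondrial surplus
M_dev   = level_operator({(3, 2): 0.4})  # energy surplus -> regulatory toolkit
M_crest = level_operator({(4, 3): 0.2})  # toolkit -> migratory crest cells

x = np.array([1.0, 0, 0, 0, 0])          # x_initial: only ATP exists
for M in (M_oxy, M_endo, M_dev, M_crest):
    x = M @ x                            # each pass populates a new coordinate

print(np.count_nonzero(x))               # 5: every coordinate is now addressable
```

Applying the operators in any other order leaves downstream coordinates at zero, which is the matrix version of the claim that order matters: the O₂ entry must be written before anything can route out of it.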
The sparsity pattern of each $M_i$ is not arbitrary. It is determined by the chemical identity of the waste product. O₂ can only feed processes that can use a strong gaseous oxidant as a terminal electron acceptor. The nonzeros in the O₂ column are therefore not "any chemistry"—they are a narrow set of redox reactions with a specific stoichiometry. The matrix encodes what the cascade has been claiming in prose: new coordinates are not generic possibilities. They are materially constrained by the exhaust that instantiates them.
The address space of life at any moment is the column space of everything that came before it. The question "why is biology this way and not some other way" is a question about which off-diagonal entries were written, in which order, by which waste streams. The denial log is the record. The matrix is how you run the query.
you see, when we build taller buildings, we can put more people in the same amount of space. no dear, they won't fall out of the sky. no dear, that was because there was nothing between the apple and poor mr newton's head
--- an estate agent, on or around 1665, attempting to sell an old dear on a second-story walk-up. Unsuccessfully, as it turns out, word having gotten around about that poor man's unfortunate accident, which was killing the multistory market for fear of unscrupulous building practices now that there was real weight to the decision of where to live that hadn't been there previously
Temperature is average energy per degree of freedom
--- Tom Cochrane's slightly unsuccessful follow up to his first hit song, proving my theory that maybe only one good thing /can/ come from Manitoba
Thermodynamics is irreversible. What's been recorded on the ledger of time cannot be undone. The cascade described here is made of thermodynamics and therefore must share its properties, including its irreversible nature - though as the cascade itself grows more complex, its strict adherence to the properties at its foundation starts to lag, a necessary property of a complex system that gains degrees of freedom at higher levels of recursion.
Each operator $M_i$ is projective. It opens new coordinates (the off-diagonal entries that route waste into new dimensions) while simultaneously projecting away from previously accessible configurations. The column space expands: new things become addressable. But the null space of the composed transformation also expands: old configurations become structurally unreachable.
The composed operator is not invertible. You cannot recover the pre-oxygenation address space from the post-oxygenation one because the anaerobic routes are in the kernel of the composed transformation. The address itself has been removed from the routing table.
Mathematically, this is absolute — no input to the composed operator produces a pre-oxygenation configuration. Physically, the claim is softer but practically identical: the old configurations aren't in a different universe; they're behind every boundary the cascade has laid down since. Reaching them would require reversing each projective step, which means undoing each denial, which means paying the energy cost of crossing back through every boundary in sequence. The configurations are in basins so deep that no fluctuation in this universe provides the energy to reach them.
This is stronger than "expensive to reverse." Thermodynamics says reversal costs energy. The cascade says the cost exceeds any available budget, and increases with every level added. The denial log is append-only not because an authority forbids deletion, but because the energy cost of reversal grows monotonically while the energy available at each level decreases. The universe would have to get younger.
This gives the transition metric its deductive foundation. If each $M_i$ both adds a column and kills a subspace, then $R_n$'s monotonic decrease is not an empirical observation waiting for a counterexample — it is a consequence of projective composition. The ratio of constraints to degrees of freedom shifts predictably because that is what projection operators do.
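A minimal numerical sketch of this projective structure, assuming a three-coordinate product space [ATP, anaerobic routes, O₂] and an illustrative routing weight:

```python
import numpy as np

# Toy oxygenation operator. The 0.3 routing weight is a placeholder.
M_oxy = np.array([
    [1.0, 0.0, 0.0],   # ATP carried forward
    [0.0, 0.0, 0.0],   # anaerobic routes withdrawn: this row kills a subspace
    [0.3, 0.0, 1.0],   # ATP-powered metabolism routed into the new O2 coordinate
])

x_pre  = np.array([1.0, 1.0, 0.0])   # pre-oxygenation: anaerobic world, no O2
x_post = M_oxy @ x_pre

# The operator adds a column (O2 is now reachable) and is singular
# (the anaerobic direction is in its null space), so it has no inverse.
print(np.linalg.matrix_rank(M_oxy))           # 2, not 3

# The best-possible "reversal" via least squares still returns nothing
# in the anaerobic coordinate: that address is gone, not hidden.
x_best = np.linalg.pinv(M_oxy) @ x_post
print(abs(x_best[1]) < 1e-9)                  # True
```

The pseudo-inverse is the most charitable reversal available, and even it recovers zero anaerobic content: the information was not scrambled, it was projected out.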
You're walking through the park and there's one of them big ol' checker boards.
Big enough to walk on
A rather sparky man with some weird head thing and possibly a tail is holding a pitchfork under a sign saying 👹 Santa's Playground 🤘 (which everyone spells wrong in a rush).
He says, "if you can touch each square and collect the flag, you can win a goat. Each square has a penny and you must leave a sequentially numbered slip in return. To walk back, you must collect the slips in reverse order and leave a penny. The price to leave the board is your final penny. Now empty your pockets"
So you do what a normal person does and get on the board, collecting the pennies [1, 2, 3] and leaving a carefully numbered slip in each square. Then you carefully retrace the path travelled, emptying your pockets in your wake, the count falling through the last three squares [2, 1, 0].
All that's left to pay is your soul
Shoot
--- he, having forgotten his wallet
A routing table entry has: a destination, and where to send it.
A denial log entry has: a destination, and the fact that you can't get there.
Whether you tell it where to go or where not to go, the packet arrives at the same destination. Your only choice is how many entries you want to maintain in the table.
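The equivalence is easy to state as sets. A toy sketch with made-up destination names:

```python
# The universe of possible routes is fixed; names are illustrative.
universe = {"aerobic", "anaerobic", "photosynthetic", "chemosynthetic"}

routing_table = {"aerobic", "photosynthetic"}      # entries say where to go
denial_log    = {"anaerobic", "chemosynthetic"}    # entries say where not to go

# Same policy, two encodings: the allow-list equals the complement
# of the deny-list. Only the number of maintained entries differs.
reachable_via_table  = routing_table
reachable_via_denial = universe - denial_log

print(reachable_via_table == reachable_via_denial)  # True
```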
BGP dynamics. Autonomous systems announce reachability. Each AS forwards based on its own policy and what it has received. A pinch point is an AS that announces all traffic to downstream lineages routes through it. Every descendant AS accepts the announcement because no alternative path exists. Purine NTP chemistry, chirality, and the membrane are pinch points. All subsequent routing flows through them.
Route poisoning. The Great Oxygenation Event withdrew anaerobic routes from the global table. Organisms that could not converge on oxygen-tolerant routes were blackholed. Oxygen did not merely create new routes. It actively destroyed existing ones — a topology change that made previously reachable destinations unreachable, propagated globally, with no rollback.
Peering. Horizontal gene transfer is peering. Two autonomous systems not in a parent-child relationship exchange routes directly. The transferred gene bypasses hierarchical propagation. It has a bounded TTL maintained by selection pressure rather than structural inevitability. Vertical inheritance is provider-customer. Horizontal gene transfer is settlement-free.
The first AS boundary. The membrane. Before it, there are no autonomous systems — a flat broadcast domain with no subnets, no forwarding decisions. After the membrane, there is an inside that maintains its own routing table: which molecules cross, which are rejected, which are synthesized internally. Every subsequent boundary is a nested AS inside the first one.
Core internet routes do not get withdrawn. They get supplemented. The deep routes that every AS depends on are as permanent as the cascade's pinch points. Mutability exists at the edges, for short-TTL entries.
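Route poisoning, as a toy table mutation (AS names and routes are illustrative, not real BGP data):

```python
# Toy global routing table: reachability as announced routes.
table = {"anaerobic-respiration": "via vent-chemistry"}

# The Great Oxygenation Event as a topology change: one announcement,
# one withdrawal, propagated globally, with no rollback.
def great_oxygenation(table):
    table["aerobic-respiration"] = "via cyanobacteria"  # new route announced
    del table["anaerobic-respiration"]                  # old route withdrawn
    return table

great_oxygenation(table)
print("anaerobic-respiration" in table)  # False: the destination is blackholed
```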
BOFHs are only grumpy because God denied Hell the very idea of denial-based routing, right after he denied the Universe the ability to do anything else
Physics describes an exchange across a boundary powered by a gradient differential as thermodynamics. Computer science describes it as computation. It's the same process, the only differences are the specific physical components that provide the necessary capabilities in each domain.
Grammar provides the constraints on the routing that constrain the solution space enough to make the answer available. The substrate for the operation provides both the medium for the computation and its power. The gradient differential within the medium creates the conditions for it to flow. The computation consumes the medium to constrain its ability to compute by becoming its own output. The differential between output and input feeds the next cycle.
| Domain | Flow | Grammar | Routing | Gradient |
|---|---|---|---|---|
| Universe | Energy | Physics/chemistry | Topology | Cooling |
| Cell | Metabolites | Enzymes | Regulatory networks | Thermodynamic disequilibrium |
| Neural network | Activations | Weights | Attention | Loss |
| Transformer | Token embeddings | Layer weights | Attention heads | Training loss |
| River | Water | Canyon walls | Sediment deposits | Elevation |
| Jack | Down Hills | Fetching things mainly | Paths + flags | Mom's slipper |
The river eats the canyon to become the canyon that shapes the river
Where did my river go?
--- Heraclitus, upon returning to his river
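The river/canyon loop in the table above - flow carving the structure that then routes the flow - runs in a few lines. The channel count, initial resistances, and erosion rate are arbitrary:

```python
import random
random.seed(0)

# Four candidate channels with slightly unequal initial resistance.
resistance = [5.0 + random.random() * 0.01 for _ in range(4)]
start = list(resistance)

for _ in range(100):
    path = min(range(4), key=lambda i: resistance[i])  # flow picks the easiest channel
    resistance[path] -= 0.1                            # and carves it deeper

carved = [i for i in range(4) if resistance[i] < start[i]]
print(len(carved))   # 1: every pass reinforced the same channel
```

Whichever channel starts fractionally easier captures all subsequent flow; the first advantage compounds into the canyon.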
To extend this table into additional domains, we propose the following methodology through a worked example.
Your junk drawer
| Dimension | Constraint |
|---|---|
| Physical | Does it fit? |
| Metaphysical | Might you need it some day? |
[Note: grammar has high rejection threshold and low coupling constant to reality]
| Source Address | Destination Address |
|---|---|
| Sharp things | Up front |
| Twine | In the back |
| That cable | Never where you need it |
| Take-out menus | Probably three drawers down near the garbage collection calendar |
The sunny, warm Spring day that's the difference between "This is fine" and "Now where did the keys to the shed go?"
Enough pom-poms, googly eyes, and old paper-clips for a delightful rainy crafternoon with the kids, when April's brief glimmer of hope does what April does, and googly-eyed, afro-wearing paperclip people ensue, demonstrating a complexity clearly superior to that which spawned them.
The cascade continues for another rainy day.
To specify a protein without hierarchical structure, you describe the position of every atom — roughly 10⁴ atoms, three coordinates each, continuous precision. The search space is ~$10^{10^4}$ configurations. Intractable.
With hierarchical compression, you specify a sequence of amino acid types from an alphabet of twenty. The search space becomes discrete: $20^n$ for length $n$. Evolution then searches fold space, which is smaller, then function space, which is smaller still. The search is polynomial in the number of functional constraints.
This is how denial-based compression works. When the addressable space is the size of the universe, it's easier - not to mention more computationally efficient - to specify what not to do, and then build upon whatever works itself out. The network will organize itself around the most efficient route. The compression occurs when you learn to make wires out of the routing tables. Each amino acid contains all the quantum mechanics, chemistry, and thermodynamics that produced it. Those levels are already solved, crystallized into the structure. You don't search them. You inherit them. The waste product of lower-level physics is a compressed primitive that the next level uses without reopening.
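The compression arithmetic is easy to check. All four numbers below are assumed orders of magnitude, not measurements:

```python
import math

n_atoms = 10_000       # atoms in a typical protein (order of magnitude)
bits_per_coord = 32    # assumed precision per continuous coordinate
alphabet = 20          # amino acid types
n_residues = 300       # assumed chain length

# Bits to specify every atomic position directly,
# vs. bits to specify a sequence over the compressed alphabet.
atomic_bits   = n_atoms * 3 * bits_per_coord
sequence_bits = math.ceil(n_residues * math.log2(alphabet))

print(atomic_bits, sequence_bits)   # 960000 vs 1297
```

Roughly 10⁶ bits collapse to roughly 10³, because the sequence inherits, rather than re-searches, the physics beneath it.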
The cascade's three registers — deterministic, contingent, arbitrary — are positions on a measurable axis.
Define $R_n$ as the ratio of constraints to degrees of freedom at level $n$:
$R_n = \frac{\text{constraints at level } n}{\text{degrees of freedom at level } n}$
When $R_n \gg 1$, constraints vastly outnumber degrees of freedom. The system has one place to go. This is physics: nucleosynthesis, stellar fusion, planetary differentiation. The marble finds the basin because the basin is the only feature in the landscape.
When $R_n \approx 1$, constraints and degrees of freedom are roughly balanced. Multiple basins exist. Which one the system lands in depends on local conditions, initial fluctuations, path. This is biology: ATP chemistry could have gone to GTP, chirality could have gone the other way, the endosymbiont could have been digested. Each level has a finite set of viable solutions and lands in one of them for contingent reasons.
When $R_n \ll 1$, degrees of freedom vastly outnumber constraints. The space contains more viable configurations than any search process can meaningfully prefer among. This is culture: there is no thermodynamic reason to prefer the QWERTY keyboard, the adversarial legal system, or the sonnet form. Each exists because something landed there, and landing there made it harder to land anywhere else.
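As a sketch, the three registers are just threshold bands on $R_n$. The thresholds here (10 and 0.1) are assumptions for illustration; the document only claims the ordering:

```python
def regime(R):
    """Classify a level by its constraint-to-freedom ratio R_n (thresholds assumed)."""
    if R > 10:   return "deterministic"   # physics: one basin in the landscape
    if R > 0.1:  return "contingent"      # biology: a few basins, path-dependent landing
    return "arbitrary"                    # culture: more basins than preferences

print([regime(R) for R in (100.0, 1.0, 0.001)])
# ['deterministic', 'contingent', 'arbitrary']
```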
The cascade does not cross these thresholds cleanly. But $R_n$ decreases monotonically as the cascade progresses. Each level's waste creates more degrees of freedom for the next level's search than it introduces new constraints. The space grows faster than the grammar that organizes it. This is not a problem the cascade solves. It is a feature the cascade produces: the universe gets more interesting as it runs.
What keeps the search tractable as $R_n$ falls is exactly what the Address Space section describes. Each level's fixed point becomes a vocabulary item — a compressed primitive that the next level searches in combination rather than searching the atomic space beneath. The degrees of freedom multiply, but the search operates on modules, not atoms. Polynomial in hierarchical depth. The purse has pockets. The pockets have compartments. You check the compartments.
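The tractability claim is the difference between a product and a sum. With assumed branching factors per level:

```python
# Viable options at each cascade level (made-up branching factors).
levels = [20, 12, 8, 5]

# Flat search: every combination of choices at every level at once.
flat_space = 1
for b in levels:
    flat_space *= b

# Hierarchical search: lock in each level's fixed point, then search the
# next level over compressed primitives. Cost is additive, not multiplicative.
hierarchical_cost = sum(levels)

print(flat_space, hierarchical_cost)  # 9600 vs 45
```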
The paper makes claims at three levels of specificity. Each level generates different falsification conditions.
The cascade claim — that each level's output is materially constrained by the chemical identity of the preceding level's waste — is falsified if:
The invariant — that each level's waste simultaneously collapses its own search space and constitutes typed initial conditions for the next — is falsified if:
The transition metric — that $R_n$ decreases monotonically as the cascade progresses — is falsified if:
The routing table correspondence — that the denial log is structurally identical to a BGP routing table, not merely analogous — is falsified if:
The ignition point — that the membrane constitutes a binary phase transition from tethered to autonomous computation — is falsified if:
The paper is not falsified by finding a better explanation for any individual level. It is falsified by finding a level transition that does not fit the schema, or by demonstrating that the schema's properties come apart — that waste can feed without constraining, that search can converge without pruning, that the routing correspondence is missing a structural leg. One clean counterexample to the invariant is sufficient.
The medium is the message
--- Marshall McLuhan, oddly enough also from Manitoba, describing computation in a way not dissimilar to his province-mate Tom
A machine that reads and writes symbols on a tape, one cell at a time, moving left or right, changing state based on what it reads.
--- Alan Turing, who was not from Manitoba, still somehow getting in on the act by describing a computer as a machine whose life is spent traveling its own infinite highway
The universe is a busy woman and she only has time for one algorithm - a difference of potential in a medium that is dissipated across a constraining boundary.
The act of crossing the boundary changes the medium in a way that leaves a residue; this residue has a set of properties, and those properties constrain what can cross next.
Each level solves a search problem through satisficing, by finding stable configurations under gradient pressure. Those configurations become the primitives, the compressed grammar acting as syntax, that the next level searches in combination.
The search space grows at each level as new addresses of possibility are added to the system, but search complexity does not, because the search operates on the waste of the level below rather than on the raw physics beneath that. The cascade is self-bootstrapping: each level enables the next because stable structures are reusable structures, and reusable structures are the only kind that persist long enough to be used. The universe won't waste energy on a structure that cannot be built upon; a branching tree relies on depth for stability when it is built upon.
The universe doesn't need an operator to compute itself in ever increasing complexity. The optimizer is the pressure that drives the computation itself and the medium upon which it is performed. The compiler compiles itself for the next pass, which happens to be everything you see around you. The address space of the system is the column space of everything that survived.
You are a configuration in that column space. So is the reading of this sentence. So is whatever you do next. Make the most of it - and put googly eyes on it.
Witness, One Love