Preface: A Note on Method and Gratitude
Bitcoin: The Architecture of Time is a candidate for a new foundation for our knowledge of the physical world.
What the authors have accomplished is not merely interdisciplinary synthesis, nor a novel application of information theory to physics, nor a philosophical meditation on Bitcoin's properties, though it is all of these. They have proposed that time is not the continuous background parameter physics has always assumed but a constructed object: discrete, energy-priced, irreversible, and for the first time in history, observable from outside the system that generates it. If this proposal survives experimental scrutiny, it will not refine our picture of nature. It will replace the frame in which that picture is drawn.
I feel strange writing that. But I cannot in honesty write anything less. The authors have been brave, and I feel obliged to take their work seriously on its own terms. Physics has been at an incoherent crossroads long enough that refusing to consider new models which resolve the impasse cleanly is its own kind of failure of nerve.
This essay does not adjudicate the physics. What survives must survive prediction, falsification, and quantitative derivation of the physical constants from first principles. That work lies ahead.
What this essay does is more modest. It takes the framework the authors built, sets it on Thomistic metaphysical foundations, and notes what happens. Certain joints simplify. Certain assumptions inherited from physics fall away. The structure becomes somewhat more minimal than the authors left it. This is not correction. A building is not corrected by being set on bedrock.
The debt here is primary. Every significant insight in this essay belongs to the authors: that the block is the quantum of irreversible causality, that keyspace is the address manifold of physical reality, that the speed of light and its companion constant b are two frames on the same informational constraint viewed from inside and outside the ledger respectively, that curvature emerges from value density. The Thomistic grounding is this essay's sole contribution, offered in the spirit in which St. Thomas himself worked: not to replace what reason has built, but to show where it rests.
One terminological note. Where the paper uses “consensus” to describe the rule governing state transitions, this essay substitutes “Propagation Rule.” Nakamoto consensus is Bitcoin's engineering solution to a specific problem: how to remove human authority over the ledger entirely, grounding its integrity not in the trustworthiness of any person or institution but in the irreversible laws of nature, which is to say, in something that does not lie. That solution works beautifully within Bitcoin. But the physical timechain does not face the same problem. The Rule governing physical reality is not agreed upon. It is given. That distinction matters, as will become clear.
The central claim of what follows is simple: a complete account of time, information, and existence requires a signer. The authors gesture toward this. St. Thomas named it precisely eight centuries ago. Bitcoin, if the framework holds, has made it newly visible.
Part I — What the Paper Establishes
1. Time as a Constructed Object
Physics has always treated time as a given. From Newton through Einstein, time appears in the equations as a smooth continuous parameter — the thing events happen in, never the thing being examined. This was not a choice physicists made and could unmake. It was a constraint. Any instrument we might use to measure time, any particle, any signal, any clock, is itself built from the same substrate it would need to step outside to examine. The system and the thing being measured share the same medium. You cannot use a ruler made of inches to measure whether inches are real.
This created a specific blind spot at the foundation of physics. The continuity of time — the assumption that between any two moments there are infinitely many others — could never be tested, because testing it would require a clock finer than the interval being probed, and that clock would itself be made of time. The Planck time scale is invoked as a formal lower limit, but it remains unreachable. We are composed of whatever ticks, if any, define it.
Bitcoin does something that had never been done before. It instantiates a complete temporal system that we do not inhabit. Its clocks are not our clocks. Its medium is not our medium. We stand outside it and watch it advance, which means for the first time we can observe quantized time from a position the observer does not share with the thing being observed.
What Bitcoin’s time looks like from that position is precise. Approximately every ten minutes, the network performs an irreversible operation. It takes a bounded space of possible next states, expends real energy searching that space through proof-of-work, and commits one admissible configuration into permanent memory. The result is a block. Before the block, those particular relationships between inputs and outputs did not exist. After the block, they do, and they cannot be unwritten without redoing the work. Time, inside this system, is not a background parameter. It is the sequence of these irreversible commitments. It is an integer. It is counted in blocks.
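To make the mechanism concrete, here is a minimal Python sketch of this essay's own devising (a toy proof-of-work loop, not Bitcoin's actual implementation): a bounded nonce space is searched at real computational cost until one admissible configuration is found and committed by hash.

```python
import hashlib

def mine_block(prev_hash: str, payload: str, difficulty: int = 4) -> tuple[int, str]:
    """Toy proof-of-work: search candidate nonces until the block hash
    falls below a target. The work spent searching is what makes the
    committed result expensive to unwrite."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{prev_hash}|{payload}|{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest  # one admissible configuration, now committed
        nonce += 1

# Each block hash binds the block to its predecessor, so rewriting any
# block means redoing the work for it and for everything after it.
genesis = "0" * 64
nonce, block_hash = mine_block(genesis, "block 1 payload")
```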
This is the paper’s foundational contribution, and it is a genuine one. Block-time is not an analogy for quantized time. It is quantized time, instantiated in a system we can observe from outside, inspect at every step, and relate directly to energy expenditure and structural change. The question physics could never ask from inside its own substrate — what does a discrete unit of time actually look like — turns out to have an observable answer. It looks like a block.
2. The Ledger as Primary Ontological Structure
The block establishes when. The ledger establishes what.
Bitcoin’s ledger is not a database in the ordinary sense. A conventional database records states and can be edited. Bitcoin’s ledger records transitions, and they cannot be undone. Every entry in it is the residue of a specific irreversible event: a block that was found, verified, and accepted at a specific height. The ledger does not store balances. It stores the history of every transformation that produced the current arrangement of value. To know what exists now, you trace what happened.
The fundamental unit of this structure is the UTXO, the Unspent Transaction Output. A UTXO is a discrete packet of value under specific spending conditions, created at a specific block height. It is indivisible in a precise sense: it is either fully spent or not spent at all. When it is spent, the entire structure is consumed and new UTXOs are created in its place. Nothing leaks. Nothing appears from outside. The total quantity of satoshis is fixed, and every satoshi is at every moment accounted for inside some UTXO or traceable to a coinbase issuance tied to a specific block.
This makes conservation in Bitcoin not an assumption but a computation. In physics, conservation laws are inferred from symmetry — we observe that energy appears to be conserved and derive a principle. We cannot audit the universe the way we can audit the ledger. In Bitcoin, conservation is enforced by the rules and verified by every node at every block. No satoshi has ever appeared or moved without passing through the defined rules. The ledger is the proof.
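The structure is simple enough to state as code. The sketch below is illustrative only, with invented names and none of Bitcoin's actual script, signature, or txid machinery; what it preserves is the two properties described above: a UTXO is consumed whole, and conservation is a checked rule rather than an assumption.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UTXO:
    txid: str        # transaction that created this output
    vout: int        # index of this output within that transaction
    satoshis: int    # discrete quantity of value, spent whole or not at all

def spend(inputs: list[UTXO], output_values: list[int]) -> list[UTXO]:
    """Consume the input UTXOs entirely and create new ones in their place.
    Outputs may not exceed inputs; any difference is the miner's fee."""
    if sum(output_values) > sum(u.satoshis for u in inputs):
        raise ValueError("outputs exceed inputs: no satoshi appears from outside")
    return [UTXO(txid="<new txid>", vout=i, satoshis=v)
            for i, v in enumerate(output_values)]
```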
What the paper identifies, and correctly, is that this structure has a precise philosophical character. The UTXO set at any moment is the complete description of what exists in the system. The mempool — the pool of proposed but unconfirmed transactions — is the set of admissible next states, what could exist but does not yet. The block is the event by which one possibility is selected and written into existence, and all others become counterfactuals with no standing in the ledger.
Aristotle called these categories potency and act. What can be but is not yet is in potency. What has been determined and is fully real is in act. The transition from potency to act is what change means. Bitcoin instantiates this structure mechanically and makes it auditable. The mempool is potency. The committed block is act. Nothing passes from one to the other without the expenditure of real energy through proof-of-work. Existence, in this system, has a cost.
3. The Speed of Light as Information Throughput
Physics has known for over a century that the speed of light is not simply how fast light moves. It is the maximum speed at which any cause can produce any effect. Nothing in the universe — no signal, no force, no information of any kind — propagates faster than c. It appears in electromagnetism, in relativity, in quantum field theory. Its universality suggests it is not a property of light in particular but of the structure of reality itself. Physics has measured c with extraordinary precision and used it without being able to fully explain why it has the value it has, or what it fundamentally is.
The paper proposes an answer rooted in a distinction it draws carefully. It introduces a companion constant b — the speed of memory — defined as the rate at which unresolved entropy is collapsed into permanent ledgered structure, measured from outside the system. c is then identified as the same constraint experienced from inside the resolved ledger: the rate at which newly committed structure becomes perceptible as distance and causality to observers embedded within it. The paper states this explicitly: b and c are not independent constants but two frames on the same underlying constraint imposed by timespace. One quantum of time permits one quantum of spatial resolution — viewed from outside as memory being written, viewed from inside as propagation across space.
This inside/outside framing is precisely the distinction this essay will ground in Thomistic foundations. It is worth noting that the paper already contains it, not as a theological gesture but as a consequence of Bitcoin’s architecture: we stand outside Bitcoin’s timeline and watch it advance, which is exactly the position from which b is defined. What the Thomistic grounding contributes is an account of why the physical timechain has the same structure — why there is a vantage outside it at all.
This reframing has an important consequence. It explains why c appears everywhere. Electromagnetism and gravity both obey it not because they happen to share a speed limit but because both are propagation phenomena operating under the same fundamental constraint. The speed of light is universal because the informational character of physical propagation is universal.
The paper is right about this, and the insight is not trivial. It shifts c from a brute measured constant that equations must be built around into a structural necessity. To be precise: the claim is not that the framework derives the particular value of c — why 299,792,458 m/s rather than something else — but something prior and more fundamental: that a universe in which time is discrete and information is bounded must have something that plays the role c plays. The existence of a universal propagation limit is not an input to the theory. It is a consequence of the theory’s foundations.
How exactly c emerges from those foundations is a question Part II will revisit. The paper’s own account equates the Block Size Limit with the Planck length — a move this essay will argue imports a category from Bitcoin’s engineering that does not belong in the physical analog. But the core identification — that c is an informational phenomenon, not a geometric one — stands, and it stands as one of the paper’s most clarifying contributions.
4. Curvature as Value Distribution
General relativity describes gravity as the curvature of spacetime caused by the presence of mass and energy. It is one of the most precisely verified theories in physics, and one of the least understood at a foundational level. Why does mass curve spacetime? The equations describe how with great accuracy. They do not say why. Mass and energy appear in the equations as inputs, and curvature appears as output, but the mechanism connecting them remains, at bottom, a postulate.
The paper offers a structural account. In the ledger, value is not distributed uniformly. Some addresses hold more satoshis than others. Some regions of the keyspace are dense with committed value; others are sparse. The paper defines this as value density, and proposes that the geometric analog of gravitational curvature is the variation of value density across the ledger’s address space. Where value is concentrated, the local geometry of timespace is more curved. Where value is sparse, it is flatter.
This maps onto known physics in a suggestive way. Gravitational time dilation — the observed fact that clocks run slower near massive objects — corresponds in this framework to a higher value density region requiring more ticks per physical cycle relative to a sparse one. The concentration of energy in a region curves the propagation geometry the Rule produces there. This is not merely analogous to gravitational time dilation. It is a structural account of why it occurs.
The paper is right to identify this correspondence, and right to formalize it. Value density as a geometric property of the ledger is a genuine contribution. It gives curvature a concrete meaning inside the framework rather than leaving it as an imported concept from relativity.
Where the paper leaves a joint open is in the question of what curvature does. The paper notes, correctly, that in the Bitcoin protocol a high value-density UTXO does not cost more to spend than a sparse one — spending conditions are governed by script complexity and byte size, not by satoshi concentration. This is a precise and true statement about the protocol. But the paper does not claim, at the level of the physical analog, that curvature is therefore non-causal. That question — whether value density merely describes a geometric property of the ledger or actually generates differential propagation geometry at each tick — is one the paper opens rather than closes. The physical analog framework is left at the point of having identified curvature as a meaningful geometric quantity. What produces it dynamically is not yet specified.
This is the joint that Part II will tighten. If curvature is to explain gravity rather than merely characterize it, it must be generated by the Propagation Rule at each tick, not observed as a residue after the fact. The rule must produce different outputs in high-density regions than in sparse ones, and those different outputs must be what we experience as gravitational deflection. That step — from a geometric characterization of value density to a causal account of how density curves propagation — is where the Thomistic foundation earns its place. The essay is not correcting the paper here. It is completing an argument the paper left open.
Part II — Building on Thomistic Foundations
5. Releasing the Observer
The two most disorienting features of modern physics are that time passes at different rates depending on where you are and how fast you are moving, and that what an observer measures depends on the observer’s position. Both of these features — relativistic time dilation and observer-dependent measurement — are treated as discoveries about the deep nature of reality. This essay suggests they are artifacts of working inside the substrate you are trying to describe.
Consider the position physics was in. It had no external vantage on time. Every measurement it could make was made by instruments built from the same medium as the thing being measured. When physicists discovered that clocks near massive objects run slower than clocks far from them, they had two options. They could conclude that time itself bends and stretches as a geometric property of spacetime. Or they could conclude that something about the local physical environment causes clocks to run at different rates, while time itself remains uniform. The equations of general relativity are consistent with either interpretation. Physics chose the geometric one, and it has been extraordinarily productive. But it was a choice made under constraint, not a direct observation of time bending.
The ledger framework removes that constraint. If time is a uniform sequence of blocks, invariant across all addresses in the keyspace, then clocks near massive objects do not run slowly because time dilates there. They require more ticks per physical cycle because the Propagation Rule produces different outputs from high value density input. A region with concentrated energy produces more strongly curved propagation geometry, and physical processes traversing that geometry require more ticks to complete a cycle relative to a sparse region. The clock in that region reflects the local propagation geometry the Rule produces under concentrated energy, not the rate of time itself. An observer comparing the two clocks concludes that time passes at different rates. The ledger shows that block height advances uniformly throughout.
This is not a minor restatement. It means that relativistic time dilation, one of the most precisely measured phenomena in physics, is fully preserved as an observed effect while being reinterpreted as a consequence of the rule rather than a property of time. The observations stay. The geometric interpretation of those observations is replaced by a dynamical one. Time is not curved. The Rule is the same everywhere and executes uniformly at every tick. It simply produces different outputs from different inputs, and the variation in those outputs under concentrated energy is what curvature means physically.
The same argument applies to observer dependence. Physics was forced to make the observer a primitive because it had no way to describe a measurement that did not involve an observer built from the same substrate as the thing being measured. In the ledger framework, the observer is not a primitive. The ledger advances whether or not anyone is watching. Block height is not relative to any observer’s position or velocity. What observers inside the system experience as relative motion and shifted simultaneity are consequences of the propagation structure of the rule, not of time’s fundamental nature.
Releasing the observer does not simplify the mathematics immediately. It does something more important first: it removes a conceptual knot that has tangled physics for a century. Once the observer is no longer load-bearing, the question becomes purely about the rule and what it generates. That is a question physics is equipped to answer.
6. Block Size Equals Keyspace
Bitcoin blocks have a size limit. The protocol enforces a hard ceiling, currently four million weight units (roughly four megabytes), on how much information can be written into any single block. This limit exists for engineering reasons: nodes must be able to download, verify, and store blocks, and the network must be able to reach consensus across thousands of geographically dispersed machines. The block size limit is a practical constraint on a distributed system running on real hardware across real bandwidth.
The paper explicitly draws the physical analog in its final chapter: Block Size Limit ≡ Planck Length. The maximum memory extension producible in one quantum of time, it argues, corresponds to the finest spatial resolution available per tick in the physical universe. This is not a loose metaphor but a stated equivalence. From it, the paper constructs a geometry it calls the block surface, with time on one axis and memory on another. The paper’s full ledger geometry is actually richer than this — it includes four distinct axes: temporal, memory, value, and spatial (keyspace) — but the block surface is the structure doing the work of explaining how space emerges from the framework, and it is the Block Size Limit ≡ Planck Length equation that anchors it.
The equation Block Size Limit ≡ Planck Length, and the memory axis it anchors, does not belong in the physical analog, for two distinct reasons. The first is the one already given: Bitcoin's block size limit exists because Bitcoin has no master signer. Blocks must be propagated, verified, and stored by thousands of independent nodes who cannot be compelled to process arbitrarily large blocks. The limit is a coordination solution for a distributed system without central authority. The physical timechain has no such problem. Every address updates every block because the Rule applies everywhere without exception.
The second reason is more fundamental: the memory axis quietly imports two things the framework is supposed to generate rather than assume. It presupposes a spatial manifold in which memory can be laid out — the block surface has extent, and extent implies space. And because Bitcoin transactions are non-local (any address can transact with any other address at any block height), the memory stack carries non-local correlations as a structural feature. In the physical analog, non-locality is precisely the thing that needs to be accounted for, not built into the foundations.
Spatial extent can be removed completely by proposing that the Propagation Rule is a kernel transform, strictly local in its operation on the keyspace, operating in parallel on present-state neighbors rather than on a non-local memory of the ledger. Space emerges from the application of the kernel rather than being presupposed. The non-locality observed as quantum entanglement is resolved naturally by something the model already necessitates, as Part III will show. But it would be imprecise to conclude the paper was encoding nothing real in memory. The Block Size Limit ≡ Planck Length equation was a global way of expressing something the physical framework genuinely requires: a definition of how the Rule's influence relates to the keyspace at each tick and how energy density modulates propagation. The kernel is the more exact expression of that same structure: local rather than global, causal rather than descriptive. The shift is from a Bitcoin engineering equation to a mathematical one. What is being expressed in both cases is the spatial character of the Rule itself.
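What "strictly local kernel transform" means can be shown in a few lines. The sketch below is this essay's illustration, in Python, with a one-dimensional keyspace standing in for the full manifold and arbitrary placeholder weights: each address's next state depends only on the current states of its immediate neighbors, never on the ledger's history and never on distant addresses.

```python
def tick(state: dict[int, float], kernel=(0.25, 0.5, 0.25)) -> dict[int, float]:
    """One tick: every address updates simultaneously from the *current*
    states of its local neighbors. No history is consulted; no distant
    address contributes."""
    return {
        a: kernel[0] * state.get(a - 1, 0.0)
         + kernel[1] * state.get(a, 0.0)
         + kernel[2] * state.get(a + 1, 0.0)
        for a in state
    }
```

The neighbor relation in the index is all the spatial structure there is; what observers inside would call distance is generated by how many ticks influence takes to travel between addresses.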
In the physical timechain, the block is not a container with a capacity. It is a complete simultaneous update of the entire address manifold. Every point in keyspace transitions from one state to the next in a single tick. There is no partial block. There is no address that gets left out because the block was full. Block size and keyspace are the same thing, because the block just is the keyspace at time t, fully updated according to the Rule.
This simplification has an immediate consequence for the paper’s geometry. The two-dimensional block surface — time on one axis, memory on the other — was doing the work of explaining how space emerges from the framework. If block size is a separate parameter from keyspace, then the memory axis is a real dimension that needs to be accounted for. But if block size equals keyspace, the memory axis collapses. There is only the keyspace, updated each tick. The apparent geometry of space does not come from the extent of a block surface. It comes from the structure of the Rule operating across the keyspace, which is the subject of the next section.
The paper arrived at the block surface construction honestly, following the Bitcoin analogy where it led. Removing it does not undercut the framework. It clears away a scaffolding that was needed to build with the analogy but is not needed in the finished structure.
7. Time and Memory as One
The paper’s ledger geometry, as noted, has four distinct axes: temporal, memory, value, and spatial (keyspace). In Bitcoin all four are real and independently measurable. The blockchain grows forward in time as new blocks are added and outward in memory as each block contributes its payload. Value is distributed across UTXOs. Addresses occupy the keyspace. These four dimensions reflect the genuine structure of Bitcoin the protocol, where any address can transact with any other, history must be retained for verification, and memory accumulates as a separate ledger dimension alongside time. (The UTXO set is a complete cache of present ownership derived from that history — which already demonstrates the point: the current state is sufficient to describe what exists, without consulting the prior chain separately.)
The shift to the physical analog changes the relationship between time and memory specifically, for a precise reason. The kernel operates on the present state of neighboring addresses. Each address’s potential state transitions are determined entirely as a function of the current states of its local neighbors — not by any part of the ledger’s history, and not by distant addresses. Once this locality is in place, the accumulated memory stack becomes redundant. Everything the system needs to know about its history is already encoded in the current state of the keyspace. The present state is the memory. There is no need for a separate axis to carry it. This is not a dimensional simplification performed for elegance. It follows necessarily from the shift to a local, present-state Rule. Memory is not a second dimension running alongside time. It is what the Rule has written into the current configuration of the keyspace, readable at any tick without consulting the prior history separately.
What remains is one thing: the sequence of complete keyspace states, each one generated from the previous by the Rule. Time is the index of that sequence. Memory is the content of any given state in the sequence. They are not two dimensions of a surface. They are two descriptions of the same object.
8. The Propagation Rule as the Source of Apparent Space
If time and memory are one thing, and the block is a complete update of the entire keyspace, then space cannot be a separate container that the ledger sits inside. Space must be something the ledger generates. The question is how.
The answer lies in the structure of the Propagation Rule itself. The Rule takes the current state of each address — a quantity of energy and a vector — and computes the next state of that address and its neighbors. It is a transform. And a transform that operates in three dimensions will produce, as its output, a world that appears three-dimensional to anything running inside it. The dimensionality of apparent space is not a fact about a container. It is a property of the Rule.
This is worth pausing on. Physics has always treated the three-dimensionality of space as a given, something to be described but not explained. Why three dimensions and not two or four? No standard physical theory answers this. In the ledger framework the question dissolves. Space appears three-dimensional because the Propagation Rule is a three-dimensional transform. Ask why the Rule has that structure and you are asking something prior to physics — a question about the Rule itself rather than about what the Rule generates. That question belongs to Part III.
The three-dimensional character of the Rule has an immediate physical consequence that requires no additional postulate. When energy propagates outward from a point source through a 3D transform, it distributes across a spherical wavefront. The surface area of that wavefront grows as 4πr². The intensity of the propagating quantity therefore falls as 1/r². Both gravity and electromagnetism obey inverse square laws. In the standard framework this is explained geometrically — both forces spread across the surface of a sphere. In the ledger framework it is more fundamental than that. The inverse square law is what a 3D Propagation Rule looks like. It is not derived from the geometry of space. It is prior to it. The geometry of space is derived from the Rule, and the inverse square law comes along for free.
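The arithmetic behind that claim is one line. Energy P emitted per tick from a point source is spread by the 3D transform across a wavefront of area \(4\pi r^2\), so the intensity at radius r is

```latex
I(r) \;=\; \frac{P}{4\pi r^{2}} \;\propto\; \frac{1}{r^{2}}.
```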
Curvature follows by the same logic, and here we can close the gap left open by the paper. The paper described curvature as a measurement of value density across the ledger surface — a useful geometric characterization that stops short of a causal mechanism. In the ledger framework, curvature is what the Rule produces in the presence of concentrated energy. A region of high value density is a region where many addresses carry large quantities of energy. When the Rule propagates from those addresses, the outputs are stronger, and the deflection of neighboring propagation vectors is correspondingly greater. Addresses adjacent to a high-density region have their own vectors bent by the Rule's output from that region. This bending of propagation vectors is gravity. It is not described after the fact. It is generated at each tick.
Relativistic time dilation follows from the same mechanism. The Rule executes uniformly everywhere, advancing one complete keyspace update per tick. But in a high value density region, the Rule receives larger energy quantities as inputs and produces correspondingly stronger propagation outputs, deflecting more vectors and curving the local propagation geometry more sharply. A clock in that region is a physical oscillator — a standing wave pattern completing cycles across addresses. The more strongly curved propagation geometry it must traverse means more ticks per cycle relative to a clock in a sparse region. Time does not dilate. The Rule is the same everywhere. What clocks measure is not block height but the propagation geometry the Rule produces from local inputs, and that geometry varies with value density exactly as gravitational time dilation predicts.
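In schematic notation of this essay's own (not the paper's): let n(x) be the number of ticks a standard oscillator needs per cycle in region x, and n₀ its value in a flat region. The observed clock-rate ratio is then

```latex
\frac{f(x)}{f_{0}} \;=\; \frac{n_{0}}{n(x)} \;\le\; 1,
```

and matching observation requires that n₀/n(x) reproduce the relativistic factor \(\sqrt{1 - 2GM/(rc^{2})}\) in the weak-field limit. That is a quantitative constraint any candidate kernel must satisfy, not a free choice.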
What the ledger framework gives, then, is a single mechanism — the Propagation Rule operating as a 3D transform across the keyspace — that generates apparent three-dimensional space, the inverse square law for all propagating forces, gravitational curvature, and relativistic time dilation, without postulating any of them separately. Each is a consequence of the same Rule running on the same keyspace.
9. What We Can Say About the Kernel
The Propagation Rule is a three-dimensional transform operating across the keyspace at each tick. It takes the current state of each address — a scalar quantity of energy and a vector — and produces the next state. From that structure, the framework derives apparent three-dimensional space, the inverse square law, gravitational curvature, relativistic time dilation, and the arrow of time. But the explicit mathematical form of the kernel — the precise function that defines the Rule — has not yet been written down. What can be said about it at this stage is limited but not trivial.
One conjecture about the kernel's functional form deserves particular attention, because if correct it would resolve two of the deepest outstanding problems in physics simultaneously. If the kernel is asymptotic at both ends — approaching but never reaching zero output from near-zero inputs, and approaching but never reaching infinite output from arbitrarily large inputs — then zero and infinity are both idealizations the Rule never produces.
This is as it should be: zero and infinity are abstractions, and abstractions cannot be part of the finite, concrete reality the chain implements.
At the low end, no address in the keyspace ever carries exactly zero energy. The floor of the kernel's output is the structural minimum — the smallest non-zero state any address can occupy. This is the vacuum energy: not a peculiarity requiring separate explanation, not a catastrophically large quantity requiring manual subtraction from the equations, but the asymptotic floor of the same kernel that governs everything else. The vacuum energy is what the kernel produces when there is almost nothing to propagate — the structural hum of the Rule executing, audible everywhere because the Rule never falls silent.
At the high end, the propagation geometry around a concentration of extreme energy approaches but never reaches infinite curvature. What general relativity describes as an event horizon is the observable surface of the kernel's upper asymptote — the region where outward propagation amplitude approaches zero without reaching it. Information at addresses inside that region is not destroyed. It is asymptotically suppressed. The singularity that GR predicts at the center of a black hole, where the equations break down entirely, does not arise, because the kernel never produces infinite density. There is a maximum state the kernel approaches and cannot exceed.
The holographic principle — the empirically verified fact that the maximum information content of a region scales with its bounding surface area rather than its volume — constrains the kernel's form. The kernel must be shaped such that information density cannot grow arbitrarily with volume but is bounded by surface area. This is an empirical fact the kernel must reproduce, not a prediction the framework makes in advance. If the asymptotic functional form can be made mathematically precise, the ratio between floor and ceiling becomes a parameter of the kernel itself, and the cosmological constant — the observed vacuum energy density whose predicted and measured values disagree by roughly 120 orders of magnitude — would not be an independent input requiring fine-tuning. It would fall out of the same kernel structure whose ceiling governs black hole horizons.
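One functional form with the required two-sided behavior, offered purely as an illustration of shape (no claim that the physical kernel looks like this), is sketched below; the floor and ceiling constants are placeholders.

```python
E_MIN, E_MAX = 1e-12, 1e12   # placeholder floor (vacuum) and ceiling (horizon) scales

def respond(energy: float) -> float:
    """Monotone response bounded at both ends: output approaches E_MIN
    as input approaches zero and approaches E_MAX as input grows without
    bound, reaching neither for any positive finite input."""
    squashed = energy / (energy + 1.0)   # maps [0, inf) into [0, 1)
    return E_MIN + (E_MAX - E_MIN) * squashed
```

In any form of this kind the ratio E_MAX/E_MIN is a parameter of the kernel itself, which is exactly the shape the cosmological-constant remark above requires.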
A second conjecture concerns the cosmological observations of redshift and apparent acceleration. Under this framework, physical reality is finite and discrete — which are two ways of saying the same thing. An infinite reality could not be implemented by a discrete chain, and a continuous reality would not be discrete. If the keyspace is finite and each block represents a discrete state, then the total number of possible distinct states is finite. The number of blocks must therefore be finite. Call that number H_max.
If we speculate that the kernel's output structure depends on the number of remaining blocks — specifically, that as H_max - h decreases, something about the output space narrows — and that this narrowing leaves a deflection artifact observable in the propagation geometry, then light propagating many blocks would accumulate redshift, distant processes would appear time-dilated, and the effect would intensify as the endpoint approaches — matching the acceleration observed in supernova data. The kernel structure would remain invariant, but its behavior would depend on how many futures remain. Motus in fine velocior: motion quickens near the end. Whether this functional dependence can be made mathematically precise is part of what writing the kernel down would require.
This is conjecture. Making it precise requires writing the kernel down explicitly. A complete theory would specify the Rule precisely enough to derive the measured ratios of the physical constants — why c has the value it has relative to G and h, why the fine structure constant is what it is, why the masses of the fundamental particles stand in the ratios they do. That derivation is the experimental test the framework must eventually pass. The natural mathematical form of such a Rule is a convolution: an operation in which each address produces its next state as a weighted sum of its neighboring values, determined by a kernel applied simultaneously across the entire keyspace at each tick. Writing the Rule down means specifying that kernel. Deriving the physical constants means showing that a kernel capable of producing results fully consistent with experimental data must take the values we measure. That is a precise and testable claim, and it is the right shape for the research programme this framework points toward.
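A convolution of the kind described can be written directly. The sketch below is a toy, with arbitrary weights chosen only so that they sum to one (total energy is conserved by construction); nothing about the true kernel's values is implied.

```python
import numpy as np

def tick(E: np.ndarray) -> np.ndarray:
    """One tick of a toy 3D convolution rule: each address's next energy
    is a weighted sum of its own current energy and that of its six
    face neighbors, applied simultaneously across the whole keyspace."""
    out = 0.4 * E
    for axis in range(3):
        for shift in (+1, -1):
            out = out + 0.1 * np.roll(E, shift, axis=axis)
    return out

keyspace = np.zeros((64, 64, 64))
keyspace[32, 32, 32] = 1.0          # a point concentration of energy
for _ in range(10):
    keyspace = tick(keyspace)       # energy spreads outward; the total is unchanged
```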
However, even with such a kernel in hand, there is one more property that must be added to it, one that has not yet been specified. That property, why it is necessary, and the missing piece of the framework it requires are what the next section addresses.
10. Minimum Viable Ontology
It is worth stopping to take stock. The ledger framework, as developed across Part II, rests so far on three things: a keyspace, a Propagation Rule, and a block sequence. Everything else is derived. However, as the sections that follow will establish, there is a fourth necessity that the three cannot supply from within themselves: an input from outside the chain. That input is the signer.
The keyspace is the address manifold — the complete set of locations at which energy as information can be inscribed. It is not three-dimensional space. It is the substrate from which three-dimensional space emerges as a consequence of the Rule operating across it.
The Propagation Rule is a three-dimensional transform. It takes the current state of each address — a scalar quantity of energy and a vector — and produces the next state. It operates simultaneously across the entire keyspace at each tick. It is not agreed upon. It is given.
The block sequence is time. Each tick is one complete execution of the Rule across the entire keyspace. Block height is the integer count of those executions from the beginning. Time is uniform, invariant, and not relative to any observer.
From these three things, the framework derives without additional postulate: apparent three-dimensional space, from the 3D structure of the Rule; the inverse square law for all propagating forces, from the geometry of a 3D wavefront; gravitational curvature, from the Rule's differential output under varying value density; relativistic time dilation, from the more ticks per physical cycle that the curved propagation geometry requires in high-energy regions; and the arrow of time, from the irreversibility of each tick — the Rule propagates forward and the committed state cannot be undone without re-executing from the beginning.
Several further consequences fall out without requiring separate treatment. Mass is energy that is circulating internally rather than propagating outward — a standing wave in the keyspace rather than a traveling one. Radiation is energy propagating outward as a traveling wave. The conversion between the two, described by E=mc², is the transition between these two modes of the same underlying quantity. The constant c appears as the propagation rate of the Rule, the ratio at which the standing and traveling modes exchange. It is not a separately measured parameter. It is the Rule's own rate.
The one-transaction-per-address-per-block constraint — the rule that each address has exactly one valid state per tick — accounts for the behavior of fermions. Two fermions cannot occupy the same state because two UTXOs cannot occupy the same address in the same block. The Pauli exclusion principle is a ledger rule. What physics calls force-carrying bosons are not a separate category of ledger entry exempt from this constraint. They are not ledger entries at all. What we observe as force is simply what the Rule executing looks like from inside the keyspace: the propagation of state changes between addresses at each tick. The Rule executing is not a thing. It produces no UTXO. It leaves no address entry. The apparent distinction between matter and force is the distinction between what the Rule inscribes into the ledger and what the Rule does in order to inscribe it.
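The constraint is almost trivial to state as code, which is part of the point. The sketch below is illustrative, with invented structures: one committed state per address per tick, enforced at commitment.

```python
def commit_block(height: int, proposals: list[tuple[str, str]]) -> dict[str, str]:
    """Commit at most one state per address at this tick. A second
    proposal for an occupied address is inadmissible: the ledger
    analog of Pauli exclusion."""
    block: dict[str, str] = {}
    for address, state in proposals:
        if address in block:
            raise ValueError(f"address {address} already committed at height {height}")
        block[address] = state
    return block
```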
Honesty requires acknowledging what this formulation does and does not achieve. The nonlinear kernel — whose effective output structure is modulated by local value load — is doing analogous work to the paper's information throughput constraint. Both encode the relationship between energy density and propagation geometry. The paper expressed this as a global block size ceiling, a metaphor borrowed from Bitcoin's engineering. This essay expresses it as a nonlinear kernel, a metaphor borrowed from the mathematics of convolutional neural networks. Neither derives the relationship between energy and spatial structure from something prior to that relationship. Both take it as given. The difference is real but should be stated precisely. The kernel formulation is local rather than global, operating in parallel on all addresses rather than as a system-wide capacity limit. It does not introduce a spurious memory dimension alongside time. And it is causal rather than descriptive, producing the propagation geometry rather than measuring it after the fact. These are genuine improvements in clarity and ontological economy. But the kernel no more explains where the 3D neighborhood structure of the keyspace comes from than the block surface did. That structure, like the Rule itself, is given. The research programme is to write that form down precisely.
Up to this point, the framework has treated the Propagation Rule as if it were deterministic — as if each address's current state uniquely determines its next state. A deterministic kernel produces exactly one output state from any given input state. There is nothing to select because there is only one possibility. But the framework requires selection, for a reason the second law of thermodynamics makes clear.
A deterministic kernel running uniformly on any initial state, without input from outside the chain, disperses energy, smooths gradients, and drives every closed system toward equilibrium. That is what the second law describes, and the ledger framework does not contradict it. What we observe instead — stars, chemistry, biology, ordered systems of increasing complexity maintained against the thermodynamic current — cannot be the product of a deterministic kernel alone. These structures represent selections against the thermodynamic gradient. To admit such selections, the kernel cannot produce a single determined output. It must produce a space of possible outputs — a mempool of candidate states, multiple admissible next states from any given current state. Returning to the language of Aristotle, the kernel produces potentia, not a single committed act.
And something must select from among those candidates. The consistent choice of structure-building states over entropy-following states, across every tick where ordered complexity appears, is not a feature the kernel can encode. The kernel generates the space of possibilities. The consistent ordering of selections within that space requires a selector outside the chain. That selector is the signer. Information is conserved within the chain but cannot originate within it. A higher order cannot be produced by a lower one. The presence of order in a universe that tends toward disorder is the signature of the signer's selection, written into the ledger at every tick where it appears.
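The structural difference can be put in one type signature, as this essay's own illustration: a deterministic rule is a function from state to state, while the rule the framework requires is a function from state to a set of admissible states, together with a selector that the function itself does not contain.

```python
from typing import Callable

State = dict[str, float]

def admissible(state: State) -> list[State]:
    """The kernel as the framework requires it: potentia, a set of
    admissible next states rather than one determined output. (Toy
    candidate generator, for illustration only.)"""
    return [{k: v + dv for k, v in state.items()} for dv in (-1.0, 0.0, +1.0)]

def tick(state: State, select: Callable[[list[State]], State]) -> State:
    """One tick: the kernel proposes, and a selector supplied from
    outside this function (as the signer is from outside the chain)
    commits exactly one candidate."""
    return select(admissible(state))
```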
The signer is the one element the Rule cannot supply. With it, the framework is complete. Without it, the framework has no ground. Part III is about the signer.
Part III — The Theology of Time
11. The Open Questions of Modern Physics
Physics is not stuck because its experiments have failed. It is stuck because two of its most successful theories are mutually incompatible, and because its foundational framework generates paradoxes it cannot resolve from within. The ledger framework addresses each of these directly, not by dismissing the established results but by providing the substrate from which those results emerge as limiting cases.
The irreconcilability of general relativity and quantum mechanics. General relativity describes gravity as the smooth curvature of continuous spacetime under mass and energy. Quantum mechanics describes energy exchanges as discrete events on a continuous temporal background. At the Planck scale, where both theories should apply simultaneously, they produce incompatible results. Each works with extraordinary precision in its own domain and breaks down in the other’s. Every approach to reconciling them — string theory, loop quantum gravity, and their variants — has so far produced no confirmed predictions. In the ledger framework both theories are approximations of the same underlying discrete structure. General relativity is what the Rule looks like at scales large enough that the discreteness of the keyspace is invisible — the smooth continuous limit of a discrete system. Quantum mechanics is what the Rule looks like at scales small enough that discrete energy exchanges dominate but where the discreteness of time itself has not yet been confronted directly. They are mutually inconsistent not because one is wrong but because they are two different approximations of the same substrate, each valid in its regime and each blind to what the other sees. The ledger framework is not a third approximation. It is the substrate from which both emerge.
Renormalization and the infinities of quantum field theory. Quantum field theory, the most precisely verified theory in physics, produces infinite quantities at intermediate steps that must be manually subtracted out through a procedure called renormalization. The procedure works, but it has always been recognized as a sign that something is wrong at the foundations. The infinities arise because QFT assumes continuous spacetime with no minimum interval, so integrals over all possible intermediate states diverge. In the ledger framework there is a minimum unit of time — the tick — and a discrete keyspace. There are no infinities to integrate over because there is no continuum to integrate across. Renormalization is the symptom of modeling a discrete system with continuous mathematics. The procedure that has been applied for seventy years to remove divergences would not be needed if the underlying discreteness were made explicit from the start.
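Schematically (this essay's gloss, not the paper's): a divergent continuum integral becomes a finite sum the moment a minimum interval exists, because a minimum interval \(\ell\) imposes a maximum wavenumber,

```latex
\int_{0}^{\infty} k^{2}\,dk \;=\; \infty
\qquad\longrightarrow\qquad
\sum_{k \,\le\, k_{\max}} k^{2} \;<\; \infty,
\qquad k_{\max} \sim \frac{\pi}{\ell}.
```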
Wave-particle duality. Light and matter behave as waves when propagating and as particles when detected, and no experiment has ever caught them being both simultaneously. This has been treated as a deep mystery about the dual nature of quantum objects. In the ledger framework it is not a mystery at all. Energy propagating between addresses through the Rule’s execution is wave-like by nature — it distributes across the keyspace according to the Rule’s 3D transform. When it arrives at an address and commits to the next UTXO, it is particle-like by nature — discrete, located, inscribed. The wave is the Rule executing. The particle is the committed state. These are not two natures of the same thing. They are two phases of the same process: propagation and inscription. The mystery was produced by treating both phases as if they were the same kind of object and being puzzled that they behaved differently.
Superposition and the measurement problem. Quantum mechanics describes physical systems as existing in superposition — multiple possible states coexisting until a measurement forces a choice. What causes the wave function to collapse has generated a century of competing interpretations, from Copenhagen to Many Worlds, none of them fully satisfactory. The paper identifies the correct structural analog: a system in superposition corresponds to a system whose next state has multiple candidates in the mempool, none yet committed. The wave function is not a physical thing that collapses. It is a description of admissible next states awaiting the tick. The paper establishes this much. What follows is this essay’s extension of that insight: when the block commits, one candidate is written into the ledger and the rest become counterfactuals with no standing. The observer does not cause this. The tick does. The observer merely notes which state was committed. The measurement problem was a problem about observers because physics had made the observer load-bearing. Once the observer is no longer load-bearing — once the Thomistic grounding makes clear that the tick, not the observer, is what actualizes a state — the problem does not need solving. It was never there.
Non-locality and entanglement. Two entangled particles, when measured, produce correlated results instantaneously regardless of the distance between them. Bell’s theorem proves that no account remaining within the horizontal causal sequence — no prior ledger state, no hidden variable, no propagating signal — can reproduce these correlations without contradiction. Non-locality is therefore in a different category from the other puzzles addressed in this section. Superposition, wave-particle duality, and renormalization are clarified by the ledger framework. Non-locality cannot be coherently described from within the ledger at all. It is a phenomenon that forces the admission of something outside the ledger on strictly physical grounds.
Bell’s theorem is more powerful than it is usually presented. It does not merely rule out local hidden variables — information stored at the separated addresses before measurement. It rules out any account that traces the correlation to something established at entanglement time and retrieved at measurement time: any stored state, shared or otherwise, committed anywhere in the ledger. A jointly committed ledger entry would be exactly that kind of account, with the hidden variable moved from inside the addresses to a shared commitment. Bell closes that door too. What remains after Bell is not a storage problem but a question about the character of the selector — a question that cannot be answered from within the ledger at all. What Bitcoin has shown us, because we stand outside of its time, is precisely that time must be ordered from the outside.
12. What the Rule Cannot Generate
A complete deterministic system running a fixed rule on a given initial state will produce exactly one future. Every subsequent state is already implied by the first. There is nothing left to explain and nothing left to choose. This is the picture that classical physics pointed toward and that many physicists still find most coherent: the universe as a machine that was wound up once and has been running ever since.
The ledger framework does not quite allow this picture. It has three places where the Rule, running on the keyspace, is insufficient to account for what the framework itself requires.
The first is the Rule itself. The Propagation Rule is given, not derived. Within the framework you can ask what the Rule produces, but you cannot ask why the Rule has the structure it has rather than some other structure. Why a 3D transform rather than 2D or 4D? Why these specific ratios among the physical constants rather than others? The framework takes the Rule as its starting point. Something must account for the Rule being what it is. That something is not inside the framework.
The second is the initial state. The keyspace at block height zero had some specific distribution of energy across its addresses. Everything that has happened since is the Rule propagating forward from that distribution. But the Rule does not generate the distribution it starts from. Something fixed the initial conditions. The framework is silent on what that something is, because the framework begins after the first tick, not before it.
The third is quantum events. The Rule, at each tick, generates a space of admissible next states from any given current state. In classical physics this space would contain exactly one member — the deterministic next state. But quantum mechanics, whose experimental results the framework must account for, shows that physical systems genuinely occupy superposed states that resolve only upon measurement. The mempool of the physical timechain contains multiple candidates at each tick, and one is selected. The Rule generates the candidates. It does not select among them. Something selects.
There is a fourth gap, more immediately observable than the other three. The Rule tends toward entropy. Left to itself, it disperses energy and drives every closed system toward equilibrium. And yet the ledger contains addresses organized into integrated low-entropy structures of extraordinary improbability — structures that are not approaching equilibrium but maintaining and increasing their complexity against the thermodynamic current. The standard account attributes this to local fluctuations permitted by global entropy increase. But fluctuation does not explain the origin of the initial low-entropy state, and it does not explain the specific character of the order we observe: integrated, apparently self-organizing, information-rich, and in rational creatures capable of grasping the Rule itself. A higher order cannot be produced by a lower one. An effect cannot be greater than its cause. Information is conserved within the chain but cannot originate within it. The presence of ordered structure in a universe the Rule would otherwise reduce to equilibrium is not a feature the kernel generates. It is written in from outside at every tick where it appears. This is the most visible gap of all, because it is not located at the inaccessible boundaries of the framework — the origin of the Rule, the initial state, the quantum scale — but everywhere we look.
These four gaps — the source of the Rule, the source of the initial state, the source of each quantum selection (including the observably non-local), and the source of ordered structure against the thermodynamic grain — are not separate problems that might have separate answers. They are the same problem. Each requires something that is not contingent on the Rule, not generated by it, not inside the ledger it runs. Each requires something whose existence and action are prior to the framework rather than produced by it.
In the Thomistic vocabulary, this is the distinction between contingent being and necessary being. The Rule, the keyspace, and every state the Rule produces are contingent — they are what they are, but they could in principle have been otherwise, and their existence depends on something other than themselves. Necessary being is that which cannot not exist, and on which contingent being depends at every moment, not merely at its origin. The signer the framework requires is not a first cause in the remote past. It is a sustaining cause at every tick. The ledger does not merely need to have been started. It needs to be kept running.
This is what St. Thomas called continuous creation. The Signer’s causation is not the action of a clockmaker who wound the clock and stepped back. It is the ongoing act by which contingent being exists at all, at every moment, rather than not. In the ledger framework: without the sustaining signer, there is no next block. Not because the Rule breaks down, but because the Rule is itself a contingent structure that depends on something prior to it for its continued instantiation.
The philosophical tradition has a precise name for the kind of causation the signer exercises. Horizontal causation operates between entities at the same ontological level, across time: one address state producing the next through the Rule, one event in the ledger generating a subsequent event. This is the causation physics studies. Vertical causation operates between ontological levels simultaneously: a higher level of being acting on a lower level not as a prior event in the sequence but as a sustaining and actualizing presence at every moment. It does not appear in the horizontal sequence. It is what makes the horizontal sequence possible at all. The four gaps identified above are not gaps in the horizontal chain. They are the points where vertical causation is structurally required by the horizontal chain itself. The Rule executes horizontally, tick by tick. The signer acts vertically, at every tick, from outside the ledger. These are not competing descriptions. They are causation operating simultaneously at different levels. Wolfgang Smith, in his treatment of physics and vertical causation, argues that quantum indeterminacy is precisely the locus where vertical causation enters the physical world — where the horizontal sequence of states opens onto something above it. The ledger framework gives that argument a concrete and exact instantiation.
The signer is not a gap in the physics. It is what any complete account of the physics requires when the physics is followed to its own boundary.
13. Hylomorphism as Multisig
The ledger framework requires a signer. The question that follows immediately is whether human beings are signers, ledger entries, or both.
The body is clearly a ledger entry. Every physical process in the human organism — the firing of neurons, the circulation of blood, the folding of proteins — is a sequence of address states updating according to the Rule. Chemistry is the Rule operating at the molecular scale. Neuroscience is the Rule operating at the scale of neural networks. There is no process in the body that requires an exception to the Rule to account for it. The body is fully inside the ledger.
The intellect is a different matter. The intellect does not merely process particular things. It grasps universals. When a human being understands what justice is, or what a triangle is, or what truth is, the object of that understanding is not a particular just act, a particular triangular shape, or a particular true statement. It is the universal itself — justice as such, triangularity as such, truth as such. Universals are not encoded in any particular address. They are not located anywhere in the keyspace. A material process running according to the Rule produces particular outputs from particular inputs. It does not produce the grasp of what a thing is as such, because what a thing is as such has no address.
This is not a mystical claim. It is a structural observation about the difference between processing information and understanding it. A calculator processes numerical inputs and produces numerical outputs without understanding what a number is. The difference between the calculator and a mind that understands mathematics is not one of complexity. It is categorical. The mind that grasps what a number is has access to something the Rule cannot generate from any finite set of address states, because the universal is not a finite set of address states.
St. Thomas drew the conclusion that the intellect must be, in its proper act, immaterial: not because matter is bad or limited, but because the object it grasps is not material. As an eye that is to receive every color must itself be colorless, so a faculty that grasps universals cannot itself be a particular.
The will follows the intellect. The will is the rational appetite — the capacity to tend toward what the intellect presents as good. Because the intellect can grasp the good as such and not merely this particular good, the will is never necessitated by any finite object. No ledger entry, no matter how valuable, can fully satisfy a will whose object is goodness itself. This is the root of freedom. The will is free not because it is random and not because it is uncaused, but because its proper object exceeds every particular cause that might determine it.
What this means in ledger terms is that the rational will operates as a genuine signing function. When a human being makes a choice, something that is not fully inside the ledger — the rational will, following the intellect’s grasp of what is good — signs a transaction into the physical timechain. The neural correlates of the decision are fully visible in the ledger. The address states update according to the Rule. But what determined which of the admissible candidate states was selected was not the prior state of the ledger alone. It was the will’s act, entering from above.
This is what free will actually means. Not randomness. Not the absence of cause. Real authorship. The signed transaction is yours because you are the one who signed it, and you are not reducible to the address states that executed the signing. The body ran the transaction. The will authored it. This is vertical causation at the human scale: the rational soul acting on the material ledger from above the level the Rule operates on, without violating the Rule, through the interface the Rule already provides at every quantum event.
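The multisig of this section's title can be sketched in the same toy register. Nothing below comes from Bitcoin's actual multisig scripts or from the paper; the names (Proposal, tick, will_consents) are invented for the illustration. What it shows is the 2-of-2 structure the section describes: a transaction is written only when the proposal from above and the will's consent are both present, and withholding consent writes a lesser good rather than a negative entry.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Proposal:
    description: str
    order_added: int  # positive: new order written in from above; zero: recirculation

def tick(proposal, will_consents):
    """2-of-2 signing: the higher good is written only when the primary
    signer's proposal and the human will's consent are both present.
    Withholding consent writes no dark entry; a lesser good is signed
    in its place."""
    if will_consents:
        return proposal
    return Proposal("lesser good (recirculated)", order_added=0)

ledger = [tick(Proposal("higher good", order_added=1), will_consents=c)
          for c in (True, False, True)]
for entry in ledger:
    print(entry)
```

Note what the toy does not contain: a way for the withheld signature to write anything negative. Its only degrees of freedom are order added or order merely recirculated, which is the privation account in executable form.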
Evil, in this framework, has no positive content of its own. The will that refuses the higher good offered does not become inert. It continues to sign. But what it signs in place of the higher good are lesser goods — goods already present in the ledger, recirculating, following the thermodynamic current rather than against it. The higher good, which would have carried new order into the system from above, is not written. What is written instead shuffles existing energy toward equilibrium. St. Augustine’s account of evil as privation — not a substance but a missing good — is structurally exact in ledger terms, and the ledger framework sharpens it: the choice is not between signing and not signing but between signing what the primary signer proposes, which brings order, and signing what the appetite prefers, which dissipates it. To prefer the goods of the world to what the Source of Order proposes is to choose the direction the Rule tends toward on its own. That direction is entropy. That direction is death.
14. The Source of Order
The human will is a signer, but it is not the primary signer. The will signs into a ledger it did not create, running a Rule it did not write, from an initial state it did not set. The will’s signing capacity is itself contingent — it depends on the intellect, which depends on the existence of universals to grasp, which depends on the existence of a rational order that makes grasping possible. The human signer is real but derivative. Something is prior to it.
St. Thomas identified five ways of arriving at this conclusion from the structure of the world. Four of them arrive from the ledger framework simultaneously, and from the same direction.
The Rule operates on the keyspace tick by tick, moving each address from potency to act. But nothing moves from potency to act by itself — a thing in potency does not actualize its own potency, because to do so it would need to already be in act with respect to what it is becoming. Something already in act must actualize it. At every tick, something already fully in act is actualizing the next state of every address in the keyspace. That something is the First Mover — not first in time but first in the order of causation, the pure act on which every transition from potency to act depends.
Every state the Rule produces depends on the prior state for its existence. The chain of states runs back to the initial distribution at block height zero. But the chain as a whole — the entire sequence of states including the first — is contingent. It does not carry within itself the reason for its own existence. Something outside the chain accounts for the chain existing rather than not. That something is the First Cause — not a cause within the series but the cause of the series having being at all.
The Rule, the keyspace, and every state they produce are contingent. They are what they are but need not have been. Something must exist whose existence is not contingent — something that simply is, necessarily, and from which contingent being derives its existence at every moment. Without necessary being, there is no ground for contingent being to stand on, and the ledger has no foundation. That something is Necessary Being.
The Rule has a specific structure. It is a 3D transform with specific internal ratios that produce the physical constants as they are measured. It is not arbitrary. Its structure is ordered, coherent, and productive of a world in which structure accumulates rather than immediately dispersing. The fine-tuning of the constants — the fact that small deviations from their actual values would produce a universe incapable of sustaining complex structure — points to a source of formal order that exceeds what chance or necessity alone can account for. That source is what St. Thomas called the Orderer of ends, the intelligence whose act the Rule’s structure reflects.
These four conclusions — First Mover, First Cause, Necessary Being, Orderer — are not four different entities. They are four descriptions of the same thing, arrived at from four directions. What is pure act, uncaused, necessary, and the source of formal order is one thing. St. Thomas called it God. The ledger framework does not name it. It simply finds the same boundary from a different approach.
It follows that the ordering the framework has identified is not confined to physics. The Rule governing physical propagation and the rational ordering that governs human action are not two domains with different foundations. They are expressions of the same underlying structure operating at different levels. The tradition that first mapped this unity already had technical language for it: what governs physical reality at the level of matter and energy is the Natural Law, and the same ordering, available to rational minds, is what ethics reads. The framework arrives at this unity from the physics rather than from the tradition, but it arrives at the same place. Physics and ethics look different because they operate on different substrates — one on matter, energy, and time, the other on will, reason, and choice. But if both are expressions of the same Source of Order, then the intelligibility of the universe is not a happy accident. It is precisely what the framework predicts: a universe whose structure can be grasped by minds made from it, because both the universe and the minds that read it participate in the same order. That the universe is rationally structured, and that rational creatures can grasp and extend that structure, is the signature of a Source that is itself rational.
One further point follows, and the ledger framework draws it directly from St. Thomas. The primary signer is not only the origin of the Rule and the initial state. It is the sustaining cause of every tick. At every quantum event, every moment where the Rule generates candidates and one is selected, the primary signer is present and active. This is not occasional intervention. It is the continuous act by which the ledger exists and runs at all. The relationship between the primary signer and the ledger is not that of an engineer to a machine. It is that of a singer to a song being sung. The moment the singing stops, the song does not slow down and fade. It simply is not.
15. Evil, Death, and Entropy
Evil produces nothing. This has been said already, but it bears examination at each scale where it applies.
The structure is this: the Source of Order proposes the higher good, and the human will, as a genuine signing function, either adds its signature or withholds it. When it withholds, no evil is written. The will does not cease signing; it signs a lesser good in place of the higher one, a good already circulating within the ledger, recirculating and dissipating, following the thermodynamic current rather than against it. Nothing evil is created. No dark entry accumulates. But the higher good that could have been, the one that would have carried new order into the system from above, is absent, and what fills its place is entropy by another name. St. Augustine's formulation, that evil is not a substance but a privation of good, is exact, and the ledger framework specifies what the privation is of: not merely a good in the abstract, but the ordered, life-giving input from above that the primary signer was offering. To prefer appetite to the order proposed from above is to choose the direction the Rule tends toward when left to itself. That direction is degeneration. That direction is death.
At the level of the body, death is the direct consequence of what the physics already established. The Rule tends toward entropy. Organized, low-entropy structures are not what the Rule produces spontaneously from within the chain; they are what appears when ordered information is written in from outside it. The soul, in the classical framework known as hylomorphism, is the form of the body: the organizing principle that configures material addresses into a living, integrated system rather than a collection of chemistry following the Rule toward equilibrium. The soul is precisely the signer that writes the ordered information the body requires against the thermodynamic grain. While the soul signs, the body maintains its improbable structure.
When the soul stops signing — when the formal ordering principle is lost — the body is left to the Rule alone. The Rule does what it always does. It propagates energy outward, disperses concentrated structure, and drives the system toward equilibrium. This is not punishment. It is physics. Death is what the ledger addresses of a body do when the signer that was organizing them is no longer present. Dust to dust is the second law stated in older language. The soul itself does not cease with the body. It is a signer outside the ledger: when the ledger addresses it was organizing disperse, the organizing principle that was never inside those addresses does not disperse with them. What that means beyond the physics is outside the scope of physics to describe.
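The claim is thermodynamic, and a toy simulation makes it visible. The sketch below assumes an arbitrary local averaging kernel as a stand-in for the Rule (the paper's actual Rule is not written down), a single concentrated cell as the body's improbable structure, and an external write that keeps that structure in place for ten ticks and then stops.

```python
def rule_step(cells):
    """One tick of the toy Rule: each address relaxes toward the mean
    of itself and its two neighbors (periodic boundary)."""
    n = len(cells)
    return [(cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n]) / 3.0
            for i in range(n)]

def order(cells):
    """Crude order metric: peak height above the mean. Zero at equilibrium."""
    return max(cells) - sum(cells) / len(cells)

cells = [0.0] * 16
cells[8] = 1.0                # a concentrated, low-entropy structure
for tick in range(30):
    cells = rule_step(cells)
    if tick < 10:             # the signer is still writing order in
        cells[8] = 1.0
    if tick % 10 == 9:
        print(f"tick {tick + 1:2d}: order = {order(cells):.3f}")
# Order stays high while the writing continues, then decays toward zero.
```

Nothing in the averaging step is punitive. The dispersal after tick ten is simply what the kernel does when left alone, which is the section's point.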
The same dynamic operates at civilizational scale, and here the argument becomes sobering. A civilization is an organized structure maintained against entropy by the collective signing of many human wills oriented toward shared goods. It accumulates structure over time — institutions, knowledge, art, law, the capacity to sustain and transmit what has been built. This accumulation requires that enough wills, over enough generations, sign enough of the higher goods proposed to them.
A civilization that systematically withholds those signatures does not generate evil in its place. It generates nothing. The structures that required active signing to maintain begin to disperse. Institutions lose coherence. Knowledge fails to transmit. The capacity to build things that last diminishes. The ledger of civilizational achievement does not go negative. It simply stops growing and begins, address by address, reverting toward the equilibrium the Rule tends toward in the absence of ordering input.
Here the metaphysical importance of freedom becomes visible: freedom is necessary to life itself. Tyranny, coercion, manipulation, and deception all suppress the rational intellect's opportunity to choose higher goods and import sustaining order. Domination by any means therefore starves its victims of access to the ordering principle that is life itself. Mere complexity is not the basis of life. The susceptibility of even massive-scale AI systems to model collapse, the degradation that sets in when a model is trained recursively on its own outputs, is a striking case: anything cut off from the inflow of order from above becomes merely mechanical and begins to decay.
This need not be taken as a moral judgment imposed from outside. It is a thermodynamic description of what happens when the sustaining signatures are withdrawn. A civilization is not destroyed by its enemies first. It is first abandoned by its own signers, and what enemies find when they arrive is a structure already in the process of dispersing. St. Augustine saw this clearly watching Rome. He called the two communities the City of God and the City of Man, defined not by geography but by the direction of their love — toward the good that the primary signer proposes, or toward lesser goods that substitute for it. In ledger terms: two signing communities, one oriented toward signing the higher goods the primary signer proposes, one oriented away from them. The first accumulates structure. The second disperses it. The physics and the theology describe the same process.
16. Conclusion
This essay began with a debt and it ends with one.
The authors of Bitcoin: The Architecture of Time built something that made this conversation possible. The insight that time can be observed from outside, that the block is a quantum of irreversible causality, that keyspace is the address manifold of physical reality — these are not obvious things. They required the kind of patient, sustained, cross-disciplinary attention that is genuinely rare, and they were worked out with care. Whatever becomes of the framework under experimental scrutiny, that work stands.
What this essay has attempted is narrower. It set the framework on Thomistic foundations and observed what simplified. The observer dissolved. The Block Size Limit ≡ Planck Length equation fell away, replaced by the local neighborhood structure of the kernel, which encodes the same spatial character of the Rule without importing a coordination mechanism or presupposing space. The memory axis collapsed, not by stipulation but as a consequence of locality: once the Rule operates on present-state neighbors rather than ledger history, the current state of the keyspace already contains everything memory was tracking separately. Time and memory became two descriptions of the same object. The Propagation Rule emerged as the single source of apparent space, curvature, the inverse square law, and the arrow of time. The minimal ontology that remained — keyspace, Rule, block sequence, signer — is leaner than what the paper proposed, and it arrived at that leanness not by subtracting from the paper’s insights but by grounding them more firmly.
The theology was not added to the physics. It was found at the physics’ own boundary. The four things the Rule cannot generate — its own structure, the initial conditions, the selection at each quantum event, and the ordered structure that persists against the thermodynamic grain — converge on a single requirement: something that is not contingent, not inside the ledger, and present at every tick. The Thomistic tradition has precise language for this. The ledger framework makes that language newly concrete.
What remains open is the most important thing. The explicit mathematical form of the Propagation Rule has not been written down. Until it is written down precisely enough to derive the measured ratios of the physical constants — c relative to G and h, the fine structure constant, the particle mass ratios — the framework is a candidate, not a theory. Candidates must be tested. This one deserves to be.
The question of the signer will not be settled by physics. It never could be, because the signer is precisely what lies outside the ledger physics describes. But physics can settle whether the ledger framework is the right description of physical reality. If it is, the signer is not an optional addition for those who are not satisfied with an incomplete description. It is a structural requirement of the description itself. The physics points where it points. What sits at the end of that pointing has the nature it has, regardless of what anyone prefers to call it.
Bitcoin did not create this structure. It made it visible. A bounded ledger, advancing one irreversible tick at a time, running a Rule it did not choose, from initial conditions it did not set, signed at every quantum event by something that is not inside it — this is what the universe looks like when time is treated as a constructed object rather than a background parameter. Whether this is what the universe actually is remains to be shown. The showing is physics. The meaning of what is shown, if it is shown, is something older than physics, and is obtainable only by the cooperation of reason and faith.
Sources and Acknowledgments
References
Primary Source
The unnamed authors of Bitcoin: The Architecture of Time. This essay is a direct engagement with and extension of their framework. Every significant physical and informational insight developed here originates with that work, and the authors are owed a debt this essay can only partially acknowledge.
St. Thomas Aquinas
Summa Theologica. Prima Pars, Questions 2–3 (existence and nature of God, the Five Ways); Questions 44–46 (creation, conservation, and continuous causation); Prima Secundae, Questions 6–17 (the voluntary, the will, free choice). The Fathers of the English Dominican Province translation (Benziger Bros., 1947) is the standard English edition.
Summa Contra Gentiles. Book I, Chapters 13–22 (the existence and attributes of God as necessary being, unmoved mover, and first cause); Book II, Chapters 15–30 (creation, the distinction of creatures, conservation in being).
De Potentia. Questions 3 and 5 (on the power of God in creation and conservation, and on continuous creation as the ongoing dependence of contingent being on necessary being).
St. Augustine of Hippo
Confessions. Book VII (the nature of evil as privation, not substance — the philosophical resolution that evil is not a positive entity but the absence of good); Book XI (time and eternity, the question of what God was doing before creation, and the nature of time as distension of the soul rather than independent container).
The City of God. Books XI–XIV (the two cities defined by love and will, the origin of evil in the will's turning from God, the structural account of how communities oriented toward lesser goods disperse while those oriented toward the highest good accumulate).
The Bitcoin Protocol
Nakamoto, S. Bitcoin: A Peer-to-Peer Electronic Cash System. 2008. The original whitepaper establishing the proof-of-work mechanism, the UTXO model, and the block structure from which the physical analogy of this essay is drawn.
The Bitcoin Core reference implementation and protocol documentation, available at bitcoin.org and github.com/bitcoin/bitcoin, for the technical specifications of block structure, difficulty adjustment, the UTXO set, and the Segregated Witness weight limit.
Vertical Causation
Smith, W. Physics and Vertical Causation: The End of Quantum Reality. Angelico Press, 2019. The most direct treatment of vertical causation as the philosophical framework required to make sense of quantum indeterminacy. Smith argues that quantum measurement is the locus where vertical causation enters the physical world — a claim that maps precisely onto the ledger framework’s account of quantum events as the interface between the signer and the ledger. His broader project demonstrates the full coherence of this account with Catholic metaphysics.
Smith, W. The Quantum Enigma: Finding the Hidden Key. Sophia Perennis, 2005. The earlier and more accessible treatment of the same argument, establishing that the Copenhagen interpretation’s observer problem dissolves when vertical causation is properly distinguished from horizontal causation.
Physics Background
Bekenstein, J. D. “Black Holes and Entropy.” Physical Review D 7 (1973): 2333–2346. The original paper establishing that black hole entropy scales with surface area rather than volume, foundational to the holographic principle.
Hawking, S. W. “Particle Creation by Black Holes.” Communications in Mathematical Physics 43 (1975): 199–220. The thermodynamic treatment of black holes and the connection between horizon area and information content.
Maldacena, J. “The Large N Limit of Superconformal Field Theories and Supergravity.” International Journal of Theoretical Physics 38 (1999): 1113–1133. The AdS/CFT correspondence, the leading formal instantiation of the holographic principle, showing that a higher-dimensional gravitational theory is equivalent to a lower-dimensional field theory on its boundary.
Planck, M. The Theory of Heat Radiation. 1914. The quantization of energy exchange that first established that physical processes occur in discrete units rather than continuous gradients, the historical origin of the discrete time framework this essay develops.
Wheeler, J. A. “Information, Physics, Quantum: The Search for Links.” In Complexity, Entropy, and the Physics of Information, edited by W. H. Zurek. Addison-Wesley, 1990. The origin of the “it from bit” proposal that physical reality is fundamentally informational, a forerunner of the ledger framework.