Why Dimensional Collapse Quantizes
Gravity, language, and the lattice at the bottleneck
January 3, 2026
You can't say a thought that has no words.
Your brain operates in high-dimensional state space — millions of neurons, continuous dynamics, superpositions of intention. But to communicate, all of that must squeeze through language: a discrete vocabulary, sequential words, finite bandwidth.
The vocabulary is a lattice (in the loose sense: a discrete codebook with adjacency and grammar constraints). It's pre-existing structure in the low-dimensional output space. Your thought doesn't get "translated" — it gets selected by what the lattice admits. The continuous becomes discrete not because thought is discrete, but because the bottleneck has structure.
This is quantization. And the conjecture explored here is that something analogous happens in gravity.
(This post is deliberately schematic. The intuition is real; the question is how much of it survives formalization.)
Why Low-Dimensional Projections Force Discreteness
Two things combine when you project from high to low dimensions:
1. Collisions. Many distinct high-D states map to the same low-D point — forced overlap, aliasing. You cannot represent continuous dynamics faithfully. Instead, you get equivalence classes: groups of high-D states that are indistinguishable from the low-D side.
2. Pre-existing structure. The low-dimensional space isn't a blank canvas. It already has geometry, grammar, a lattice of admissible configurations. New information must be compatible with this structure. The lattice selects which equivalence class you land in.
Together: dimensional collapse forces collisions, and the lattice resolves collisions by selecting admissible outputs. That's quantization. Discreteness is imposed by the interface, not by the high-dimensional system itself.
This much is straightforward — it's how analog-to-digital conversion works, how language discretizes thought, how any lossy projection through structured channels operates. The speculative step is applying it to gravity.
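As a concrete non-gravitational instance, here is a minimal Python sketch of the collision-plus-lattice mechanism: random high-dimensional states are projected to a low-dimensional interface and snapped to the nearest entry of a fixed codebook. All dimensions and names are illustrative, not from any specific model.

```python
# Toy sketch: quantization as projection through a structured bottleneck.
import numpy as np

rng = np.random.default_rng(0)

# Pre-existing structure: 8 admissible outputs ("words") in a 2-D space.
codebook = rng.normal(size=(8, 2))

# Fixed projection from a 50-D "internal state" down to the 2-D interface.
P = rng.normal(size=(50, 2)) / np.sqrt(50)

def emit(state):
    """Project a high-D state, then snap to the nearest codebook entry."""
    low = state @ P                                   # dimensional collapse
    return int(np.argmin(np.linalg.norm(codebook - low, axis=1)))  # lattice selects

# Many distinct high-D states...
states = rng.normal(size=(1000, 50))
outputs = np.array([emit(s) for s in states])

# ...collapse onto at most 8 equivalence classes: forced collisions.
print(len(states), "states ->", len(set(outputs.tolist())), "distinguishable outputs")
```

The discreteness of `outputs` comes entirely from the codebook and the projection; the states themselves are continuous throughout.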
Gravity as a Possible Instance
In 1995, Ted Jacobson showed that Einstein's equation — the governing law of general relativity — could be derived from thermodynamics applied to local null surfaces. (A null surface is a boundary that light rays trace out — the edge of a light cone, or an event horizon.) Heat flow across such a boundary, combined with the entropy-area relationship from black hole physics, gives you gravity as an equation of state.
This is an established result, not speculation. But it left a question: if gravity is thermodynamic, what should be quantized to get quantum gravity?
The standard answers — quantize the metric, or posit horizon microstates — have struggled for decades. Here's an alternative framing:
What if the null surface acts as a bottleneck?
The idea: matter dynamics are high-dimensional. The null surface is a low-dimensional interface where those dynamics get projected. The existing metric structure plays the role of the lattice — it determines which deformations are admissible.
If this analogy holds, quantization would appear at the interface for the same reason it appears in language: the lattice has finite resolution, and compatibility with existing structure forces discreteness.
This is a reframing of where quantumness lives in Jacobson-style thermodynamic gravity — not a derivation. The null-surface-as-bottleneck idea is suggestive, but calling it a "bottleneck" in the dimensional-collapse sense requires showing that the projection from bulk dynamics to surface data actually has the collision-plus-lattice structure described above. That's an open problem.
The Clausius Residual: Measuring the Gap
To make the bottleneck idea more concrete, consider what happens when projection through a structured interface is imperfect.
Classical thermodynamics gives us the Clausius inequality, which is saturated for a reversible process: heat flow and entropy change are perfectly matched, δQ = T δS. Define the Clausius residual as the local mismatch:

ℛ ≡ δQ − T δS
Operationally: you measure the energy flux (δQ) across a local patch of null surface and compare it to the entropy change (δS) that the surface geometry records. The residual δQ − T δS is whatever the surface couldn't account for — the gap between what arrived and what got registered.
In classical general relativity, this residual vanishes on average. Perfect accounting. Every bit of energy flux is reflected in geometry.
The conjecture is that in quantum gravity, this residual fluctuates. The interface has finite resolution — it can't certify arbitrarily fine distinctions — so the residual fluctuates about zero: its mean vanishes, but its variance does not.
The analogy to language: the residual is what you couldn't say. It plays the role of a conditional entropy — everything in the high-dimensional input that didn't survive projection to the low-dimensional output. Classical GR is perfect translation. Quantum gravity, in this picture, is the acknowledgment that some information is lost at the bottleneck.
Whether this is the right way to formalize the idea is genuinely open. The residual is well-defined; the claim that its fluctuations generate quantum gravitational effects is the conjecture.
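A toy numerical sketch of the definition, in entirely made-up units: classically the surface records dS = dQ/T exactly, so the residual δQ − T δS vanishes; give the interface a finite recording resolution (the rounding step `eps` below is an illustrative stand-in for the lattice) and the residual fluctuates about zero. This illustrates the definition only; it derives nothing about gravity.

```python
# Toy sketch of the Clausius residual R = dQ - T*dS (arbitrary units).
import numpy as np

rng = np.random.default_rng(1)

T = 2.0                                   # patch temperature (arbitrary)
dQ = rng.normal(1.0, 0.1, size=10_000)    # incoming energy flux per step

# Classical GR: geometry registers every bit of flux, dS = dQ/T exactly.
dS_classical = dQ / T
R_classical = dQ - T * dS_classical       # identically zero: perfect accounting

# Finite-resolution interface: entropy is recorded on a lattice of cells
# of size eps, so dS snaps to the nearest admissible value.
eps = 0.05
dS_quantized = np.round((dQ / T) / eps) * eps
R_quantized = dQ - T * dS_quantized       # fluctuates about zero

print("classical residual std:", R_classical.std())
print("quantized residual mean:", R_quantized.mean())
print("quantized residual std:", R_quantized.std())
```

The mean of the quantized residual stays near zero (the lattice makes no systematic error here), while its spread is set by the cell size — the "vanishes on average, fluctuates locally" behavior the conjecture describes.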
What About Planck's Constant?
It's tempting to say: if the null surface is a lattice with finite resolution, then ħ sets the lattice spacing. And indeed, a natural area scale exists: ℓ_P² = Għ/c³, the Planck area. Combined with the Bekenstein-Hawking relation (S = A/4ℓ_P², in units of k_B), this gives an entropy of order one per cell of area 4ℓ_P² — roughly one bit of resolution.
But there's a circularity problem. We're using ħ to define the lattice spacing, and then claiming the lattice spacing explains quantization. That's not an explanation of ħ — it's a consistency check. The framework is compatible with ħ setting the resolution, but it doesn't derive ħ from something more fundamental.
An honest version of the claim: if null surfaces act as structured bottlenecks, then the Planck area is the natural scale at which the bottleneck's discreteness becomes relevant, and ħ parameterizes that scale. This is a reinterpretation, not a derivation. Deriving the value of ħ from the bottleneck structure would require a theory we don't have.
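The consistency check itself is elementary arithmetic. A sketch, using standard SI values for G, ħ, and c; the Bekenstein-Hawking relation S = A/(4ℓ_P²) (in units of k_B) is the established input here, not anything new:

```python
# Consistency check, not a derivation: the Planck area, and the
# Bekenstein-Hawking entropy of a patch of horizon in units of k_B.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34     # reduced Planck constant, J s
c = 2.998e8          # speed of light, m s^-1

l_P2 = G * hbar / c**3          # Planck area, ~2.6e-70 m^2
print(f"Planck area: {l_P2:.3e} m^2")

# Entropy (in units of k_B) registered by a 1 m^2 patch of horizon:
A = 1.0
S = A / (4 * l_P2)
print(f"S(1 m^2) ~ {S:.3e} k_B")
# One unit of entropy per ~4 Planck areas: the "one bit per cell" scale.
```

Note that ħ enters only through ℓ_P² — which is exactly the circularity the paragraph above flags.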
Expected Correction Channels
If the interface has finite resolution, two types of corrections are expected:
1. Stochastic noise. The Raychaudhuri equation (which governs how bundles of light rays converge or diverge) would pick up a noise term from certification fluctuations — the interface can't decide perfectly, and that uncertainty shows up as random focusing of geodesics. The expected amplitude scales with the ratio of the Planck area to the patch area (ℓ_P²/A) — negligible at macroscopic scales, potentially significant near Planck-scale horizons.
2. Higher-curvature corrections. When the geometry changes faster than the interface can track (a non-equilibrium regime), the effective gravitational action would pick up curvature-squared terms. This is a dissipative lag — the interface can't keep up with rapid metric changes.
These are expected consequences of the bottleneck picture, not derived results. Both correction types already appear in other approaches to quantum gravity (stochastic gravity, higher-derivative gravity), so finding them here is encouraging but not decisive. The bottleneck framework would need to predict specific coefficients or relationships between the two channels to be genuinely testable.
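To illustrate the first channel only, here is a toy stochastic integration of the rotation- and shear-free Raychaudhuri equation, dθ/dλ = −θ²/2, with an added Gaussian noise term. The amplitude scaling σ = ℓ_P²/A is the bottleneck picture's assumption, not a derived result, and all numbers are illustrative:

```python
# Toy sketch: Raychaudhuri expansion theta with a noise term whose
# amplitude is assumed to scale as l_P^2 / A (Planck units).
import numpy as np

rng = np.random.default_rng(2)

def evolve_theta(theta0, sigma, dlam=1e-3, steps=2000):
    """Euler-Maruyama integration of d(theta) = -theta^2/2 dlam + sigma dW."""
    theta = theta0
    for _ in range(steps):
        theta += -0.5 * theta**2 * dlam + sigma * np.sqrt(dlam) * rng.normal()
    return theta

l_P2 = 1.0                        # work in Planck units
spreads = []
for A in (1e2, 1e4, 1e6):         # patch areas, Planck units
    sigma = l_P2 / A              # assumed noise amplitude
    finals = [evolve_theta(0.1, sigma) for _ in range(50)]
    spreads.append(float(np.std(finals)))
    print(f"A = {A:>9g}  spread in final theta: {spreads[-1]:.2e}")
```

The spread in the final expansion shrinks as the patch area grows — the "negligible at macroscopic scales" behavior — but nothing here fixes the coefficient, which is exactly the missing piece noted above.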
The General Principle
The part of this that's solid: quantization as dimensional collapse through structured bottlenecks. This is not speculative — it's how analog-to-digital conversion works, how measurement theory works, how any lossy structured projection operates.
The part that's conjecture: that gravity is an instance of this pattern, with null surfaces as the bottleneck and metric structure as the lattice.
The general pattern, where it applies:
- High-dimensional dynamics must pass through a low-dimensional interface
- The interface has pre-existing structure (a lattice or codebook)
- The lattice forces discreteness by selecting admissible outputs
- The resolution of the lattice sets the "quantum" of discreteness
The speculative extension: standard quantum mechanics treats quantization as ontological — the substrate itself is discrete. This picture proposes instead that the substrate can be continuous and high-dimensional. Quantization describes the interface, not the substrate. Discreteness is a property of the channel, not of reality.
Whether this reframing leads to new predictions or merely redescribes known physics in different language is the central open question. The language analogy is clear. Whether gravity actually works this way is not.
This post sketches an intuition, not a result. The full paper (in preparation) attempts to formalize the bottleneck structure, define the certification map precisely, and determine whether the correction channels are genuinely predicted or merely accommodated. The hard part is turning a good analogy into a falsifiable claim.