Curvature Amplification of Tracking Complexity on Statistical Manifolds
What's this about?
Why external observers cannot keep up with high-dimensional inference — a theorem, not a metaphor.
Mathematical companion to Intelligence as High-Dimensional Coherence.
Everyone knows negative curvature makes trajectories diverge. What's new here is turning "hard" into a rigorous impossibility:
- A theorem, not intuition — a lower bound on distinguishable states that no observer can beat
- The right epistemic target: when you don't know the direction of inference, only how far it could have gone, you must cover a ball, not follow a path. That's what makes it exponential (see the volume bound after this list).
- Applied to inference, not billiards — the manifold is the statistical manifold (Fisher metric), and the punchline is about tracking learning, not generic dynamical systems
- An actionable diagnostic — the effective tracking dimension tells you exactly when curvature overwhelms nominal dimension
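Concretely, the exponential cost comes from volume growth under negative curvature. In the constant-curvature model case (an $n$-manifold of sectional curvature $\kappa < 0$), the standard volume-comparison argument gives, in schematic form (the paper's exact constants may differ),

$$
N(\epsilon, B_r) \;\ge\; \frac{\operatorname{vol}(B_r)}{\operatorname{vol}(B_\epsilon)} \;\gtrsim\; \exp\!\big((n-1)\sqrt{|\kappa|}\,(r-\epsilon)\big),
$$

so the metric entropy $\log N(\epsilon, B_r)$ grows linearly in the radius $r$, whereas on a flat space $N(\epsilon, B_r) \asymp (r/\epsilon)^n$ and the entropy grows only logarithmically in $r$.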
Most common parametric families (Gaussian, Dirichlet, beta) carry a negatively curved Fisher metric. This means external monitoring of inference over these models hits a geometric wall — not because of implementation complexity, but because of mathematical inevitability.
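As a concrete check of the Gaussian case, here is a minimal SymPy sketch (illustrative, not the paper's code). The Fisher metric of $N(\mu, \sigma^2)$ in $(\mu, \sigma)$ coordinates is $g = \operatorname{diag}(1/\sigma^2,\, 2/\sigma^2)$, and its curvature comes out to the constant $-1/2$:

```python
import sympy as sp

mu, sigma = sp.symbols("mu sigma", positive=True)

# Fisher information metric of N(mu, sigma^2) in (mu, sigma) coordinates.
E = 1 / sigma**2   # g_{mu mu}
G = 2 / sigma**2   # g_{sigma sigma}

# Gaussian curvature of an orthogonal 2D metric ds^2 = E dmu^2 + G dsigma^2
# (Brioschi formula specialized to diagonal metrics):
#   K = -1/(2*sqrt(E*G)) * [ d/dmu(G_mu/sqrt(E*G)) + d/dsigma(E_sigma/sqrt(E*G)) ]
sqrtEG = sp.sqrt(E * G)
K = -(sp.diff(sp.diff(G, mu) / sqrtEG, mu)
      + sp.diff(sp.diff(E, sigma) / sqrtEG, sigma)) / (2 * sqrtEG)

print(sp.simplify(K))  # prints -1/2: constant negative Fisher curvature
```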
Why it matters
If you're trying to monitor, interpret, or externally verify an inference process:
- You cannot keep up past a curvature-dependent horizon, no matter how clever your probes
- Adding more resources helps linearly, not exponentially; you can't outrun the geometry (see the sketch after this list)
- This is why AI interpretability is hard even for "simple" models — it's not engineering difficulty, it's geometric impossibility
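A toy numeric sketch of the linear-versus-exponential contrast (the scalings follow from the covering bound above; the parameters `n`, `eps`, and `kappa` are illustrative assumptions, not values from the paper):

```python
import math

def max_radius_flat(bits, n, eps):
    # Flat space: an eps-cover of B_r needs ~ (r/eps)^n cells,
    # so bits = n * log2(r/eps) and r grows exponentially in bits.
    return eps * 2 ** (bits / n)

def max_radius_hyperbolic(bits, n, kappa):
    # Curvature kappa < 0: metric entropy ~ (n-1)*sqrt(|kappa|)*r / ln(2) bits,
    # so the trackable radius grows only linearly in bits.
    return bits * math.log(2) / ((n - 1) * math.sqrt(abs(kappa)))

for bits in (10, 100, 1000):
    print(f"{bits:5d} bits: flat r ~ {max_radius_flat(bits, 10, 0.1):.3g}, "
          f"curved r ~ {max_radius_hyperbolic(bits, 10, -0.5):.3g}")
```

In units of $\epsilon$, each extra bit multiplies the flat-space radius by a constant factor but only adds a constant increment to the negatively curved one.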
The Intelligence paper claims external observers cannot track high-dimensional inference. This paper proves why — converting a qualitative observation into a theorem with the right scalings. The Observable Dimensionality Bound becomes a mathematical inevitability, not just an empirical pattern.
Key findings
- Rigorous lower bound: covering numbers grow exponentially in the uncertainty radius
- Metric entropy grows linearly in the radius $r$ (vs. logarithmically on flat spaces)
- Effective tracking dimension diverges as $r \to \infty$: the process looks infinite-dimensional to observers (see the definition after this list)
- Capacity threshold: the maximum trackable radius scales linearly in bits, not exponentially
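One plausible way to formalize the third bullet (an assumed definition consistent with the findings above; the paper's exact statement may differ): take the effective tracking dimension to be the entropy growth rate

$$
d_{\mathrm{eff}}(r) \;:=\; \frac{\log N(\epsilon, B_r)}{\log(r/\epsilon)}.
$$

On a flat space $d_{\mathrm{eff}}(r) \to n$, recovering the nominal dimension; with curvature $\kappa < 0$ the numerator grows like $(n-1)\sqrt{|\kappa|}\, r$, so $d_{\mathrm{eff}}(r) \to \infty$: past the curvature horizon the process is effectively infinite-dimensional to any external observer.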
Citation
Todd, I. (2025). Curvature Amplification of Tracking Complexity on Statistical Manifolds. Information Geometry (in preparation).
Workflow: Claude Code with Opus 4.5 (Anthropic) for drafting; GPT-5.2 (OpenAI) for review. Author reviewed all content and takes full responsibility.