Imagine you are trying to park a unicycle. You need to slide it sideways into a tight spot along a curb, but the unicycle can only roll forward, roll backward, or turn. It cannot simply drift sideways. So you have to execute a sequence of turns and rolls, a kind of vehicular dance, to end up where you want to be. The harder the task, the longer and more convoluted the maneuver.
Now imagine that the city council decides to double the fine for turning. Does that make parking twice as hard? For a short parallel park, absolutely. But what about a very long journey that involves a lot of turning and rolling? Surprisingly, and this is the key insight from a remarkable piece of new physics, the answer at long distances is almost no. The fine for turning becomes irrelevant. The only thing that matters, eventually, is how far you need to travel.
This parking analogy is not just a whimsical warm-up. It sits at the heart of a deep and genuinely surprising paper published in Nature, one that connects the mathematics of quantum computers to the geometry of curved spaces, and ultimately to the physics of black holes.
The Complexity Problem at the Core of Quantum Computing
Quantum computers operate on a fundamentally different principle than classical computers. Where a classical machine processes bits, each of which is either a zero or a one, a quantum computer processes qubits, which can exist in combinations of zero and one simultaneously. To do anything useful, a quantum computer executes a sequence of operations, called quantum gates, that transform the state of its qubits step by step.
The number of steps required to perform a particular transformation is called its quantum complexity. It is a measure of how hard a computation is, how much work the machine needs to do before the job is finished.
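To make the step-counting concrete, here is a minimal sketch in Python (our own illustration, not taken from the paper): a two-qubit circuit built from a Hadamard gate and a CNOT gate, with complexity counted simply as the number of gates applied.

```python
# A minimal sketch (our own illustration, not from the paper): a
# 2-qubit circuit whose complexity we count as the number of gates.
import math

def apply(gate, state):
    """Multiply a 4x4 gate matrix into a 4-amplitude state vector."""
    return [sum(gate[i][j] * state[j] for j in range(4)) for i in range(4)]

s = math.sqrt(0.5)
# Hadamard on qubit 0, identity on qubit 1 (basis order |00>,|01>,|10>,|11>).
H_I = [[s, 0,  s,  0],
       [0, s,  0,  s],
       [s, 0, -s,  0],
       [0, s,  0, -s]]
# CNOT: qubit 0 controls, qubit 1 flips (swaps |10> and |11>).
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = [1, 0, 0, 0]      # start in |00>
circuit = [H_I, CNOT]     # the sequence of gates
for gate in circuit:
    state = apply(gate, state)

complexity = len(circuit)  # steps taken: this circuit's "cost"
print([round(z, 3) for z in state])  # [0.707, 0.0, 0.0, 0.707], a Bell state
print(complexity)                    # 2
```

Counting every gate as one step is itself a choice of definition, which is exactly the ambiguity discussed next.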
But here is a problem that may seem technical but turns out to be profound: there is no single agreed-upon definition of what counts as a step. Different researchers use different definitions. Some allow certain types of gate operations to be cheap, some make them expensive. Some count only two-qubit interactions, others allow more. The choice of what you decide to penalize changes the numbers you get.
This has always been a nagging worry at the foundations of quantum complexity theory. If the very definition of complexity is somewhat arbitrary, what does it actually mean to say a computation is complex? Is complexity a real thing or just a convention?
The new research, by a team affiliated with Google DeepMind and Stanford University, argues that this worry is substantially unfounded. Their claim: at high complexity, all reasonable definitions of quantum complexity agree with each other, up to a linear rescaling. Not merely up to some rough equivalence, but linearly. If one definition says a computation has complexity 1,000, another reasonable definition will say it has complexity around 2,000, or 500, but never a million or ten. The proportionality constant may vary, but the deep structure of complexity is the same regardless of how you measure it.
That is what it means, in this context, to say that quantum complexity is universal.
How Geometry Enters the Picture
The bridge from quantum operations to geometry is elegant and a little startling. Imagine the set of all possible states that a quantum computer can reach. This set is vast, an astronomically large space of configurations. Now think of each quantum operation, each gate, as a step along a path through this space. Moving from one configuration to another is like walking from one place to another on a landscape.
The key idea, developed over the past two decades, is to turn this analogy into mathematics. We can define a distance function on the space of quantum configurations, where the distance between two states reflects how hard it is to get from one to the other. Cheap operations correspond to easy directions to move in. Expensive operations correspond to costly directions. The complexity of a computation is then simply the length of the shortest path between two points in this landscape.
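One way to make that picture concrete is a toy model of our own (not the paper's construction): treat configurations as nodes in a graph, gates as weighted edges whose weights are the penalties, and complexity as the cost of the cheapest path between two nodes.

```python
# Toy model (our own construction, not the paper's): configurations as
# graph nodes, gates as weighted edges, complexity as the cheapest path.
import heapq

def complexity(graph, start, goal):
    """Dijkstra: total cost of the cheapest gate sequence from start to goal."""
    best = {start: 0}
    queue = [(0, start)]
    while queue:
        cost, node = heapq.heappop(queue)
        if node == goal:
            return cost
        if cost > best.get(node, float("inf")):
            continue
        for nxt, gate_cost in graph[node]:
            new = cost + gate_cost
            if new < best.get(nxt, float("inf")):
                best[nxt] = new
                heapq.heappush(queue, (new, nxt))
    return float("inf")

# Hypothetical 4-configuration machine: cheap gates cost 1, while the
# direct "hard" gate from A to D carries a penalty of 10.
graph = {
    "A": [("B", 1), ("D", 10)],
    "B": [("C", 1)],
    "C": [("D", 1)],
    "D": [],
}
print(complexity(graph, "A", "D"))  # 3: the indirect route beats the penalized gate
```

Changing the penalty on an edge the cheapest path never uses leaves the answer untouched, a miniature version of the washing-out described below.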
This geometry is not flat. It is curved and high-dimensional, and different choices of which operations are cheap or expensive correspond to different shapes of the space. Each definition of quantum complexity corresponds to a different geometry.
The question the researchers asked was: do these different geometries, these different curved high-dimensional spaces, give the same answers for the distances between far-away points?
A Lesson From Curvature
To build intuition, consider a simpler example the researchers use explicitly. Take a three-dimensional space and squash it, compressing some directions and stretching others until it becomes wildly distorted. How much has the distance between two far-away points changed?
The answer, surprisingly, is very little. At short scales, the squashing can change things dramatically. But at large scales, the distances become almost insensitive to the local distortions. The long-distance geometry decouples from the short-distance geometry. Two spaces that look wildly different up close look almost identical from far away.
This phenomenon has a name: long-distance universality. And it is not just a curiosity. It is, in fact, the foundational principle behind one of the deepest ideas in all of physics.
The Universality Principle That Physicists Already Know
Before going further into quantum complexity, it is worth appreciating how powerful the idea of universality already is in physics.
In the study of phase transitions, the precise way in which a magnet loses its magnetism near its critical temperature, described by a number called a critical exponent, does not depend on molecular details at all. Water, iron, and many utterly different materials share the same critical exponent. They belong to the same universality class.
The reason, as the Nobel laureate Kenneth Wilson showed in the 1970s, is that behavior near a critical point is controlled by large-scale fluctuations. Small-scale details wash out. You can change the atomic-level description of a material almost arbitrarily without changing its large-scale behavior.
The new paper applies exactly this Wilsonian perspective to the geometry of quantum complexity. The short-scale geometry, which depends on the exact choice of penalty factors, washes out at large distances. What survives is a universal long-distance structure.
Parallel Parking and the Cut Locus
Back to the unicycle. The researchers use this example not just as an analogy but as a worked-out mathematical case that illustrates the general principle.
When the cost of drifting sideways is high, the optimal parking strategy for a short parallel park is to just drift sideways, even though it is expensive. This means that at short distances, the complexity grows steeply with the penalty factor for drifting. Increase the fine, and short-distance complexity shoots up.
But past a certain distance, marked by what geometers call the cut locus, the optimal strategy changes. Instead of drifting, it becomes cheaper to execute a sequence of turns and rolls that indirectly achieves the same sideways displacement. The indirect strategy exploits the mathematical fact that turning and rolling do not commute: performing them in sequence and then undoing them does not bring you back to where you started, and the leftover motion points sideways. You turn, roll forward, unturn, and roll back. The net result is a sideways displacement proportional to the area enclosed by your maneuver.
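That maneuver can be checked with a few lines of toy kinematics (our own sketch, with arbitrary step sizes): rolling moves the unicycle along its heading, turning changes the heading, and the four-step sequence nevertheless produces a net sideways drift close to the product of the roll length and the turn angle.

```python
# A toy kinematic check (our own sketch) of the four-step maneuver:
# turn, roll, un-turn, roll back. Rolling moves along the heading,
# turning changes the heading; neither moves you sideways directly.
import math

def roll(pose, d):
    x, y, theta = pose
    return (x + d * math.cos(theta), y + d * math.sin(theta), theta)

def turn(pose, a):
    x, y, theta = pose
    return (x, y, theta + a)

d, a = 1.0, 0.1                      # roll length and turn angle (arbitrary)
pose = (0.0, 0.0, 0.0)
for move in (lambda p: turn(p, a),
             lambda p: roll(p, d),
             lambda p: turn(p, -a),
             lambda p: roll(p, -d)):
    pose = move(pose)

x, y, theta = pose
print(round(y, 4))      # 0.0998: net sideways drift, close to d * a
print(round(d * a, 4))  # 0.1: the "area" of the maneuver
```

The heading returns exactly to zero and the forward position nearly so; what survives is the sideways drift, which no single move could produce on its own.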
Once the indirect strategy takes over, the parking cost no longer depends much on how expensive drifting is. If you were not using drifting in the first place, making drifting even more expensive changes nothing. And at very large distances, the cost of parking grows linearly with distance at a rate that is completely independent of the penalty for drifting. All penalty factors above a certain threshold define the same long-distance behavior.
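A deliberately simplified cost model (our own toy numbers, not the paper's, and with the scaling details heavily simplified) shows the crossover: drifting costs the penalty per unit of sideways motion, while the best single turn-and-roll maneuver achieving drift y = d × a, at unit cost per unit of rolling and turning, works out to 4√y by minimizing 2d + 2a subject to d × a = y. The point to notice is that past the crossover, the penalty drops out entirely.

```python
# An illustrative cost model (our own toy numbers): compare drifting
# directly against the turn-and-roll maneuver for a sideways move y.
import math

def direct_cost(y, penalty):
    """Pay the sideways penalty for every unit of drift."""
    return penalty * y

def indirect_cost(y):
    """Best single turn/roll maneuver with drift y = d * a, at unit cost
    per unit of rolling and turning: minimizing 2d + 2a gives 4*sqrt(y)."""
    return 4.0 * math.sqrt(y)

def parking_cost(y, penalty):
    return min(direct_cost(y, penalty), indirect_cost(y))

# Short park: the penalty matters.
print(round(parking_cost(0.01, penalty=10), 3))   # 0.1  (drifting wins)
print(round(parking_cost(0.01, penalty=20), 3))   # 0.2  (double fine, double cost)

# Long park: the penalty washes out.
print(round(parking_cost(100.0, penalty=10), 3))  # 40.0 (indirect route wins)
print(round(parking_cost(100.0, penalty=20), 3))  # 40.0 (same cost, fine irrelevant)
```

In the real geometry the large-distance cost grows linearly by chaining many such maneuvers; this toy keeps only the feature the article emphasizes, that beyond the crossover the answer no longer depends on the penalty.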
The researchers prove this rigorously in multiple cases, and then argue, with supporting evidence, that the same phenomenon holds in the high-dimensional spaces relevant to quantum computing.
The Critical Schedule and the Universality Class
There is a special choice of penalty factors, which the researchers call the critical schedule, that plays a privileged role. It sits right at the boundary between penalties that shape the long-distance geometry and penalties that wash out. For the critical schedule, the cost of any quantum operation is exactly equal to the cost of achieving the same operation indirectly through cheaper gates. Nothing is unnecessarily expensive.
The critical schedule has a remarkably clean mathematical form. The penalty for a quantum operation that touches k qubits simultaneously should grow roughly as 4^(2k − 2). In other words, the cost grows exponentially with the number of qubits involved. This is not arbitrary: it reflects the fact that operations on many qubits become increasingly hard to build from simpler ones.
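As a sketch only, the schedule described here can be written out directly (echoing the formula exactly as stated in the text, with k the number of qubits an operation touches):

```python
# Echoing the schedule as stated in the text above (a sketch of the
# scaling only, not a derivation): penalty for a k-qubit operation.
def critical_penalty(k):
    """Penalty 4^(2k - 2) for an operation touching k qubits."""
    return 4 ** (2 * k - 2)

print([critical_penalty(k) for k in (1, 2, 3, 4)])  # [1, 16, 256, 4096]
```

Single-qubit operations cost 1, and each additional qubit multiplies the penalty by 16, so large operations are effectively priced out and must be assembled from small ones.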
For the critical schedule, the geometry of the complexity space has another special property: it has very low curvature. This matters because curvature is what causes the cut locus to appear early. High curvature means you hit the cut locus quickly, and beyond the cut locus the simple linear growth of complexity is replaced by a complicated regime where indirect paths are optimal. Low curvature pushes the cut locus far away, meaning the simple linear growth continues for a very long time, exponentially long in the number of qubits.
For the critical schedule, the geometry remains simple all the way out. For other schedules in the universality class, things are complicated at short distances but simplify at long distances to match the critical geometry. Any schedule that charges at least as much as the critical schedule for every operation is in the universality class. The researchers conjecture that within this class, all definitions of quantum complexity agree on the complexity of any sufficiently complex quantum operation, up to a linear proportionality constant. This is a much finer level of equivalence than anything previously established.
Two Spaces That Look Different But Agree Everywhere
Perhaps the most vivid illustration of the phenomenon is a mathematical example in the paper. Take a three-sphere, the surface of a four-dimensional ball, the higher-dimensional cousin of a beach ball. Stretch it in one direction by a factor of a thousand billion. This makes the stretched space radically different in its local geometry. Its volume, measured in terms of how many tiny marbles fit inside, becomes astronomically larger. Yet a careful calculation shows that the distances between any two points, computed in the stretched space and in the original space, differ by less than one picometre, roughly a hundredth of the diameter of an atom.
Two spaces that contain radically different amounts of volume, and that locally look nothing like each other, agree on every distance to better than one part in a trillion. The short-distance geometry and the long-distance geometry have completely decoupled. All the extra volume is hidden at scales too small to detect from far away.
This is the phenomenon at the heart of the paper, translated into the language of quantum complexity: many definitions of complexity that look different up close look identical from far away.
Why This Matters for Quantum Computers and Physics
The practical significance for quantum computing is this: researchers no longer need to worry that their results about quantum complexity depend on the exact definition they chose. Any reasonable definition, within the universality class, will give the same qualitative answers at large scales. This makes quantum complexity a genuinely robust concept rather than a convention-dependent one.
But the implications run deeper than quantum computing.
For the past two decades, there has been a conjecture in theoretical physics that the complexity of a quantum state is related to geometric properties of a black hole. Specifically, the volume of the interior of a black hole, the space behind its event horizon, grows over time. Long after the black hole has settled down and its other observable properties have stopped changing, this interior volume keeps growing. Physicists conjectured that this growth corresponds exactly to the growth of the quantum complexity of the black hole's quantum state.
There was always a nagging problem with this conjecture. On one side you have a geometric quantity calculated precisely using general relativity. On the other side you have quantum complexity, which in the conventional view was defined only up to polynomial equivalence, meaning values could differ by factors as large as the number of particles raised to some power. Claiming these two are equal seemed like a category error, like equating a precise measurement in meters to a quantity defined only within a factor of a million.
If the conjectures in the new paper are correct, this problem goes away. The universality of long-distance complexity means that the value of the complexity of a particular quantum state is robustly defined up to a linear proportionality constant, not merely a polynomial one. This is exactly the precision needed to make the black hole complexity conjecture meaningful. The researchers note that this could enable a rigorous formulation of holographic complexity, the idea that the geometry of the universe itself can be understood as an emergent property of quantum information.
A New Program for Mathematical Physics
The paper is also the opening move in a broader mathematical program. The researchers ask: given a high-dimensional curved space, what is its long-distance universality class? Two spaces belong to the same class if they agree on all sufficiently large distances. How many such classes are there?
These questions connect quantum complexity to a rich area of mathematics called coarse geometry, which studies the large-scale properties of spaces while ignoring local detail. The conjecture is that the geometries of quantum complexity form far fewer universality classes than the naive count of possible penalty schedules would suggest.
The Idea That Keeps Returning
Physics keeps discovering that details matter less than feared. The microscopic rules of iron and water give way to the same thermodynamics. Ultraviolet corrections to a quantum field theory give way to the same low-energy predictions. And now, the specific penalty factors in a quantum complexity calculation give way to the same long-distance complexity geometry.
In quantum computing, this means the question of how hard a computation is has a robust answer, independent of many choices we thought were arbitrary. That robustness is what allows physicists to conjecture that the volume of a black hole interior could correspond to something as abstract as the complexity of a quantum state.
Whether you approach the problem from geometry, circuit theory, or black hole physics, the long-distance universality of quantum complexity says the same thing: the details wash out, and the deep structure remains.
Just like parallel parking a unicycle.
Publication Details
Published online: 4 October 2023
Journal: Nature
Publisher: Springer Nature
DOI: https://doi.org/10.1038/s41586-023-06460-3
Credit and Disclaimer
This article is based on original research published in Nature by a team affiliated with Google DeepMind and Stanford University, with additional contributions from the University of California, Santa Barbara, and Princeton University. All scientific findings, conjectures, and conclusions presented here are drawn directly from the original peer-reviewed publication. This article is intended as an accessible introduction to the research for a broad audience and does not substitute for the original scientific text. Readers who wish to examine the full mathematical derivations, formal conjectures, supporting evidence, and complete discussion of implications are strongly encouraged to consult the original open-access research article.