Many of the great advances in science are marked by the discovery that an aspect of nature we thought was fundamental is actually an illusion, due to the coarseness of our sensory perceptions. Thus, air and water appear to us to be continuous fluids, but we discover on deeper experiment that they are made of atoms. The Earth appears to us motionless, but a deeper understanding teaches us that it moves relative to the sun and the galaxy.
One persistent illusion is that physical objects only interact with other objects they are close to. This is called the principle of locality. We can express this more precisely as the law that the strength of the force between any two objects falls off quickly—at least as fast as some power of the distance between them. This can be explained by positing that the bodies do not interact directly, but only through the mediation of a field, such as an electromagnetic field, which propagates from one body to the other. Fields spread out as they propagate, with the field lines covering an ever greater area—providing a natural explanation for the laws that say the forces between charges and masses fall off as the inverse square of the distance between them.
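The geometric argument can be checked with a few lines of arithmetic: the field from a point source spreads over a sphere of area 4πr², so conserving the total flux forces the intensity to fall off as 1/r². A minimal sketch (the function name `field_strength` is just illustrative):

```python
import math

def field_strength(source_strength, r):
    # The field spreads over a sphere of area 4*pi*r^2, so by flux
    # conservation the intensity falls off as the inverse square of r.
    return source_strength / (4 * math.pi * r**2)

# Doubling the distance quarters the strength:
ratio = field_strength(1.0, 2.0) / field_strength(1.0, 1.0)
print(ratio)  # 0.25
```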
Locality is an aspect of an even more compelling illusion: that we exist within an absolute space, with respect to which we mark our positions as we move “through” it. Thus, Newton opined that motion is ultimately defined as change of position with respect to absolute space. If this seems obscure—because no measurement can establish a relation of a physical object to this imagined absolute space—Newton assured us that absolute space is seen by God, making your location relative to it an aspect of the divinity of the world. We humans must make do with relative positions and motions—which are defined relative to physical objects we can see.
Leibniz broke the mystification by declaring that all that exists is relative positions and motions. He proposed as a matter of principle that any acceptable science of motion must be formulated in terms of relative motions alone. And this, after two centuries of waiting, is what Einstein delivered to us in his general theory of relativity. In this glorious construction, space is subsumed into spacetime, which is explicable as a dynamically evolving network of relationships.
And what defines those relationships? Nothing but causality. The elements of spacetime are events—the ultimate expression of locality—and each of these is caused by events in its past. Each event will also become a cause of events in the future. Most of the information in the geometry of spacetime is actually a coding of the relations of causality that relate the events.
So, we see that the idea that physical forces must act locally is a consequence of a deeper principle, which is that physical effects are due to causal processes. And the basic principles of relativity theory insist that causes can only propagate through space at a finite speed, which cannot exceed the speed of light. We call this the principle of relativistic causality.
This principle would seem to be so natural that it must be true. But not so fast. Of all the strange aspects of quantum physics so far discovered, the strangest of all has to be the shocking discovery that the principle of relativistic causality is violated by quantum phenomena. Roughly speaking, if two particles interact and then separate, flying far apart from each other, they nevertheless may continue to share properties of a strange kind—properties that can be ascribed to the pair as a whole, without either individual particle having any definite properties of its own. We say the two particles are “entangled.”
When two particles are in such an entangled state, an experimenter can, it turns out, affect the properties of one of the particles, directly and immediately, by choosing to measure some particular corresponding property of the other. It matters not at all that such a direct influence would require a signal traveling much faster than light.
This has been shown in many experiments carried out since the 1970s, which test a notion of locality formulated by John Bell in 1964—and all the results show that entangled pairs violate that concept of locality.
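The quantitative content of Bell's result can be illustrated with the CHSH form of his inequality, due to Clauser, Horne, Shimony and Holt: any local theory of the kind Bell considered predicts that a certain combination S of measured correlations satisfies |S| ≤ 2, while quantum mechanics predicts a value as large as 2√2 for entangled pairs. A short sketch of the quantum prediction, using the standard singlet-state correlation E(a, b) = −cos(a − b):

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for spin measurements at angles
    # a and b on the two members of an entangled singlet pair.
    return -math.cos(a - b)

# The standard CHSH measurement angles (in radians):
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))        # ~2.828, i.e. 2*sqrt(2)
print(abs(S) > 2.0)  # True: exceeds the local-realist bound of 2
```

The experiments since the 1970s measure these correlations directly, and the data land near 2√2, just as quantum mechanics predicts.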
In its present form, quantum mechanics only predicts statistical averages for the outcomes of many kinds of experiments, including these. Consequently, it is not possible to use the nonlocality present in entangled pairs to send a signal faster than light. But many physicists, in an ambition going back to Einstein, de Broglie, Schrödinger and the other inventors of quantum mechanics, aspire to discover an improved version of quantum theory.
This would go deeper and replace the present statistical theory with a more complete theory, which would provide a complete and exact description of what goes on in every individual quantum process. For such a theory to work, it would have to be based on influences that travel faster than light, at arbitrarily great speeds, thus destroying the principle of relativistic causality as well as our intuitive notions of local influence.
Is such a more complete understanding of quantum physics possible? And, how are we to search for it? I believe it is not only possible but an inevitable next step in the progress of physics. I believe that the completion of quantum mechanics will be a major part of the resolution of another deep problem—that of unifying our understandings of gravity, spacetime and the quantum, to produce a quantum theory of gravity.
The reason is that there is good evidence that the quantum theory of gravity will itself engender big violations of locality. And, as Fotini Markopoulou and I first proposed in 2003, the violations of locality forced on us by quantum gravity are precisely what are needed to explain the nonlocality brought on by quantum entanglement.
If we are to have a complete physics, we must unify the geometrical picture of spacetime given by general relativity with quantum physics. There is some theoretical evidence that this project of making a quantum theory of gravity will require space and spacetime to become discrete and built out of finite atoms of geometry.
In the same sense that a liquid is just a description of the collective motions of myriads of atoms, space and spacetime will turn out to be just a way of talking about the collective properties of the large number of atomic events. Their constant coming in and out of being, causing the next ones as they recede into the past, makes up the continual construction of the world—also known to us as the flow of time.
The aim of a quantum theory of gravity is then first to hypothesize the laws that govern the elementary events, by which they continually come into being and then recede into the past. Then we must show how a large-scale picture emerges, in which these discrete events become subsumed in an emergent description of a smooth and continuous spacetime—as described by Einstein’s 1915 general theory of relativity.
Initially there is no space—just a network of individual elementary events, together with the relations expressing which of these were the direct causes of which other events. The notion of the flow of events collectively giving rise to a smooth description in terms of the geometry of a spacetime must emerge—and the most important aspect of this is locality. The notion of distance must emerge, and in such a way that those events that are close to each other are, on average, correspondingly more likely to have influenced each other. Getting this right is the holy grail of quantum gravity theorists.
Notice that if this is right, there are two notions of locality: a fundamental locality, which is based on the actual facts of which fundamental events were causes of which, and an approximate, collective, emergent notion of which events are near to each other in space and spacetime. The familiar macroscopic notion of distance is then based on a collective averaging of all the myriad of fundamental causal processes. To get a sense of how much is involved in this average, we expect that during each second there are around 10^120 elementary events happening within each cubic centimeter of space.
Indeed, one way to approach quantum gravity is to aim to derive the Einstein equations, which are the laws that general relativity applies to spacetime, from the laws of thermodynamics, applied to myriads of elementary events. This strategy was introduced by Ted Jacobson in 1995 in one of the few papers admired by quantum gravity theorists of all stripes.
But here we get a surprise and, quite possibly, an opportunity. For the collective, large-scale notion of nearness is only meant to correspond to the fundamental notion of causality when averaged over vast numbers of events. This gives the individual fundamental events and their causal relations a great deal of freedom to depart from the averages.
For example, let us pick just two elementary events, one in the cup of coffee you are now drinking and the other in a cup of whatever it is they drink on one of the planets of Proxima Centauri. These events may be separated by four light-years—but nothing prevents one from being an elementary cause of the other.
We can choose these two events so that they are nearly simultaneous as we (or the Proximas) measure time. So, to have one of these events be the cause of the other would violate the principles of Einstein’s theories of relativity. But there need not be a contradiction if we regard the laws of relativity as emergent regularities that govern the collective large-scale average. This is just how we regard the laws of thermodynamics as arising from averages over large collections of atoms, whose individuals follow different laws.
When a law emerges from a statistical averaging, there are always relatively rare events, in which individual atoms violate the rule that holds on average. We call these fluctuations. A good example is the tendency of collections of atoms, when cooled, to form regular crystal patterns. But from time to time an atom ends up in the wrong place, disrupting the beautiful symmetry of the crystal arrangement. We say the pattern has been disordered.
I can then summarize the story I’ve been telling by saying that when locality, and space itself, emerge from averaging over fundamental processes involving a myriad of individual events, it is inevitable that locality will be disordered. Mostly, influences will be local, because most of the time, causally related events will end up close to each other in the emergent rough description we call space. But there will be many pairs of events that are causally related yet end up far from each other—thus disordering space and locality.
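As a toy illustration (not the actual models Markopoulou and I studied), one can picture the events as nodes on a ring, with almost all causal links joining near neighbors and a rare sprinkling joining distant points. The long links disorder locality while barely shifting the average link length, which is why the emergent notion of distance survives:

```python
import random

random.seed(1)
N = 1000  # "events" arranged on a ring, so distance is well defined

edges = []
# Mostly local causal links: each event influences its neighbor.
for i in range(N):
    edges.append((i, (i + 1) % N))
# A rare few nonlocal links connecting far-apart events.
for _ in range(5):
    i = random.randrange(N)
    edges.append((i, (i + N // 2 + random.randrange(100)) % N))

def ring_distance(i, j, n=N):
    # Shortest separation between two points on the ring.
    d = abs(i - j)
    return min(d, n - d)

lengths = [ring_distance(i, j) for i, j in edges]
print(max(lengths))                 # the longest links span hundreds of steps
print(sum(lengths) / len(lengths))  # yet the average stays small
```

In such a network almost every probe of distance sees an ordinary local geometry, while the rare long links remain available to carry the correlations that, on the large-scale average, look like nonlocality.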
Could this disordering of locality serve to explain the quantum nonlocality inherent in entangled particles? I believe the answer is yes, and indeed we have shown that this is the case in two different models of fundamental completions of quantum mechanics.
The details are unimportant, especially at this early stage. But the takeaway lesson is that the intuitive idea that objects influence each other because they are close in space is soon to become another of those easy beliefs that turn out to be wrong when we look deeper. The smoothness of space is soon to become an illusion that hides a tiny and complex world of causal interactions, which do not live in space—but which rather define and create space as they create the future from the present.