“The difference between physics and metaphysics is not that the practitioners of one are smarter than the practitioners of the other. The difference is that the metaphysicist has no laboratory.” – Carl Sagan
Physics and philosophy are like cosmic yin and yang: complementary and complete, our universe their union. They occupy distinct spaces: physics is the realm of empirical inquiry; metaphysics is everything that is not. Metaphysics takes over where physics leaves off. Or so it would seem. Nature, it turns out, might not be so black and white.
The grey twilight of reality, which I will call the physics-metaphysics gap, might remain murky, forever evading our understanding. What exactly is this gap, and how does it come about?
Putting the Meta in Metaphysics
Language is fickle: richly expressive, yet inherently beleaguered by misunderstanding and miscommunication. The mere use of language changes sentiments; context and history, homophone and homonym, imbue words with new meaning. Accidental connotations accrue, eroding our appreciation of the original usages. The linguistic goal of clarity is hopelessly Sisyphean. The word ‘metaphysics’ is subject to the same unstoppable forces, and as its meaning changes, so too does its domain of discourse.
Yet this property of language is as much a feature as a bug. By leveraging these linguistic developments, we can dig more deeply into questions that the original meaning penetrates less convincingly.
“Physics” derives from the Greek ta physika, “the natural things”. The prefix “meta”, meaning “beyond” or “after”, is delightfully deceptive in its etymological origin. Indeed, “metaphysics” first appeared as the title of a sequence of Aristotle’s works published after his treatise on physics. Ironically, at least from a modern point of view, the “meta” in metaphysics was meant literally!
That same modern perch, however, invites a more contextual interpretation. Considering only the title’s explicit semantics does a severe disservice to Aristotle, to Andronicus of Rhodes – the editor of Metaphysics credited with its naming – and to the influence Aristotle’s works have had in shaping our society. Aristotle’s works on metaphysics were not haphazardly placed after his works on physics. Aristotle regarded physics as the study of change, and “first philosophy”, the subject matter of his metaphysics, as the study of that which persists through change.
Quite consciously, it seems, he believed that one must understand change before beginning to grasp the constant. In this sense, at least, metaphysics at its core was meant to represent something beyond physics.
Remarkably, even as the mechanics of our physical theories have evolved, and the meaning of the word “physics” itself has shifted, much of Aristotle’s original distinction remains. “Physics” still refers to the study of the natural world, but it has also come to denote our investigation of the physical world through the scientific method. Prediction and probe, hypothesis and revision, are the central tenets of modern science.
This science is intimately connected to change. Newton’s laws relate changes in the motion of objects to the influence of forces. Schrödinger’s equation describes the evolution of quantum states in time. Even Einstein’s famous equivalence between mass and energy concerns the conversion of matter into energy. The equations of physics do not make philosophical claims about what objects, quantum states, or matter and energy are. Of course, it’s not always easy to separate the physics from the philosophy. Metaphysical assertions about the nature of space, time, and existence can affect the way physicists test hypotheses. In turn, mathematical structures found to underlie physical constructs can force new philosophical interpretations.
Leaning into the linguistic conflation of physics and the science of physics, perhaps a modern metaphysics should concern itself with what is truly beyond physics as a science. What are the limits on what we can learn about our universe through physics?
This new line of thought aligns quite nicely with twentieth century developments in the usage of “meta”, where metalanguage arose as the study of language, and metamathematics as the mathematical study of mathematics itself. For instance, whereas logic considers deductions from a logical theory, metalogic studies the truths derivable about logic systems themselves.
In the early twentieth century, mathematician David Hilbert set forth the goal of putting mathematics on a firm foundation by finding a finite set of axioms from which all known mathematical results could be proven. This movement, known as Hilbert’s program, was met with considerable optimism as the math community came together to achieve this grand goal.
The result was a crowning achievement indeed, but for metalogic rather than mathematics. In his celebrated incompleteness theorems, Kurt Gödel showed that any consistent, effectively axiomatized theory capable of expressing basic arithmetic cannot prove all arithmetic truths about the natural numbers – and, even more drastically, cannot prove its own consistency. In so doing, he demonstrated once and for all the limitations of logic.
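Stated a little more formally (a standard modern phrasing, not Gödel's original wording): for any consistent, effectively axiomatized theory $T$ strong enough to express basic arithmetic,

```latex
% First incompleteness theorem: some arithmetic sentence G_T
% is neither provable nor refutable in T.
\exists\, G_T :\quad T \nvdash G_T \quad\text{and}\quad T \nvdash \lnot G_T
% Second incompleteness theorem: T cannot prove its own consistency.
T \nvdash \mathrm{Con}(T)
```

No matter how we strengthen the axioms, a true-but-unprovable sentence always remains; the net of logic never closes.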
The Collider and the Computer
A mathematician and friend once told me in jest that “mathematics is about the math, but physics is about the physicist”. His sentiment rings astoundingly true. In the popular imagination, the physicist is an eccentric genius. Stories surround these physics folk heroes: Newton poking needles in his own eyes; crazy-haired Einstein using never-to-be-cashed checks as bookmarks; Feynman playing the bongos and picking locks. Yet there’s always a strand, implicit in this lore, connecting the quirky and unconventional with inspired creativity.
We’re made to believe that eventually the Theory of Everything will come to light, illuminated by another idiosyncratic individual who saw the world just a little bit differently. Creativity is the answer; mathematics the language.
But is creativity really the answer? If the truths derivable via logic are limited, then surely our ability to ascertain truths about the physical world is as well. As clever as humans are, our ability to explore the physics of our universe is still bound by the physics of our universe!
The culprit is the empiricism baked into the scientific method – the same empiricism that delineates physical from metaphysical inquiry. Falsifiability, first espoused by the philosopher of science Karl Popper, is one of the guiding principles of modern science. It holds that a hypothesis that cannot, even in principle, be proven false is not scientific, plain and simple.
This brings us to the physics-metaphysics gap: physical phenomena governing our universe that are not falsifiable by the science that we call physics. In our universe, the small fall within this gap.
According to our current understanding of physics, all matter is made of particles or particle-like objects which pop in and out of existence. The more fundamental a particle, the smaller it is and the more energy is required to create it. We directly test particle physics theories by engineering collisions predicted to create the hypothesized particles.
The history of particle physics has been one of scattering and collisions at ever-increasing energies. Demonstrating the existence of the W and Z bosons, and later the Higgs boson, required colliders of ever greater energy – culminating in the 17-mile-long Large Hadron Collider and perhaps the most tremendous global scientific collaboration to date.
But this strategy is inherently limited by the energy-length scale relation: if we wanted to directly probe potential ‘stringy’ behavior of quantum gravity, we’d need a collider the size of our galaxy! If we somehow managed to build such a collider and the results led us to hypothesize that strings were made of even smaller objects, then the corresponding collider would potentially need to be larger than our universe itself!
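The relation at work here is the basic quantum trade-off between a probe's energy and the length scale it can resolve. A back-of-the-envelope version (my own order-of-magnitude estimates, not figures from the text):

```latex
E \,\sim\, \frac{\hbar c}{\ell}
\qquad\Longrightarrow\qquad
\ell_{\mathrm{Planck}} \sim 10^{-35}\,\mathrm{m}
\;\;\text{requires}\;\;
E \sim 10^{19}\,\mathrm{GeV}
```

That is roughly fifteen orders of magnitude beyond the $\sim 10^{4}\,\mathrm{GeV}$ the LHC reaches, which is why the requisite collider balloons to galactic size.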
Just as astoundingly, we are also fundamentally limited in our ability even to ascertain the predictions of our physical theories. In quantum chromodynamics (QCD), the quantum theory of the strong nuclear force, predictions cannot generally be made analytically. The issue turns out to be eerily similar to the one that plagues our experimental probing capabilities – a problem of large energies. In this case, representing space or time as a continuum leads to infinite energies, and the theory is not mathematically well defined. Instead, spacetime is approximated as a discretized lattice.
This approach, known as lattice QCD, is well defined. To have bearing on the physical world, however, the lattice spacing must be taken toward zero, approaching the continuum limit. As the lattice spacing shrinks, the requisite compute quickly balloons. Even determining the precise predictions of QCD entails extrapolation and often multiple approximations, combined with world-class supercomputers.
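To get a feel for that scaling, here is a toy calculation (my own illustration, not a figure from the lattice-QCD literature): discretizing a fixed four-dimensional box, the number of lattice sites – and with it a crude lower bound on the compute – grows like (L/a)^4 as the spacing a shrinks toward the continuum.

```python
def lattice_sites(box_length_fm: float, spacing_fm: float) -> int:
    """Sites in a 4D hypercubic lattice of side box_length_fm (femtometers)
    with spacing spacing_fm. Real lattice-QCD cost grows faster still,
    but the site count alone makes the point."""
    sites_per_dim = round(box_length_fm / spacing_fm)
    return sites_per_dim ** 4

# Halving the spacing toward the continuum limit multiplies
# the site count by 2^4 = 16 each time.
for a in (0.2, 0.1, 0.05):
    print(f"a = {a} fm -> {lattice_sites(4.0, a):,} sites")
```

For a 4 fm box, spacings of 0.2, 0.1, and 0.05 fm already give 160 thousand, 2.56 million, and 40.96 million sites respectively – and each site carries its own field degrees of freedom.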
QCD doesn’t even account for gravity. Classically computing predictions from any theory of quantum gravity would likely be even more intensive, and would quickly become intractable!
Whether there is some ultimate Theory of Everything or it is ‘turtles all the way down’, there is a point beyond which we will be unable to directly probe the physics of our universe, and beyond which we will not even be able to determine what our physical theories predict about it.
One natural corollary is in order: we hold string theory in a certain disregard because its predictions are not falsifiable. Yet what we fail to acknowledge – at least explicitly – is that this problem is not inherent to string theory. Rather, it afflicts every theory at those length scales.
As crazy as this seems, it’s maybe even crazier to think that it didn’t have to be this way. Philosophically, there’s a distinction between necessary truths, which are true in every possible world, and contingent truths, which just happen to be true in our world. Whereas the incompleteness of logic is necessary, the existence of a physics-metaphysics gap is contingent on the particular physics of our universe.
If the basic laws governing our physics were different, it is quite possible that we could directly probe the fundamental elements of our universe through the scientific method. There is no a priori reason that energy had to be related to length scale, or that the length scales of the ‘smallest’ physical objects had to be so damn small. By similar logic, our physical theories could have yielded analytic – or at least computationally tractable – predictions. More generally, there is no reason energy, space, or time had to be relevant concepts in describing the natural world.
The Path Forward
So does all of this mean that our efforts are in vain? Should we abandon all hope?
Maybe we just need to go back to the drawing board. Metaphysically, maybe by slightly modifying our conceptions of space and time or by moving away from a particle-centric view of the universe, we can narrow or close the physics-metaphysics gap. Physically, perhaps we can find other, indirect methods of probing.
At a Quantum Gravity in the Lab workshop at Google X that I attended a few months ago, theorists and experimentalists from around the world came together to initiate a new sub-discipline of physics. The central thesis: rather than directly probe the small-scale phenomena of quantum gravity, we can use the quantum nature of quantum computers to simulate emergent, quantum-gravity-like behavior.
Matter in our universe must obey the physical laws of our universe. By manipulating matter so that its behavior is explainable in terms of emergent space-time dimensions, we can indirectly learn about the emergent space-time structure of our own universe. Such a strategy blurs the boundary between predicting and probing. And that might be exactly what we need.
The workshop’s principal paper (and the impetus for the workshop itself), Quantum Gravity in the Lab: Teleportation by Size and Traversable Wormholes, proposes a table-top experiment to ‘test’ teleportation. If we use a quantum computer to chaotically scramble an input message containing quantum entanglement, the initially surprising result is that at some point after the message decoheres, it comes back into focus.
As the authors argue, this can be understood in the context of quantum gravity as if the initial state consists of two entangled black holes connected by a wormhole. In this picture, the decoherence and re-coherence of the message is due to its ‘teleportation’ through the wormhole!
What justifies this picture? And what do we gain by taking this perspective? It turns out that this engages directly with one of the premier conjectures in quantum gravity:
ER = EPR
This pithy proposed equivalence states, in broad strokes, that quantum entanglement is intimately related to the structure of space-time. ER stands for Einstein-Rosen bridges, colloquially known as wormholes, and EPR (Einstein-Podolsky-Rosen) refers to a pair of entangled particles. The conjecture states that entangled particles are connected by wormholes.
By testing the refocusing of an entangled message, this experiment proposes to indirectly probe conjectures about theories of quantum gravity. If successful, such an experiment would use (emergent) Einstein-Rosen Bridges to help bridge the physics-metaphysics gap!
Whether or not this particular experiment proves useful, the larger lesson stands: we are going to have to be more creative in how we test physical theories. Indirect probing provides less convincing evidence and is beset by philosophical difficulties, prime among them the ‘theory-ladenness of observation’. But it might be our only hope. Is it enough?