Re-imagining Spacetime --
Is Physics only about Physics?

And do physicists own the sole rights to it?

Physics is the study of stuff that happens---or, at least, that's how it began. As its reputation deepened and its cultural baggage expanded, it became---like every other specialization---pickled in its own culture and norms, the preserve of whatever physicists consider their own territory. But physics is useful in other disciplines too: Information Technology, for one, though the language is not well adapted to it. Should physicists hold the sole rights to narratives that matter on every level? Do physicists get to decide on the boundaries of physics? My new book is about just this topic, and I argue that there is much to be learned in both directions by rethinking the hallowed ideas of space and time, and applying them to the modern world.

Time for a wider view?

As a physicist who has wandered between disciplines in a way that academia generally frowns upon, I've found great inspiration and solace in the methods and meta-principles of physics---and have had some success in applying them to the realm of modern computing. Many computer scientists balked at the audacity of trying to describe computing as physics; to me, that reveals a community trapped in a single-scale description of computer code and Turing Machine ideology. Many physicists, in turn, dismissed the effort as reasoning by analogy---a pitiful attempt at pedagogy by someone who strayed from the true path. What is discovered in physics is `objective truth', the thinking goes, and everything else should be mere application of it. But the professional snobbery that abounds in science overlooks an important opportunity: to challenge conventional ideas with new information, if only we look.

One area that raises red flags is the treatment of space, time, and scale. Those concepts impinge on just about every branch of science and technology, from language to rocket science---yet there is a need to address deep-rooted issues and consolidate a modern treatment, because they underpin so much of the way we frame phenomena. It's all a bit of a mess: even while physics claims some ownership of space and time, it treats them inconsistently across its subfields, and each subfield has its own normalized narrative. The culture of physics is probably to blame. In recent decades, the `industrialization' of physics methodology has elevated productivity over critical thinking. We have come to believe in our own mathematical descriptions as the template for nature, rather than merely a helpful language for telling its stories. The original motivation for the mathematics sometimes gets swept away by the belief that maths doesn't lie.

What if we treat spacetime as a network?

What if space is not like the Euclidean ideal we learn in mathematics, but more like a network? What happens to the picture of space and time that describes interactions as you shrink those interactions? What happens to the ability to measure things as you shrink or expand an observer? Why does quantum mechanics use complex numbers? Is that a kind of mystical twist on reality, or simply a sign that we left out some implicit degrees of freedom in spacetime that happen to matter on a small scale?
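
To make the network idea concrete, here is a toy sketch (purely my own illustration, with a made-up adjacency table, not anything from the book): if space is a network, then `distance' is a count of hops between neighbouring nodes, derived from connectivity alone rather than from coordinates.

    from collections import deque

    # Hypothetical toy graph: each node simply lists its neighbours.
    space = {
        "A": ["B", "C"],
        "B": ["A", "D"],
        "C": ["A", "D"],
        "D": ["B", "C", "E"],
        "E": ["D"],
    }

    def hop_distance(graph, start, goal):
        """Breadth-first search: distance is the smallest number of
        adjacency steps, so the geometry comes from connectivity alone."""
        seen = {start}
        queue = deque([(start, 0)])
        while queue:
            node, dist = queue.popleft()
            if node == goal:
                return dist
            for neighbour in graph[node]:
                if neighbour not in seen:
                    seen.add(neighbour)
                    queue.append((neighbour, dist + 1))
        return None  # no path means no defined distance at all

    print(hop_distance(space, "A", "E"))  # 3 hops, whatever coordinates we imagine

Nothing Euclidean is assumed here; the metric is an emergent property of the links, and it can change whenever the links do.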

These questions are incredibly important, and they are hardly new. Alas, they are not usually handled well by physicists, because matter and spacetime are treated as separate phenomena, and inconsistently to boot across the branches of physics. It's common to scale part of the picture without scaling all of it. I've seen how this causes distortions of understanding in the world of computers, and I've come to believe that we may be creating problems for ourselves artificially by clinging to convenient but inconsistent approximations as if they were truth. We treat the model of spacetime used in physics as fundamental, because we believe physics seeks the fundamental---but, in cold mathematical terms, a computer network is more elementary in its construction than the imagined `Euclidean manifold', with its smoothed-over continuum approximation, which forms the mathematical basis for all traditional techniques in physics.

Physics is not very good at updating its explanations of phenomena. Once a theory has been published, that particular telling of the story is usually held on to as gospel, unflinchingly, until the next great revolution.

One of the fundamental questions in physics is whether properties are invariant or altered when considered at different scales. Some phenomena are scale dependent---you can't scale biology to the size of a solar system or to the size of an atom. Others may exhibit a degree of scale invariance. But we apply these ideas piecemeal and inconsistently. I have never seen anyone apply this kind of scaling to relativity or to `the measurement problem'. In Quantum Mechanics, we only discuss decoherence as the disappearance of quantum truth into classical illusion. What if it's the other way around? Mismatched scales can lead to strange results.

Nothing can be completely scale invariant---universality eventually succumbs to specific representations, like atoms or subatomic particles. Can you really scale a ruler or a clock? Does it matter when you use a concept of time designed for planetary rotation on a process smaller than an atom? Yet this is what we do. How about a property like a commutator in quantum mechanics---does it scale? Surely the commutator is scale invariant? But what about the concepts of position and momentum themselves? Does it make sense to use a `quantization procedure' based on non-scalable concepts to dictate a theory on a smaller scale, where the concepts might not even be valid? Isn't that asking for trouble?
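
One small illustration of the tension (a standard textbook observation, offered as my own aside rather than an argument from the book): the canonical relation $[X,P] = i\hbar\,\mathbb{1}$ cannot hold exactly on any finite, discrete space of states, because taking the trace of both sides gives a contradiction:

    \mathrm{Tr}[X,P] = \mathrm{Tr}(XP) - \mathrm{Tr}(PX) = 0,
    \quad\text{whereas}\quad
    \mathrm{Tr}(i\hbar\,\mathbb{1}_N) = i\hbar N \neq 0 .

So the familiar commutator already presupposes an infinite, continuum-like space of states; it is not a concept that survives naive shrinking to a finite network of distinguishable states.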

Wave-particle brainlock

In popular---and even professional---writing about quantum physics, the conception of particles, waves, and quanta is mystified and muddled in a bizarre, almost irrational attempt to hold onto a false narrative. We continually ask: is matter particle or wave? Do we ask whether water is a wave or a molecule? What a ridiculous and artificial issue. The waves in Quantum Theory are no more fundamental than water waves. Waves are not elementary phenomena; they are large scale (non-local) distribution processes, carrying information about something underlying. Particles are localized information about singular observations. And quanta? They are countable processes. These are three utterly distinct ideas---is it really so hard to get this straight?
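
As a cartoon of that distinction (a toy simulation of my own, with a made-up intensity pattern, not real quantum mechanics): each detection is a single, localized, countable event, and the `wave' is only the large-scale distribution that the counts build up.

    import math
    import random

    def detection_weight(x):
        """A made-up interference-like intensity pattern, used only as a
        sampling weight for this toy example."""
        return math.cos(3 * x) ** 2 * math.exp(-x * x)

    def one_detection():
        """One localized, particle-like event, drawn by rejection sampling."""
        while True:
            x = random.uniform(-3.0, 3.0)
            if random.random() < detection_weight(x):
                return x

    # Quanta: countable events. Particles: each event is localized.
    events = [one_detection() for _ in range(20000)]

    # Wave: the distributed pattern that only emerges from many counts.
    bins = [0] * 30
    for x in events:
        bins[min(int((x + 3.0) / 6.0 * 30), 29)] += 1

    for i, count in enumerate(bins):
        print(f"{-3.0 + (i + 0.5) * 0.2:+.2f} | {'#' * (count // 100)}")

The point is only that `wave', `particle', and `quantum' label three different scales of description of one and the same process.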

In engineering, mechanical devices get built from the bottom up---from components on small scales to mechanisms on larger scales. Explanation doesn't always work like that, though, which distorts our views. We may need to revisit the way we approach issues of scale, especially around quantization, not only in physics but also in the "dataverse", i.e. the world of cloud computing, which is starting to look surprisingly similar to extreme physics.

Dynamical similarity

Isn't that just pseudo-science? The principle that similar things behave in similar ways is built into physics at every level---and yet it gets overlooked from time to time when phenomena seem to be pinned to a particular scale, and assumptions get built into our models of the world.

Physics apparently got lucky in applying some top-down `quantization' methods to QM. They work surprisingly well, but the current approach of identifying dynamical variables by matching to the Newtonian limit seems unlikely to have a future. Top-down thinking violates the principles of information scaling. Momentum is an inherently non-local concept, for instance, yet we treat it as local. Moreover, a vector field is an idealization that can't easily be implemented in a discrete space, yet these formulations are elevated above the physics today. Formalism before understanding---helpful for writing papers, but not for solving problems.
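
The non-locality of momentum is easy to see if you try to write it on a lattice (a standard central-difference discretization, included here only as my own illustration): the simplest momentum operator has to compare the state at neighbouring sites,

    (p\,\psi)(x) \approx \frac{-i\hbar}{2a}\left[\psi(x+a) - \psi(x-a)\right],

so momentum at a point is really information about a difference between points. The continuum derivative only hides that non-locality by letting the spacing $a$ go to zero.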

There is a catch, however: this is all built on trust. Einstein taught us that the correct way to understand something is to stick to what you can observe. You don't really know anything except what you can observe directly: you are free to speculate about what you can't see (as long as it doesn't contradict experiment), but the only account worth believing is the experiments themselves. He then went on to show that the processes of observation are fundamentally limited, and can lead us to believe an incorrect version of events unless we study the mechanics of observation too. For instance, when you see yourself in a distorting mirror, you might not want to believe what you see. And when you measure something through a magnifying glass, you might want to be careful where you place your ruler. If you can't trust your measuring apparatus---i.e. the processes you piggyback on to measure things---then you are stuck in a relativity catch-22.

Time's arms and hands race

You can't make a clock without implicating both space and time together (it's a process). So what kind of clock could an atom, an electron, or a photon claim to be using in order to obey the laws of physics during its interactions? If, at the subatomic level, you don't even have atoms with which to distinguish one moment from another, then what meaning would time have to an electron?

What Einstein effectively said was that time can only be trusted once you've decided on your ruler and clock; without a trustworthy measuring apparatus, we are back in that relativity catch-22. A careless application of relativity assumes that spacetime has infinite resolution, which is simply inconsistent. Even if you could find enough distinguishable states to build a clock, how do we know the rate at which it ticks is constant as far as the subatomic process is concerned (and does it matter)? All we could do would be to try to measure it with another clock---another process. So, in measuring change and defining time, we are doomed to race different processes against one another.
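
Here is what that racing looks like in the most stripped-down terms (a toy sketch of my own, with made-up tick processes): neither process carries an absolute rate; all we can ever record is how many ticks of one fit between ticks of the other.

    import random

    def make_process(mean_interval):
        """A 'clock' here is just a process that ticks after some activity.
        The intervals are random; nothing in this sketch can say which
        process is 'really' keeping proper time."""
        elapsed = 0.0
        while True:
            elapsed += random.expovariate(1.0 / mean_interval)
            yield elapsed

    def race(proc_a, proc_b, ticks=1000):
        """Count how many ticks of B occur between successive ticks of A."""
        b_time = next(proc_b)
        total = 0
        for _ in range(ticks):
            a_time = next(proc_a)
            while b_time <= a_time:
                b_time = next(proc_b)
                total += 1
        return total / ticks

    clock_a = make_process(mean_interval=1.0)
    clock_b = make_process(mean_interval=0.1)

    # The only measurable quantity is a ratio of one process to another.
    print("average B ticks per A tick:", race(clock_a, clock_b))

Change the rate of either process and only the ratio shifts; there is no third, god's-eye clock in the sketch to arbitrate, which is exactly the predicament of an observer inside spacetime.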

And now that issue is beginning all over again in the world of computers.

Look to the Internet---a semantic spacetime

I am one of a growing number who believe that we need to sort out the basics of space and time in order to make further progress in understanding systems---not just in elementary physics, but in every challenging area of dynamical and semantic systems (similar thoughts have been voiced by Carlo Rovelli, Gerard 't Hooft, and Philip Ball, to mention but a few). There is plenty of room in those fundamental concepts to accommodate the weirdnesses we covet in fundamental physics, and in modern technology. That means allowing for more of the degrees of freedom that phenomena entail across different scales. It turns out that we might be able to learn something about spacetime by studying computers instead of experiments at CERN.

There are many ideas to unpick here. No introduction could do more than scratch the surface---but I think it's important to bring the ideas to discerning audiences across physics, computer science, the social sciences, and beyond. To that end, I've written a book that pits these different viewpoints against one another. What if spacetime were a network? What happens if we try to scale measuring apparatus up and down? Do we recover Quantum Mechanics or Cognition? How does understanding dimension, propagation, and scale affect the semantics of perception in human and artificial intelligence? Physics tends to eschew semantics, but only by embracing functional distinctions can we hope to find a true Grand Unified Theory of the world.

To avoid immediate controversy, I've focused on the practical issues of how we apply spacetime concepts, deliberately avoiding subjects like Loop Quantum Gravity or String Theory that take on some of these issues. No one knows the true nature of the physical universe---but there is a lot we can say from general principles, and it seems encouraging that a network viewpoint is capable of removing some of the annoyingly coveted mystery of phenomena in small-scale physics, as well as revealing how the equivalent phenomena are now appearing in technology.

Physicists may love or loathe the cross-disciplinary aspect of my book at first. It encroaches rudely on physics territory, to be sure---and it gives equal credence to processes on all scales. I don't hold to the doctrine that the quantum world is special or unique, just smaller. Having worked with computer systems for a quarter of a century, I've seen equally strange things happen with simpler explanations. Exciting possibilities lie in finding phenomena that can be exposed in computer systems, and then searched for in the subatomic world too.

[Get the book]

M.B. 4th March 2019