--or How Quantum Theory May Sharpen the Blade of Occam's Razor

The 1999 movie 'The Matrix' explored a world where humans are plugged into a virtual reality. They go about their daily lives, unaware that the sensory inputs they receive do not originate from their perceived reality. When a person within the Matrix, Alice, observes a watermelon falling from a skyscraper, there is no skyscraper, nor watermelon, nor even gravity responsible for the watermelon's fall. Instead, a complex computer program works silently in the background. The initial state of the watermelon, and the location of the observer, are all encoded in bits. The computer takes these bits, processes them according to a predetermined algorithm, and outputs the electrical signals that dictate what the observer should see.
To those of us who live in the twenty-first century, whose lives are enmeshed in various information processors, the eventual plausibility of the Matrix does not appear as radical as it once did. One by one, the photos we view and the mail we send have been converted to digital form. Common questions, such as "How many megabytes does that song take up?", reflect a society that is becoming increasingly accepting of the idea that the observable qualities of every object can be represented by bits, and physical processes by how they manipulate those bits. Some scientists have gone so far as to speculate that we could live within a giant information processor, a giant 'Matrix', programmed to simulate the laws of physics we know.
If our observed reality were indeed a simulation constructed by some ultimate architect, what would be the underlying code of our universe? What sort of information processing would the architect use? Would it be merely classical logic on classical bits, or would it harness the unique properties of quantum logic? At first glance, such questions seem difficult to answer. After all, every observation we make lies within 'The Matrix'; how could we say anything about what lies beyond?
One way to approach this problem is to walk in the shoes of the architect. Suppose you were a young architect, tasked with simulating a simple reality: a universe consisting of a single observable bit evolving in discrete time steps, such that at each step the bit flips with probability 0.2 (see note (*) below for why I have chosen 0.2). Being a beginner, you are presented with two potential solutions:
(i) A system consisting of two binary coins. At each time step, the system sets the observable bit to 0 or 1 depending on whether the states of the two coins coincide, and one of the two coins is chosen at random and flipped with probability 0.2.
(ii) A system consisting of a single coin. At each time step, the system sets the observable bit to 0 or 1 depending on the state of the coin, and the coin is flipped with probability 0.2.
Either solution works: the output of either system behaves exactly according to the desired specifications. Yet, if tasked to select one of the two, most of us would prefer the second option. A solution with a single coin simply appears more appealing than one that requires two.
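To make the comparison concrete, here is a minimal sketch in Python (the function names are my own, not from any published code) that runs both simulators and checks that, in each case, the observable bit flips between consecutive steps roughly 20% of the time:

```python
import random

def two_coin_step(coins):
    """Solution (i): with probability 0.2, flip one randomly chosen coin."""
    if random.random() < 0.2:
        coins[random.randrange(2)] ^= 1
    # Observable bit: 1 if the two coins coincide, 0 otherwise.
    return coins, int(coins[0] == coins[1])

def one_coin_step(coin):
    """Solution (ii): with probability 0.2, flip the single coin."""
    if random.random() < 0.2:
        coin ^= 1
    # Observable bit: the coin's state itself.
    return coin, coin

def flip_rate(outputs):
    """Fraction of time steps on which the observable bit changed."""
    return sum(a != b for a, b in zip(outputs, outputs[1:])) / (len(outputs) - 1)

coins, coin = [0, 0], 0
out_i, out_ii = [], []
for _ in range(100_000):
    coins, b = two_coin_step(coins)
    out_i.append(b)
    coin, b = one_coin_step(coin)
    out_ii.append(b)

print(f"flip rate, two coins:   {flip_rate(out_i):.3f}")   # ~0.200
print(f"flip rate, single coin: {flip_rate(out_ii):.3f}")  # ~0.200
```

Flipping one of the two coins always toggles whether they coincide, so both systems produce statistically identical observable histories.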
This natural sense of aesthetics was first formalized by William of Ockham, a fourteenth-century friar. Occam's Razor states that 'plurality is not to be posited without necessity.' When given two different ways of doing the same thing, there is no sense in choosing the more complex one without good reason. Since its inception, the principle has become an important heuristic guiding the development of theoretical models in quantitative science. In the words of Isaac Newton, "We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances."
When applied to the two potential simulators above, the natural appeal of the second solution can be given more rigorous footing. The first simulator keeps track of two coins to simulate our toy universe. Since all four configurations of the coins occur with equal probability, it has an entropy of two bits. In contrast, the second simulator keeps track of only a single coin, and thus requires only one bit, half this entropy. If tasked with simulating a plethora of such realities on a hard drive of limited space, we could manage twice as many realities with the second approach.
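As a quick sanity check on these numbers (a back-of-the-envelope sketch of my own, not a calculation from the paper), the Shannon entropy of each simulator's internal state can be computed directly:

```python
from math import log2

def shannon(dist):
    """Shannon entropy of a probability distribution, in bits."""
    return -sum(p * log2(p) for p in dist if p > 0)

p_two_coins = [1 / 4] * 4  # two coins: four equally likely configurations
p_one_coin = [1 / 2] * 2   # one coin: two equally likely states

print(shannon(p_two_coins))  # 2.0 bits
print(shannon(p_one_coin))   # 1.0 bit
```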
Should we assume that whoever designed the Matrix had similar aesthetic tastes, we may use similar reasoning to deduce the underlying code of our own reality--or, at the very least, deduce how our reality should be designed if our computer overlords cared about how much storage space they used. These considerations motivate the question:
If we are to construct a simulator of observed reality, would we need to store less data if we chose to exploit quantum dynamics?
Let us return to the simple universes that consist of a single bit evolving in discrete time steps. The behavior of such realities can be characterized as a stochastic process: a probability distribution over a sequence of bits. A simulator for a stochastic process can be thought of as a physical system that stores select information about past outputs and uses it to generate the required statistics for the future (see image, top right). Ideally, we want to construct a simulator that is as simple as possible, such that its information storage requirements are minimized.
A priori, it is not obvious that quantum dynamics would be of direct benefit; after all, the required behavior is merely a string of classical bits that obey a particular classical probability distribution. Quantum dynamics does not seem to be of immediate relevance.
Yet classical simulators have turned out to be less than ideal. Take, for example, our toy reality simulated by a single coin. The amount of information stored within the simulator is a single bit, namely the state of the coin. However, even if we could observe the entire future of this reality, we would still be unable to ascertain whether the simulator started with its coin in state 0 or state 1. Thus, not all of the information stored by the simulator is ever visible in its simulated reality, and some of it need never have been stored in the first place.
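We can quantify how much of that stored bit is ever visible (my own back-of-the-envelope calculation, assuming each step first flips the coin with probability 0.2 and then reveals its new state; since the process is Markovian, only the first future output carries information about the initial state):

```python
from math import log2

def h_bin(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

p_flip = 0.2
stored = 1.0                   # the simulator stores one full bit: the coin's state
visible = 1.0 - h_bin(p_flip)  # mutual information between that bit and the entire future
print(f"stored:  {stored:.3f} bits")
print(f"visible: {visible:.3f} bits")  # ~0.278 bits; the rest is never observable
```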
This is, in fact, a generic property of classical simulators. For most stochastic processes, even the provably optimal classical simulator stores more information than it needs. If a binary property 'X' (such as the state of a coin) has an effect on the future evolution of observed reality, then the value of X must be stored. This is unavoidable; yet even complete knowledge of all future observations made within the reality does not guarantee that one can deduce the value of X. Classical simulators therefore erase information; they contain a source of irreversibility that cannot be removed.
Quantum simulators, however, have greater potential freedom. Instead of allocating a full bit to store the value of X, we may store the conditions 'X = 0' and 'X = 1' in non-orthogonal quantum states. Consequently, the simulator saves memory, as it never commits to a definite value of X in the first place. Nevertheless, as we show in Nature Communications this week, it is often possible to engineer dynamics such that the simulator can still replicate the dynamics of our desired reality (full paper available at arXiv:1102.1994v4). The use of quantum processing has essentially sharpened Occam's razor, allowing us to shave off the parts of X that we never needed to remember.
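As an illustration of the idea (a sketch based on my reading of the construction for this example; the specific amplitudes below are my assumption for the 0.2-flip universe, not a quote from the paper), we can encode the two coin states into overlapping qubit states and compute how much memory the simulator actually consumes:

```python
import numpy as np

p = 0.2  # flip probability of the toy universe

# Non-orthogonal codewords for 'X = 0' and 'X = 1' (the coin's two states).
s0 = np.array([np.sqrt(1 - p), np.sqrt(p)])
s1 = np.array([np.sqrt(p), np.sqrt(1 - p)])

# Averaged over a long run, the memory register is an equal mixture of the two.
rho = 0.5 * (np.outer(s0, s0) + np.outer(s1, s1))

# Von Neumann entropy of the memory, in bits.
evals = np.linalg.eigvalsh(rho)
entropy = -sum(v * np.log2(v) for v in evals if v > 1e-12)
print(f"quantum memory: {entropy:.3f} bits")  # ~0.469, versus the classical 1 bit
```

The overlap between the two codewords is precisely what lets the simulator get away with less than a full bit of memory.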
The applications of this result go beyond programming the Matrix for memory-conscious computer overlords. The minimum amount of information required to simulate a given stochastic process is a significant topic of study in the field of complexity theory, where it is known in the scientific literature as statistical complexity. The rationale is that if we are supplied with any complex system, we can still make a meaningful statement about how complicated it must be by looking only at the statistical complexity of its output. If the system displays a statistical complexity of C, then whatever the underlying mechanics of the system, we need a memory of at least C to simulate its statistics.
The fact that this memory can be reduced quantum mechanically implies the counterintuitive conclusion that quantizing such simulators can reduce their complexity beyond this classical bound, even if the process they are simulating is purely classical. Many organisms and devices operate on the ability to predict, and thus react to, the environment around them; the fact that identical predictions can be made with less memory by exploiting quantum dynamics implies that such systems need not be as complex as we originally thought.
Nevertheless, a puzzle remains: quantum simulators are still not wholly reversible. For many stochastic processes, even the best quantum simulators we know still erase information--they still store unnecessary information. Could an even more general probability theory, with even more bizarre correlations, side-step this restriction? If our reality indeed lay within a grand Matrix run by some memory-conscious architect, he would certainly prefer a 'quantum Matrix' over a classical one; but could he have an even more exotic Matrix in mind?
Note (*): Many of you might wonder why, in this toy universe, I chose to have each bit flip with probability 0.2 rather than the more natural 0.5. The answer is actually rather illuminating. 0.5 is a special case. If you, as the young architect, were given the task of designing a universe where each bit flipped with probability 0.5, you should consider yourself very lucky. To anyone living inside it, the universe would look like a string of completely random bits. To simulate this, you would not need anything sophisticated: just design a system that blindly tosses a coin into the air! Such a system is remarkably simple; it needs to keep track of absolutely nothing!
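In code, such a simulator needs no internal state at all (a trivial sketch):

```python
import random

def fair_coin_universe(steps):
    """The p = 0.5 universe: no memory, just fresh randomness at every step."""
    return [random.randrange(2) for _ in range(steps)]

print(fair_coin_universe(20))
```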
--
Mile Gu is a research fellow at the Centre for Quantum Technologies in Singapore.