This post has been co-written by Mile Gu and Thomas Elliott.

Artist's impression of a quantum hourglass
In 2006, Oxford University Press set out to determine the most commonly used nouns in the English language. Topping the list was 'time', with 'year' and 'day' (both delineating periods of time) coming in 3rd and 5th respectively. This highlights how deeply embedded the concept of time is within the human psyche.
This prevalence of time is perhaps not too surprising--observing and tracking time are integral to our daily lives. Our work days are carried out according to schedules, and meetings are planned with set durations. We synchronise with others by fixing times to meet, and we plan each day knowing it contains a set number of hours. To prepare for what is going to happen in the future, time-keeping is essential. Indeed, one can argue that one of the foundations modern civilisation is built upon is our capacity to track time and anticipate future events.
In this context, the hourglass is distinguished for its role as one of the first accessible means of tracking time to the accuracy of seconds in medieval society. Indeed, its function is so familiar that 'the sands of time' has become a popular idiom, referring to the visual metaphor that the passage of time appears to flow like falling sand, steadily and irreversibly progressing from past to future.
Zoom in close enough on an hourglass, and one will see the individual grains of sand. At this level, the flow is not smooth, but inherently granular. At any moment, a finite number of grains of sand will have fallen, incrementing temporal progress in discrete packets. Time itself however appears, at any observable level, to be continuous. The hourglass analogy thus extends only so far.
This limit illustrates a pertinent observation--objects with a finite configuration space can only mimic the passage of time to finite precision. This isn't particularly surprising: a finite configuration space can support only a finite memory. Thus, irrespective of any other physical limitations, such a system can only store the current time to a finite precision.
Indeed, this limitation is all too familiar to those in scientific fields that involve digital modelling or simulation. Theoretical models almost always assume time operates on a continuum. Whether modelling neuronal spike trains or the dynamics of quantum systems, time is generally represented by some parameter t that takes on real values. Digital simulations of the resulting systems must, however, inevitably approximate such dynamics by discretising time.
As an illustration, consider the simulation of a particularly simple delayed Poisson process. This consists of a single system that emits only a single type of output at probabilistic points in time. The probability of an emission occurring during each infinitesimal time-interval δ is constant, with one catch: no output is ever emitted within a fixed relaxation period τ after an emission. For a device to replicate these statistics correctly, it would need to record whether it is in such a relaxation period, and if so, precisely how much time has elapsed since the last emission. Let's call this elapsed time t. Only by storing t can the simulator know precisely how much longer it must wait (i.e., the value of τ - t) before it is okay to emit the next output.
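Concretely, writing p for the constant emission rate, the waiting time between successive emissions is just the relaxation period τ followed by a memoryless exponential wait. A minimal Python sketch of these target statistics (the function name and the example parameter values are our own illustrative choices):

import random

def sample_waiting_time(p, tau):
    # Gap between successive emissions: the fixed relaxation period tau,
    # followed by a memoryless exponential wait with rate p (mean 1/p).
    return tau + random.expovariate(p)

# For example, five successive gaps for p = 2.0, tau = 0.5:
gaps = [sample_waiting_time(2.0, 0.5) for _ in range(5)]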
However, the variable t is real, and can take on a continuum of values. The more decimal places of t we wish to store, the more memory is required. In the limit where we want to faithfully predict the next emission up to an arbitrary level of precision, this memory becomes unbounded. In practice, when writing computer code to perform the simulation, we would approximate t by discretising time into granular packets. For example, we could take some sufficiently small timestep Δ and call it a day, provided we are happy to track time only to the nearest kΔ, for some integer k. A typical program for simulating such a process would have pseudocode along the lines of:
repeat:
    if t is less than τ:
        increment time by setting t = t + Δ
    otherwise:
        with probability pΔ, emit an output and reset t to 0
Each iteration of the code simulates one timestep Δ. As Δ goes to zero, the output of this simulator becomes statistically identical to the original delayed Poisson process. The cost, though, is that the number of values t can take scales as 1/Δ, growing to infinity as Δ goes to zero.
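For readers who prefer running code, here is a minimal Python rendering of the pseudocode above (the names and the choice of starting state are our own; a sketch, not the implementation from the paper):

import random

def simulate(p, tau, delta, n_steps):
    # t tracks the time elapsed since the last emission,
    # rounded to the nearest multiple of delta.
    t = tau                # assume we start outside a relaxation period
    emission_times = []
    for step in range(n_steps):
        if t < tau:
            t += delta                      # still relaxing: advance the clock
        elif random.random() < p * delta:   # emit with probability p*delta
            emission_times.append(step * delta)
            t = 0.0                         # reset the relaxation clock
    return emission_times

Note that t only ever needs to distinguish the values 0, Δ, 2Δ, ..., up to τ (roughly τ/Δ distinct states), and it is this count that blows up as Δ goes to zero.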
Thus, the more accurately that we wish to simulate the process, the more memory we need to invest. There is an ever-present trade-off between temporal precision and memory, and perfect statistical replication requires the allocation of unlimited resources. An hourglass, with a finite amount of sand, can thus never achieve exact replication.
While this trade-off is clearly of practical relevance, it is also fascinating from a more foundational perspective. Many scientists seriously consider the possibility that we live within a simulatable reality, wherein everything in nature can be thought of as information processing. If the memory capacity of this underlying computer is finite, how could it run a program like the pseudocode above to simulate processes that operate in continuous time? One may be tempted to conclude that either we live in a computer with unlimited resources, or that continuous time exists only as a theoretical idealisation. Perhaps our universe is itself like sand in an hourglass--zoom in closely enough, and everything appears granular.
While this could be a valid possibility, is there perhaps a way to avert this conundrum?
What we have not yet considered here is the quantum nature of information. The key point is that every bit of data is embodied in some physical system with two distinguishable configurations--one which we label |0>, the other |1>. Provided the system can be isolated sufficiently well from its environment, we can also steer it into quantum mechanical degrees of freedom that possess coherence: superposition states represented by α|0>+β|1>, simultaneously |0> and |1> with weights dictated by α and β. The difference now is that α and β are intrinsically continuous degrees of freedom. Thus this quantum bit--a finite physical system--contains within it a continuous parameter. Could this be leveraged to encode the continuity of time?
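As a toy illustration of this point (and only of this point; it is not the construction from our paper), one can write a continuous parameter directly into the weights of a single qubit:

import cmath, math

def qubit_amplitudes(theta, phi=0.0):
    # Amplitudes (alpha, beta) of cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>.
    # Both theta and phi vary continuously: a single two-level system
    # hosts continuous degrees of freedom.
    alpha = math.cos(theta / 2)
    beta = cmath.exp(1j * phi) * math.sin(theta / 2)
    return alpha, beta

# A continuous quantity s in [0, 1] (an elapsed time, say) can be
# encoded via theta = pi * s; the specific mapping here is illustrative.
alpha, beta = qubit_amplitudes(math.pi * 0.37)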
In our latest article, published in npj Quantum Information 4, 18 (2018), we show that this ingredient gives us exactly what we need. Instead of using a classical hourglass where each grain of sand has either fallen or is yet to fall, we employ a quantum time-keeper where the grains of sand are in a weighted superposition of both possibilities. By deforming the weights continuously with the passage of time, we are able to prove that the delayed Poisson process can be modelled with perfect precision using finite memory.
Pragmatically, this result could immediately lead to memory savings in continuous time simulations. Numerical evidence indicates that our results apply to much more general cases, where the waiting time distribution between successive emissions is arbitrary. Such general processes, known as renewal processes, are relevant in many diverse fields of study--from modelling the firing of neurons to arrival times of buses. Thus a means to simulate such systems with less memory could have direct practical use. Similar advantages can be found when considering other continuous variables, such as position, as was shown in a companion article recently published in New Journal of Physics (2017).
The foundational consequences, however, are perhaps more exciting. Let us again entertain the scenario where we live in a simulated reality. Would the architects of this reality prefer the use of classical or quantum information? Our work shows that if they are intent on constructing a universe where time flows smoothly, then quantum mechanics may be the only feasible method. Time, should it be continuous, could well necessitate an underlying quantum reality.
--
Mile Gu is a physicist at Nanyang Technological University and the Centre for Quantum Technologies, Singapore. Thomas J. Elliott is a physicist at Nanyang Technological University, Singapore. Their research was supported in part by FQXi.