Entanglement from a Local Deterministic Model?

September 10, 2009
by Vlatko Vedral


Entanglement is the quantum effect that marks the clearest departure of quantum theory from classical physics. This is most easily seen through the Bell inequalities, which are satisfied by all of classical physics but are (both theoretically and experimentally) found to be violated by quantum systems.

What's at stake here are the assumptions about the underlying reality of the world and/or its locality. The reality assumption stipulates that the values of experimental measurements are determined in advance of, and independently of, the measurements being made. This is true in the classical world: the police camera monitoring your driving speed does not change the speed by measuring it (otherwise you'd have a great defence in court against paying the speeding fine).

Locality, on the other hand, says that measurements in one part of the universe cannot instantaneously affect another part of the universe (any disturbance can travel at most at the speed of light). This is in perfect agreement with the theory of relativity.

Bell's inequalities only assume reality and locality (and, yes, you also have to believe in probabilities; if you don't, you can circumvent Bell, which is what some supporters of the Many Worlds Interpretation of quantum mechanics have been promoting. This point, however, is irrelevant for the present discussion). The fact that quantum systems can violate Bell's inequalities means that either the world is not real (in the sense of outcomes being predetermined, independently of our measurements) or the world is in some sense nonlocal - i.e. some signals can travel faster than light. The latter conclusion is precisely why Einstein, who was a staunch realist, chose to call entanglement "spooky action at a distance," for he thought that it implied a fundamental nonlocality in nature.
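For concreteness (the argument above does not depend on the details), the standard CHSH form of Bell's inequality goes like this. Alice chooses to measure one of two quantities $A$ or $A'$, Bob one of $B$ or $B'$, each with outcomes $\pm 1$:

\[
S = E(A,B) + E(A,B') + E(A',B) - E(A',B'), \qquad |S| \le 2 \ \text{(local realism)}, \qquad |S| \le 2\sqrt{2} \ \text{(quantum mechanics)}.
\]

Entangled states such as the singlet reach $|S| = 2\sqrt{2}$, violating the local realistic bound of 2.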

The bottom line is that up until recently there was no way out of Bell's logic, and the world we live in either had to be nonlocal or unreal (or both). Now, however, the Nobel Prize-winning Dutch physicist Gerard 't Hooft believes he has found a local realistic model that does exhibit features of quantum entanglement. (You can read his paper here and Technology Review's take on it here.) And this, of course, is very surprising!

't Hooft believes that although the microscopic world is local realistic, quantum mechanics emerges at larger scales. This is, in spirit, the same logic that permeates Einstein's philosophy. The basic quantities in 't Hooft's deterministic model are "beables" (they exist independently of measurements - hence the name). He's also got "changeables" (things that change beables into other beables) and "superimposables" (things that change beables into things that are not beables).
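As a toy illustration of this vocabulary in the simplest possible setting, a single two-state system (my own example, not taken from 't Hooft's paper): the two basis states play the role of beables, a bit flip is a changeable because it merely permutes beables, and a Hadamard-type operation is a superimposable because it turns a beable into a superposition, which is no longer a beable:

\[
\text{beables: } |0\rangle,\ |1\rangle; \qquad
\text{changeable: } X|0\rangle = |1\rangle,\ X|1\rangle = |0\rangle; \qquad
\text{superimposable: } H|0\rangle = \tfrac{1}{\sqrt{2}}\big(|0\rangle + |1\rangle\big).
\]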

Heisenberg, in contrast to Einstein (and now 't Hooft), gave up on reality when he invented quantum mechanics. He based his description of atomic phenomena on observable quantities only. Whatever the underlying entities in nature are (read: beables), Heisenberg recognised that we can only talk about them through what we observe and nothing else. The main quantities in quantum mechanics are observables and most observables are not beables, i.e. their outcomes depend on our measurements. How does 't Hooft bridge the gap between observables and beables?

First of all, 't Hooft's model is discrete in space (one feels this needs to be the case to comply with quantum discreteness). Likewise, the evolution of his model takes place in discrete time steps and is thus the same as that of any cellular automaton. But this is, of course, not sufficient to reproduce quantum physics (otherwise, classical statistical mechanics would do the job). In addition to beables and changeables, 't Hooft needs superimposables in order to comply with quantum mechanics. But how can the model still remain local realistic? I definitely need to examine his work much more closely before I can give a detailed answer to this conundrum.
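To fix ideas about what "discrete, deterministic, cellular-automaton-like evolution" means, here is a minimal toy model (entirely my own sketch; the rule and the names in it are invented for illustration and are not 't Hooft's construction): the configuration of a ring of bits at each tick is a definite beable, and one tick applies a fixed, invertible update - a changeable, since it only maps beables to beables.

```python
# Toy reversible cellular automaton on a ring of bits (illustration only).
# Each configuration is a definite "beable"; one tick is a fixed invertible
# (permutation-like) map on the finite set of states, i.e. a "changeable".

def step(prev, curr):
    """Second-order reversible rule: next cell = previous cell XOR (left XOR right neighbour)."""
    n = len(curr)
    nxt = tuple(prev[i] ^ curr[(i - 1) % n] ^ curr[(i + 1) % n] for i in range(n))
    return curr, nxt  # the new (previous, current) pair; the map can be run backwards

state = ((0, 0, 0, 0, 0), (0, 1, 0, 0, 0))  # definite values everywhere, no superpositions
for t in range(5):
    print(t, state[1])
    state = step(*state)
```

No superimposables appear anywhere in such a model, which is exactly why something more is needed to recover quantum mechanics. But my hunch about where that something could come from is as follows.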

One can maintain local realistic classicality if one has access to some underlying (possibly hidden) quantumness. It seems to me that 't Hooft needs this in his model of Bell inequality tests. Here Alice and Bob use their local but distant quasars to aid their measurements. This reminds me of a trick exploited in a recent paper of mine in Physical Review Letters. Together with my colleagues from Vienna and Belfast, I showed that classical states of light (i.e. those that admit local hidden variables), when manipulated by local operations only, can still violate Bell's inequalities (the point being that their local operations rely on prior shared entanglement). So, in this way, it appears that you can have your cake and eat it. But, of course, your world is still ultimately quantum (more precisely, nonlocal); the quantumness here sits in the background and, granted that, your theory then only talks about beables and changeables (the much-needed superimposables come from the background).

In addition to Bell's inequalities there are other issues that 't Hooft's model has to grapple with. However, no matter what the final verdict on this model turns out to be, one thing is for sure. We currently have two extraordinarily successful (and pretty) theories of Nature (quantum physics and general relativity) that do not sit happily alongside each other. Given that relativity is basically a local realistic theory, one way of combining the two (and by no means the most popular choice) is to discover a local realistic theory behind quantum mechanics. That was Einstein's dream to start with, and it is clearly kept alive by efforts such as 't Hooft's, in spite of all the obstacles along the way.