The Convexity Club

August 15, 2007
by Matthew Leifer

Plenty of people have been writing about the recent FQXi conference, which was excellent by the way, so I'll write instead about another FQXi-funded event that happened at the beginning of July at St. Catherine's College, Cambridge.

St. Catherine's College

The two-week workshop was entitled Operational Probabilistic Theories as Foils for Quantum Theory and was organized by Rob Spekkens, Jonathan Barrett and Tony Short. Its aim was to try to understand quantum theory by setting it within a wider context of probabilistic theories that follow from more directly intuitive axioms.

This is what the people at the workshop looked like.

Preparing to punt. L to R: Me, Mana, Appleby, d'Ariano, Wootters, Short

Hard at work. L to R: Barrett, Dahlsten, Mana, Toner, Spekkens, arm of Barnum, Short

Much of this work takes place in the "Convex Sets Framework" that has become something of an addiction for a small band of researchers in quantum information recently, including myself. The basic idea is that you assume that states, whatever else they are, should be objects that assign probabilities to all possible measurements that can be made on a system. Since preparation procedures can be mixed, you assume that these form a convex set. Classical and quantum theories provide examples of this. In classical probability theory the state space is a simplex and in quantum theory it is the set of density operators on a Hilbert space.
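
To make the convexity requirement concrete, here is a minimal numerical sketch (my own illustration, with made-up function names and example states, not anything presented at the workshop) of the two familiar cases: mixing points of the classical probability simplex and mixing quantum density operators, with a check that the mixtures remain valid states.

```python
import numpy as np

def is_classical_state(p, tol=1e-9):
    """A classical state is a point of the probability simplex:
    non-negative entries summing to one."""
    p = np.asarray(p, dtype=float)
    return bool(np.all(p >= -tol)) and abs(p.sum() - 1.0) < tol

def is_quantum_state(rho, tol=1e-9):
    """A quantum state is a density operator: Hermitian,
    positive semidefinite, unit trace."""
    rho = np.asarray(rho, dtype=complex)
    hermitian = np.allclose(rho, rho.conj().T, atol=tol)
    psd = bool(np.all(np.linalg.eigvalsh(rho) >= -tol))
    unit_trace = abs(np.trace(rho).real - 1.0) < tol
    return hermitian and psd and unit_trace

# Classical: a convex mixture of two simplex points stays in the simplex.
p0, p1 = np.array([1.0, 0.0, 0.0]), np.array([0.2, 0.5, 0.3])
mix_c = 0.3 * p0 + 0.7 * p1
assert is_classical_state(mix_c)

# Quantum: a convex mixture of two density operators is again a density operator.
ket0 = np.array([[1], [0]], dtype=complex)
ket_plus = np.array([[1], [1]], dtype=complex) / np.sqrt(2)
rho0 = ket0 @ ket0.conj().T
rho_plus = ket_plus @ ket_plus.conj().T
mix_q = 0.3 * rho0 + 0.7 * rho_plus
assert is_quantum_state(mix_q)

print("classical mixture:", mix_c)
print("quantum mixture:\n", np.round(mix_q, 3))
```

The geometry of the two sets is quite different: a point in the interior of a simplex has a unique decomposition into pure states (the vertices), whereas a mixed density operator generally has infinitely many, and it is precisely this kind of geometric variation that the convex sets framework lets one explore.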

However, the framework also contains much more general things, e.g. theories in which the Bell inequalities can be violated to a greater extent than in quantum theory whilst still not permitting signaling (see the sketch after the list below). This class of theories has attracted considerable attention recently because such theories can be vastly more powerful than quantum theory for certain information-processing tasks, e.g. they can trivialize communication complexity. Questions that were addressed at the workshop include:

- How can we understand why the world obeys quantum theory rather than any of the other theories?

This has led to a few new axiomatizations of quantum theory recently, e.g. the work of Hardy and d'Ariano.

- Which features of quantum theory are special to quantum theory and which of them hold generically in all convex theories?

Surprisingly, considering the paucity of assumptions that go into the framework, many of our most cherished ideas of what makes something "genuinely quantum" turn out to hold generically. Examples include no-cloning, no-broadcasting, the existence of indistinguishable pure states, trade-offs between measurement and disturbance, violation of Bell inequalities, and many more.
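
To put a concrete example behind both the "more Bell violation without signaling" remark above and the genericity of Bell violation just mentioned, here is a small sketch (again my own illustration, not material from the workshop) using the standard Popescu-Rohrlich box, the usual extremal example of such a theory: its correlations reach the algebraic maximum of 4 for the CHSH quantity, beyond the quantum bound of 2√2, while the marginal statistics seen by each party are independent of the other party's measurement choice.

```python
import itertools
import numpy as np

def pr_box(a, b, x, y):
    """Popescu-Rohrlich box: P(a, b | x, y) = 1/2 if a XOR b == x AND y, else 0."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

# No-signaling check: Alice's marginal P(a|x) must not depend on Bob's setting y
# (the box is symmetric, so the same holds with the roles swapped).
for a, x in itertools.product([0, 1], repeat=2):
    marginals = [sum(pr_box(a, b, x, y) for b in (0, 1)) for y in (0, 1)]
    assert np.isclose(marginals[0], marginals[1]), "signaling detected!"

# CHSH quantity: S = sum_{x,y} (-1)^{x*y} E(x, y), where
# E(x, y) = sum_{a,b} (-1)^{a+b} P(a, b | x, y).
def correlator(x, y):
    return sum((-1) ** (a + b) * pr_box(a, b, x, y)
               for a, b in itertools.product([0, 1], repeat=2))

S = sum((-1) ** (x * y) * correlator(x, y)
        for x, y in itertools.product([0, 1], repeat=2))
print("CHSH value:", S)  # 4.0: above the quantum bound 2*sqrt(2) ~ 2.83
                         # and the classical bound 2, yet non-signaling.
```

A supply of boxes like this is exactly what makes communication complexity trivial, which is one reason the question of why nature stops at 2√2 has become so prominent.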

I should also mention that a variety of other approaches and topics were discussed at the workshop, such as toy theories that are not convex (Spekkens), axiomatizations of quantum theory not in the convex sets framework (Goyal), theories in the decoherent histories framework (Dowker), operational approaches to quantum gravity (Hardy), generalizations of the quantum de Finetti theorem (Toner, Renner) and the concept of negative information (Oppenheim).

There was also plenty of free time for discussion at the workshop, and several papers based on those discussions should appear in the coming months.