More videos from the Setting Time Aright meeting, this time from the session on Truth. The talks covered both the philosophical question of what scientific "truth" is -- is science as objective as we might hope? -- and the practical question of how we establish scientific truth in a changing landscape in which computers play as large a role as -- if not a larger one than -- test tubes, particle accelerators and other lab equipment in testing hypotheses.
First up was FQXi member and investigative journalist Peter Byrne (who wrote a book about Hugh Everett III), talking about dogma and subjectivity in science, using the example of the resistance to Everett's Many Worlds interpretation of quantum mechanics, which was (is?) largely due to sociological and ideological reasons:
[youtube:X4XLAJSASy0, 560, 340]
The video doesn't show the discussion after Byrne's talk, but it provoked an indignant reaction from one physicist, who asked if Byrne was seriously suggesting that (for instance), despite repeated confirmation of its predictions, the Standard Model is only subjectively true. Michael Reisenberger responded that subjectivity appears in the stories that we use to explain and understand the Standard Model -- and I agree with him. Those of us who spend much of our time writing popular science may be more comfortable with the idea that science involves storytelling, but that storytelling isn't confined to the popular level. The discussion brings to mind an article that I wrote for Nature last year ("The Large Human Collider") about social scientists, philosophers and anthropologists who were using CERN as a laboratory for studying the behavior of scientists and the construction of knowledge in large-scale collaborations. In particular, I remember Holger Lyre asking, "Does the Higgs Mechanism Exist?" -- not in the sense of whether or not the LHC will find the Higgs, but in the sense of what actually exists "out there" if it does find evidence for the Higgs. Quoting from his paper (arXiv:0806.1359v1):
"To be sure, there is nothing wrong with the mathematics of it, but on closer inspection of the "mechanism" it will become clear that a deeper conceptual understanding of the formalism is not at all as obvious and as straightforward as most presentations, notably textbook presentations, of the Higgs mechanism usually pretend. For instance--and as the alert philosophy of physics reader will certainly have noticed already--the status of the symmetries in question, gauge symmetries, is in fact a non-empirical or merely conventional one precisely in the sense that neither global nor local gauge transformations possess any real instantiations (i.e. realizations in the world). Rather their status is comparable to the status of coordinate transformations... How is it then possible to instantiate a mechanism, let alone a dynamics of mass generation, in the breaking of such a kind of symmetry? Suspicions like this should raise philosophical worries about the true ontological and explanatory story behind the Higgs mechanism."
I usually try to avoid plugging articles I have written on this blog, but since I have already mentioned my LHC article, I'll bring up another that is relevant to the second talk in the Truth session, also from Nature last year: "Error: Why Scientific Programming Does Not Compute." That article looked at concerns from computer scientists that scientific code is not as accurate as scientists perhaps believe, and that published results based on computational analyses are often not reproducible. In the article, I mention recommendations that came out of the November 2009 Yale Law School Data and Code Sharing Roundtable in New Haven, Connecticut, organized by Victoria Stodden, urging scientists to provide links to the source code and data used to generate their results when publishing. Stodden was at the meeting and spoke at length about these issues in the Truth session, placing them within historical context by looking at how our views of what constitutes scientific truth have changed over the centuries:
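To make the accuracy worry concrete, here is a toy example (my own, not drawn from the article or the roundtable) showing that floating-point addition is not associative, so simply reordering a sum -- as a compiler, a parallelization or a refactoring might -- can change a published number:

# Toy illustration: floating-point addition is not associative,
# so the order of summation changes the result at the level of rounding error.
import math
import random

random.seed(0)
values = [random.uniform(-1e10, 1e10) for _ in range(100000)] + [1e-3] * 100000

forward = sum(values)               # one summation order
backward = sum(reversed(values))    # same numbers, reverse order
reference = math.fsum(values)       # correctly rounded sum, for comparison

print(forward == backward)          # typically False
print(forward - reference, backward - reference)

Neither answer is "wrong" in isolation, which is exactly why such discrepancies slip through: without the original code and data, a reader has no way to tell whether a mismatch with a published figure is a bug, a platform difference or something more serious -- hence the roundtable's call to share both.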
[youtube:dF1-nkqwmjI, 560, 340]
Having been vulgar enough to mention two articles that I have written, I may as well throw in a third: "String Theory Finds a Bench Mate"! (Thank you to John Merryman for noticing it, and commenting on it elsewhere in this forum.) It's about the mathematical connections between string theory and experimental condensed matter physics -- the AdS/CFT conjecture helps predict new states of matter in the lab. I'm bringing it up here because it also touches on the issue of establishing scientific truth. String theory has taken a battering for being divorced from experiment and thus failing to establish its scientific credentials. Although theoretical physics -- more so than other sciences -- sets great store by mathematical elegance, the string theorists I spoke to were very keen to be able to connect with experiment through this condensed matter link.
But just what do they achieve by doing this? No one on either side claims that the experiments have any bearing on the question of whether string theory provides the correct description of reality at a fundamental level, or whether strings really "exist." But, as John McGreevy told me, if future experiments confirm these string-theory-derived condensed-matter predictions, it will help to establish that "strings exist in the Platonic sense." In the article, I quote Andrew Green saying that string theory may have been misunderstood: "Maybe string theory is not a unique theory of reality, but something deeper -- a set of mathematical principles that can be used to relate all physical theories," he says. "Maybe string theory is the new calculus." Any thoughts on this?