This is a list of this week’s papers on quantum foundations published in various journals or uploaded to preprint servers such as arxiv.org and PhilSci Archive.
Interpretation neutrality in the classical domain of quantum theory
on 2016-01-01 8:21pm GMT
Publication date: Available online 30 December 2015
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Joshua Rosaler
I show explicitly how concerns about wave function collapse and ontology can be decoupled from the bulk of technical analysis necessary to recover localized, approximately Newtonian trajectories from quantum theory. In doing so, I demonstrate that the account of classical behavior provided by decoherence theory can be straightforwardly tailored to give accounts of classical behavior on multiple interpretations of quantum theory, including the Everett, de Broglie–Bohm and GRW interpretations. I further show that this interpretation-neutral, decoherence-based account conforms to a general view of inter-theoretic reduction in physics that I have elaborated elsewhere, which differs from the oversimplified picture that treats reduction as a matter of simply taking limits. This interpretation-neutral account rests on a general three-pronged strategy for reduction between quantum and classical theories that combines decoherence, an appropriate form of Ehrenfest's Theorem, and a decoherence-compatible mechanism for collapse. It also incorporates a novel argument as to why branch-relative trajectories should be approximately Newtonian, which is based on a little-discussed extension of Ehrenfest's Theorem to open systems, rather than on the more commonly cited but less germane closed-systems version. In the Conclusion, I briefly suggest how the strategy for quantum-classical reduction described here might be extended to reduction between other classical and quantum theories, including classical and quantum field theory and classical and quantum gravity.
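For context, the closed-system version of Ehrenfest's Theorem mentioned in the abstract is the familiar pair of relations below; the open-systems extension that the paper actually relies on is developed there. For a Hamiltonian $\hat{H} = \hat{p}^2/2m + V(\hat{x})$,

$$\frac{d}{dt}\langle \hat{x} \rangle = \frac{\langle \hat{p} \rangle}{m}, \qquad \frac{d}{dt}\langle \hat{p} \rangle = -\,\langle V'(\hat{x}) \rangle.$$

These yield approximately Newtonian trajectories for the expectation values only when $\langle V'(\hat{x}) \rangle \approx V'(\langle \hat{x} \rangle)$, i.e., for sufficiently narrow wave packets, which is why decoherence, which keeps branch-relative states well localized, enters the strategy alongside the theorem.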
Niels Bohr on the wave function and the classical/quantum divide
on 2016-01-01 8:21pm GMT
Publication date: February 2016
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 53
Author(s): Henrik Zinkernagel
It is well known that Niels Bohr insisted on the necessity of classical concepts in the account of quantum phenomena. But there is little consensus concerning his reasons, and what exactly he meant by this. In this paper, I re-examine Bohr's interpretation of quantum mechanics, and argue that the necessity of the classical can be seen as part of his response to the measurement problem. More generally, I attempt to clarify Bohr's view on the classical/quantum divide, arguing that the relation between the two theories is one of mutual dependence. An important element in this clarification consists in distinguishing Bohr's idea of the wave function as symbolic from both a purely epistemic and an ontological interpretation. Together with new evidence concerning Bohr's conception of wave function collapse, this sets his interpretation apart from both standard versions of the Copenhagen interpretation and from some of the reconstructions of his view found in the literature. I conclude with a few remarks on how Bohr's ideas also make good sense when modern developments in quantum gravity and early universe cosmology are taken into account.
Discovering Quantum Causal Models
PhilSci-Archive
on 2015-12-28 7:22pm GMT
Shrapnel, Sally (2015) Discovering Quantum Causal Models. [Preprint]
Quantum Delayed-Choice Experiment with a Beam Splitter in a Quantum Superposition
on 2015-12-28 3:00pm GMT
Author(s): Shi-Biao Zheng, You-Peng Zhong, Kai Xu, Qi-Jue Wang, H. Wang, Li-Tuo Shen, Chui-Ping Yang, John M. Martinis, A. N. Cleland, and Si-Yuan Han
A beam splitter is placed in a quantum superposition state of being both active and inactive allowing the wave and particle aspects of the system to be observed in a single setup.
[Phys. Rev. Lett. 115, 260403] Published Mon Dec 28, 2015
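For orientation, a minimal sketch of the state behind this synopsis, assuming the standard quantum delayed-choice scheme of Ionicioiu and Terno that such experiments implement (the circuit-level realization is in the paper itself): the classical choice of whether to insert the second beam splitter is replaced by a control ancilla, so the joint state takes the form

$$|\Psi\rangle = \cos\alpha\,|\text{particle}\rangle\,|0\rangle_{\text{anc}} + \sin\alpha\,|\text{wave}\rangle\,|1\rangle_{\text{anc}},$$

where $|\text{particle}\rangle$ yields phase-independent detection statistics, $|\text{wave}\rangle$ yields interference fringes, and measuring the ancilla selects between the two behaviors within a single run.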
Concerning Quadratic Interaction in the Quantum Cheshire Cat Experiment
International Journal of Quantum Foundations
on 2015-12-28 11:55am GMT
Volume 2, Issue 1, pages 17-31
W. M. Stuckey, Michael Silberstein and Timothy McDevitt
Mark Stuckey is a Professor of Physics at Elizabethtown College, Pennsylvania, USA. He earned his PhD in relativistic cosmology from the University of Cincinnati in 1987, working under Louis Witten. His work in relativistic cosmology contributed to a movement to correct misconceptions about Big Bang cosmology in the mass media and introductory astronomy textbooks. In 1994 he began studying the foundations of physics with the goal of interpreting quantum mechanics in order to develop a new approach to fundamental physics. In 2005, he and a colleague in philosophy of science (Prof. Silberstein) achieved the first part of that goal by creating the Relational Blockworld (RBW) interpretation of quantum mechanics. In 2009, a colleague in mathematics (Prof. McDevitt) joined the collaboration and helped bring the goal to fruition with the development of an RBW approach to quantum gravity and the unification of physics based on modified lattice gauge theory. In 2012, the corresponding modification to Regge calculus in Einstein-deSitter cosmology was used to fit the Union2 Compilation supernova data as well as LambdaCDM does, without accelerating expansion, dark energy, or a cosmological constant. As of 2015, RBW and its associated new approach to fundamental physics are well developed and being brought to bear on the dark matter problem.
Michael David Silberstein is a Full Professor of Philosophy at Elizabethtown College, a founding member of its Cognitive Science program, and a permanent Adjunct in the Philosophy Department at the University of Maryland, College Park, where he is also a faculty member in the Foundations of Physics Program and a Fellow on the Committee for Philosophy and the Sciences. His primary research interests are the foundations of physics, of cognitive science, and of complexity theory. He is especially interested in how these branches of philosophy and science bear on more general questions of reduction, emergence and explanation. In 2005, he and a colleague in physics (Prof. Stuckey) created the Relational Blockworld (RBW) interpretation of quantum mechanics. In 2009, a colleague in mathematics (Prof. McDevitt) joined the collaboration and helped bring to fruition the development of an RBW approach to quantum gravity and the unification of physics based on modified lattice gauge theory. In 2012, the corresponding modification to Regge calculus in Einstein-deSitter cosmology was used to fit the Union2 Compilation supernova data as well as LambdaCDM does, without accelerating expansion, dark energy, or a cosmological constant. As of 2015, RBW and its associated new approach to fundamental physics are being brought to bear on the dark matter problem.
Tim McDevitt is Professor of Mathematics and Chair of the Department of Mathematical and Computer Sciences at Elizabethtown College. He earned his Ph.D. in Applied Mathematics in 1996 at the University of Virginia and has spent significant time working both in and outside of academia. He has been at Elizabethtown College since 2005, and he enjoys engaging in interdisciplinary research with colleagues in other disciplines.
In a July 2014 Nature Communications paper, Denkmayr et al. claim to have instantiated the so-called quantum Cheshire Cat experiment using neutron interferometry. Crucial to this claim are the weak values which must imply the quantum Cheshire Cat interpretation, i.e., “the neutron and its spin are spatially separated” in their experiment. While they measured the correct weak values for the quantum Cheshire Cat interpretation, the corresponding implications do not obtain because, as we show, those weak values were measured with both a quadratic and a linear magnetic field Bz interaction. We show explicitly how those weak values imply quantum Cheshire Cat if the Bz interaction is linear and then we show how the quadratic Bz interaction destroys the quantum Cheshire Cat implications of those weak values. Since both linear and quadratic Bz interactions contribute equally to the neutron intensity in this experiment, the deviant weak value implications are unavoidable. Because weak values were used successfully to compute neutron intensities for weak Bz in this experiment, it is clearly the case that one cannot make ontological inferences from weak values without taking into account the corresponding interaction strength.
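For context, the weak value of an observable $\hat{A}$ for a system pre-selected in $|\psi\rangle$ and post-selected in $|\phi\rangle$ is the standard expression

$$A_w = \frac{\langle \phi|\hat{A}|\psi\rangle}{\langle \phi|\psi\rangle}.$$

In the original quantum Cheshire Cat proposal of Aharonov and collaborators, the pre- and post-selections make the weak value of the path projector unity on one arm while the weak value of the spin-path product is unity only on the other arm, which motivates the "spatially separated" reading; the abstract's point is that this reading presupposes that the weak values are measured through a linear coupling.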
Are Retrocausal Accounts of Entanglement Unnaturally Fine-Tuned?
International Journal of Quantum Foundations
on 2015-12-27 10:25am GMT
Volume 2, Issue 1, pages 1-16
D. Almada, K. Ch’ng, S. Kintner, B. Morrison and K. B. Wharton
Ken Wharton received his physics degrees from Stanford and UCLA, and is now a full professor at San Jose State University; his co-authors are current or former students at SJSU. Wharton’s early research concerned experimental laser-plasma interactions, but for the past decade he has specialized in quantum foundations. A member of the Foundational Questions Institute, Wharton spends some of his efforts on popular outreach (general-level essays, a piece in New Scientist, an appearance on Through The Wormhole, etc.) His current research is focused on developing explanations for quantum phenomena using only continuous structures in ordinary spacetime.
An explicit retrocausal model is used to analyze the general Wood-Spekkens argument that any causal explanation of Bell-inequality violations must be unnaturally fine-tuned to avoid signaling. The no-signaling aspects of the model turn out to be robust under variation of the only free parameter, even as the probabilities deviate from standard quantum theory. The ultimate reason for this robustness is then traced to a symmetry assumed by the original model. A broader conclusion is that symmetry-based restrictions seem a natural and acceptable form of fine-tuning, not an unnatural model-rigging. And if the Wood-Spekkens argument is indicating the presence of hidden symmetries, this might even be interpreted as supporting time-symmetric retrocausal models.
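To make the no-signaling condition at stake concrete, here is a minimal numeric sketch in Python. It checks the condition for the textbook singlet-state correlations rather than for the paper's specific retrocausal model (whose probability rule and free parameter are given in the paper): the marginal statistics on one wing must be independent of the distant setting.

import numpy as np

# Joint outcome probability for a spin-singlet pair: outcomes a, b in {+1, -1},
# analyzer angles x, y in radians; p(a, b | x, y) = (1 - a*b*cos(x - y)) / 4.
def p(a, b, x, y):
    return (1 - a * b * np.cos(x - y)) / 4

# No-signaling: Alice's marginal sum_b p(a, b | x, y) must not depend on Bob's y.
angles = np.linspace(0, np.pi, 7)
for x in angles:
    for a in (+1, -1):
        marginals = [sum(p(a, b, x, y) for b in (+1, -1)) for y in angles]
        assert np.allclose(marginals, marginals[0]), "signaling detected"

print("Marginals are independent of the distant setting: no signaling.")

In a Wood-Spekkens-style fine-tuning analysis, the question is whether this marginal independence survives when a model's parameters are varied; the abstract reports that for the retrocausal model studied it does, for symmetry reasons rather than by parameter-rigging.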