# Weekly Papers on Quantum Foundations (39)

This is a list of this week’s papers on quantum foundations published in various journals or uploaded to preprint servers such as arxiv.org and PhilSci Archive.

Effects of the Generalized Uncertainty Principle on Quantum Tunneling. (arXiv:1509.07359v1 [quant-ph])

on 2015-9-25 9:55am GMT

In a previous paper [Blado G, Owens C, and Meyers V 2014 Quantum Wells and the Generalized Uncertainty Principle Eur. J. Phys. 35 065011], we showed that quantum gravity effects can be discussed with only a background in non-relativistic quantum mechanics at the undergraduate level by looking at the effect of the generalized uncertainty principle (GUP) on the finite and infinite square wells. In this paper, we derive the GUP corrections to the tunneling probability of simple quantum mechanical systems which are accessible to undergraduates (alpha decay, simple models of quantum cosmogenesis and gravitational tunneling radiation) and which employ the WKB approximation, a topic discussed in undergraduate quantum mechanics classes. It is shown that the GUP correction increases the tunneling probability in each of the examples discussed.

Interpretation of the Klein-Gordon Probability Density. (arXiv:1509.07380v1 [quant-ph])

on 2015-9-25 9:53am GMT

Authors: Roderick I. Sutherland

An explanation is presented for how the expression for probability density provided by the Klein-Gordon equation can be understood within a particle interpretation of quantum mechanics. The fact that this expression is not positive definite is seen to be no impediment once a careful distinction is drawn between the outcomes of measurements and the positions of particles between measurements. The analysis indicates, however, that retrocausal influences must be involved.
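For context, the indefiniteness the abstract refers to is a textbook feature of the Klein-Gordon density (standard computation, not taken from the paper):

```latex
\rho \;=\; \frac{i\hbar}{2mc^{2}}\left(\psi^{*}\,\partial_t\psi - \psi\,\partial_t\psi^{*}\right),
\qquad
\psi_{\pm}(x,t) \propto e^{\mp iEt/\hbar}
\;\Longrightarrow\;
\rho_{\pm} \;=\; \pm\,\frac{E}{mc^{2}}\,\lvert\psi_{\pm}\rvert^{2},
```

so negative-frequency solutions carry negative $\rho$, which is why $\rho$ cannot naively be read as a probability density of measurement outcomes.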

Quantum nonlocality with arbitrary limited detection efficiency. (arXiv:1509.07139v1 [quant-ph])

on 2015-9-25 9:53am GMT

The demonstration and use of nonlocality, as defined by Bell’s theorem, rely strongly on dealing with non-detection events due to losses and detectors’ inefficiencies. Otherwise, the so-called detection loophole could be exploited. The only way to avoid this is to have detection efficiencies that are above a certain threshold. We introduce the intermediate assumption of limited detection efficiency, that is, in each run of the experiment, the overall detection efficiency is lower bounded by $\eta_{min} > 0$. Hence, in an adversarial scenario, the adversaries have arbitrarily large but not full control over the inefficiencies. We analyse the set of possible correlations that fulfill Limited Detection Locality (LDL) and show that they necessarily satisfy some linear Bell-like inequalities. We prove that quantum theory predicts the violation of one of these inequalities for all $\eta_{min} > 0$. Hence, nonlocality can be demonstrated with arbitrarily small limited detection efficiencies. We validate this assumption experimentally via a twin-photon implementation in which two users are provided with one photon each out of a partially entangled pair. We exploit on each side a passive switch followed by two measurement devices with fixed settings. Assuming the switches are not fully controlled by an adversary, nor by hypothetical local variables, we reveal the nonlocality of the established correlations despite a low overall detection efficiency.
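The efficiency threshold the abstract contrasts itself with can be sketched with the standard CHSH argument. This is textbook detection-loophole reasoning (no-clicks locally assigned the outcome +1, maximally entangled state, vanishing marginals), not the paper's LDL model:

```python
import numpy as np

# With per-side detection efficiency eta and each no-click assigned +1,
# the effective CHSH value for a maximally entangled state is
#   S_eff(eta) = 2*sqrt(2)*eta^2 + 2*(1-eta)^2.
# Violation of the local bound S <= 2 then requires eta > 2/(1+sqrt(2)).
def chsh_eff(eta):
    return 2.0 * np.sqrt(2.0) * eta**2 + 2.0 * (1.0 - eta)**2

etas = np.linspace(0.0, 1.0, 100001)
threshold = etas[np.argmax(chsh_eff(etas) > 2.0)]
print(f"violation requires eta > {threshold:.4f}")  # ≈ 2/(1+√2) ≈ 0.8284
```

The paper's point is precisely that, under the weaker limited-detection assumption, nonlocality survives for any $\eta_{min} > 0$, escaping this kind of threshold.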

Ten reasons why a thermalized system cannot be described by a many-particle wave function. (arXiv:1509.07275v1 [cond-mat.stat-mech])

on 2015-9-25 9:53am GMT

Authors: Barbara Drossel

It is widely believed that the underlying reality behind statistical mechanics is a deterministic and unitary time evolution of a many-particle wave function, even though this is in conflict with the irreversible, stochastic nature of statistical mechanics. The usual attempts to resolve this conflict, for instance by appealing to decoherence or eigenstate thermalization, are not fully convincing. This paper presents ten arguments why the irreversibility and stochasticity of statistical mechanics should be taken as a true property of nature, and why we should assume that there are limits of validity to a unitary time evolution based on wave functions. The arguments are made for macroscopic systems at finite temperature and are based, among other things, on the classical limit, on extensivity, on the concepts of entropy and equilibrium, and on symmetry breaking in phase transitions and quantum measurement. It follows that a gas of a macroscopic number N of atoms in thermal equilibrium is best represented by a collection of N wave packets of a size of the order of the thermal de Broglie wavelength, which behave quantum mechanically below this scale but classically sufficiently far beyond it. In particular, these wave packets must localize again after scattering events, which requires stochasticity and indicates a connection to the measurement process.
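The thermal de Broglie wavelength that sets the quantum/classical scale in the abstract is easy to evaluate. A minimal sketch with CODATA constant values; the choice of helium at room temperature is illustrative, not from the paper:

```python
import math

# Thermal de Broglie wavelength: lambda_th = h / sqrt(2*pi*m*k_B*T)
h  = 6.62607015e-34    # Planck constant, J s
kB = 1.380649e-23      # Boltzmann constant, J/K
u  = 1.66053907e-27    # atomic mass unit, kg

def lambda_th(mass_kg, T):
    return h / math.sqrt(2.0 * math.pi * mass_kg * kB * T)

# helium-4 at 300 K: about 0.05 nm, far below typical interatomic
# distances in a dilute gas, hence classical behaviour at larger scales
lam = lambda_th(4.0 * u, 300.0)
print(f"lambda_th = {lam * 1e9:.4f} nm")
```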

Bayesian Interpretation of Weak Values. (arXiv:1509.07198v1 [quant-ph])

on 2015-9-25 9:53am GMT

Authors: Akio Hosoya

The real part of the weak value is identified as the conditional Bayes probability through the quantum analog of the Bayes relation. We present an explicit protocol to obtain the weak values in a simple Mach-Zehnder interferometer model and, on the basis of the quantum Bayes relation, derive formulae for the weak values in terms of the experimental data, namely the positions and momenta of the detected photons. The formula provides a way of performing tomography of the initial state almost without disturbing it in the weak-coupling limit.
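The weak value itself is a one-line computation, $A_w = \langle\phi|A|\psi\rangle / \langle\phi|\psi\rangle$. A minimal qubit sketch with illustrative pre- and post-selected states (not the paper's Mach-Zehnder protocol), chosen so the real part falls outside the eigenvalue range:

```python
import numpy as np

# Weak value A_w = <phi|A|psi> / <phi|psi> for the observable sigma_z
sz = np.array([[1, 0], [0, -1]], dtype=complex)

psi = np.array([1, 1], dtype=complex) / np.sqrt(2)     # preselected |+x>
theta = 0.6 * np.pi                                    # nearly orthogonal postselection
phi = np.array([np.cos(theta), np.sin(theta)], dtype=complex)

Aw = (phi.conj() @ sz @ psi) / (phi.conj() @ psi)
print(f"Re(A_w) = {Aw.real:.3f}")   # ≈ -1.963, outside the eigenvalue range [-1, 1]
```

The nearly orthogonal postselection makes the denominator small, which is what produces the anomalous (out-of-range) real part.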

Antimatter in the Direct-Action Theory of Fields. (arXiv:1509.06040v1 [quant-ph])

on 2015-9-22 1:16am GMT

Authors: R. E. Kastner

One of Feynman’s greatest contributions to physics was the interpretation of negative energies as antimatter in quantum field theory. A key component of this interpretation is the Feynman propagator, which seeks to describe the behavior of antimatter at the virtual particle level. Ironically, it turns out that one can dispense with the Feynman propagator in a direct-action theory of fields, while still retaining the interpretation of negative energy solutions as antiparticles.

Provable Quantum Advantage in Randomness Processing. (arXiv:1509.06183v1 [quant-ph])

on 2015-9-22 1:16am GMT

Authors: Howard Dale, David Jennings, Terry Rudolph

Quantum advantage is notoriously hard to find and even harder to prove. For example the class of functions computable with classical physics actually exactly coincides with the class computable quantum-mechanically. It is strongly believed, but not proven, that quantum computing provides exponential speed-up for a range of problems, such as factoring. Here we address a computational scenario of “randomness processing” in which quantum theory provably yields, not only resource reduction over classical stochastic physics, but a strictly larger class of problems which can be solved. Beyond new foundational insights into the nature and malleability of randomness, and the distinction between quantum and classical information, these results also offer the potential of developing classically intractable simulations with currently accessible quantum technologies.
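As illustrative background for the "randomness processing" scenario (this classic primitive is my example, not the paper's quantum protocol): the von Neumann trick processes flips of a coin with unknown bias p into exactly fair bits, i.e. it constructs f(p) = 1/2 classically. The paper's question is which functions f are constructible at all, and there quantum theory provably does strictly better:

```python
import random

def biased_flip(p, rng):
    return rng.random() < p

def fair_bit(p, rng):
    # flip pairs until the two flips differ; HT and TH each occur with
    # probability p(1-p), so mapping them to 0 and 1 yields a fair bit
    while True:
        a, b = biased_flip(p, rng), biased_flip(p, rng)
        if a != b:
            return int(b)

rng = random.Random(0)
bits = [fair_bit(0.8, rng) for _ in range(20000)]
mean = sum(bits) / len(bits)
print(f"empirical mean: {mean:.3f}")   # close to 0.5 despite p = 0.8
```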

From Noncontextuality Inequalities to Noncontextuality Inequalities without Assuming Determinism. (arXiv:1509.06027v1 [quant-ph])

on 2015-9-22 1:16am GMT

Authors: Zhen-Peng Xu, Jing-Ling Chen, Hong-Yi Su

Recently, Kunjwal and Spekkens proposed a new method [1] to demonstrate quantum contextuality, and an experiment [2] has been performed according to this method. The original paper showed only that every KS theorem corresponds to a noncontextuality inequality; let us call inequalities of this type Kunjwal-Spekkens-type noncontextuality inequalities. However, we can show that the same holds for every original-type noncontextuality inequality, especially the SIC ones. Here we take the YO-13-ray [3] and the KCBS inequality [4] as examples.

Investigating the emergence of time in stationary states with trapped ions

PRA: Fundamental concepts

on 2015-9-21 2:00pm GMT

Author(s): Serge Massar, Philippe Spindel, Andrés F. Varón, and Christof Wunderlich

Even though quantum systems in energy eigenstates do not evolve in time, they can exhibit correlations between internal degrees of freedom in such a way that one of the internal degrees of freedom behaves like a clock variable, and thereby defines an internal time, that parametrizes the evolution of…

[Phys. Rev. A 92, 030102(R)] Published Mon Sep 21, 2015

Contexts, Systems and Modalities: A New Ontology for Quantum Mechanics

Latest Results for Foundations of Physics

on 2015-9-21 12:00am GMT

Abstract

In this article we present a possible way to make usual quantum mechanics fully compatible with physical realism, defined as the statement that the goal of physics is to study entities of the natural world, existing independently from any particular observer’s perception, and obeying universal and intelligible rules. Rather than elaborating on the quantum formalism itself, we propose a new quantum ontology, where physical properties are attributed jointly to the system, and to the context in which it is embedded. In combination with a quantization principle, this non-classical definition of physical reality sheds new light on counter-intuitive features of quantum mechanics such as the origin of probabilities, non-locality, and the quantum-classical boundary.