Weekly Papers on Quantum Foundations (22)

Authors: J. Sperling

It is demonstrated that the collapse of the wave function is equivalent to the continuity of measurement outcomes. The latter states that a second measurement has to result in the same outcome as the first measurement of the same observable for a vanishing time between both observations. In contrast to the exclusively quantum-physical collapse description, the equivalent continuity requirement also applies in classical physics, allowing for a comparison of the two domains. In particular, it is found that quantum coherences are the sole cause of measurable deviations in statistical properties due to the collapse. Therefore, the introduced approach makes it possible to characterize and quantify the unique features of the quantum-physical measurement problem within the framework of modern quantum resource theories and compare them to classical physics.
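The continuity requirement — an immediate second measurement of the same observable reproduces the first outcome — can be illustrated with a minimal simulation of projective qubit measurement. This is a sketch of the textbook collapse rule only, not the paper's formalism:

```python
import math
import random

def measure(state):
    """Projective measurement of a qubit (a, b) in the computational basis.
    Returns the outcome and the collapsed post-measurement state."""
    a, b = state
    p0 = abs(a) ** 2
    if random.random() < p0:
        return 0, (1.0, 0.0)   # collapse onto |0>
    return 1, (0.0, 1.0)       # collapse onto |1>

random.seed(0)
state = (1 / math.sqrt(2), 1 / math.sqrt(2))  # equal superposition
first, state = measure(state)
second, state = measure(state)  # immediate repetition of the same observable
assert first == second  # continuity: the second outcome repeats the first
```

The first measurement is genuinely random; the second, acting on the collapsed state, is deterministic — which is exactly the repeatability the abstract takes as its starting point.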

Authors: Uri Peleg, Lev Vaidman

The recent criticism of Vaidman’s proposal for the analysis of the past of a particle in the nested interferometer is refuted. It is shown that the definition of the past of the particle adopted by Englert et al. [Phys. Rev. A 96, 022126 (2017)] is applicable only to a tiny fraction of photons in the interferometer, which indeed exhibit different behaviour. Their proof that all pre- and postselected particles behave this way, i.e. follow a continuous trajectory, does not hold, because it relies on the very assumption it is intended to prove.

Authors: Carlo Maria Scandolo, Roberto Salazar, Jarosław K. Korbicz, Paweł Horodecki

We investigate the emergence of classicality and objectivity in arbitrary physical theories. First we provide an explicit example of a theory where there are no objective states. Then we characterize classical states of generic theories, and show how classical physics emerges through a decoherence process, which always exists in causal theories as long as there are classical states. We apply these results to the study of the emergence of objectivity, here recast as a multiplayer game. In particular, we prove that the so-called Spectrum Broadcast Structure characterizes all objective states in every causal theory, in the very same way as it does in quantum mechanics. This shows that the structure of objective states is valid across an extremely broad range of physical theories. Finally we show that, unlike objectivity, the emergence of local observers is not generic among physical theories, but it is only possible if a theory satisfies two axioms that rule out holistic behavior in composite systems.

Authors: Philippe Allard Guérin, Časlav Brukner

In general relativity, the causal structure between events is dynamical, but it is definite and observer-independent; events are point-like and the membership of an event A in the future or past light-cone of an event B is an observer-independent statement. When events are defined with respect to quantum systems however, nothing guarantees that event A is definitely in the future or in the past of B. We propose to associate a causal reference frame corresponding to each event, which can be interpreted as an observer-dependent time according to which an observer describes the evolution of quantum systems. In the causal reference frame of one event, this particular event is always localised, but other events can be “smeared out” in the future and in the past. We do not impose a predefined causal order between the events, but only require that descriptions from different reference frames obey a global consistency condition. We show that our new formalism is equivalent to the pure process matrix formalism. The latter is known to predict certain multipartite correlations, which are incompatible with the assumption of a causal ordering of the events — these correlations violate causal inequalities. We show how the causal reference frame description can be used to gain insight into the question of realisability of such strongly non-causal processes in laboratory experiments. As another application, we use causal reference frames to revisit a thought experiment where the gravitational time dilation due to a massive object in a quantum superposition of positions leads to a superposition of the causal ordering of two events.

Authors: Adam Bednorz

In the standard quantum theory, one can measure precisely only a subset of the incompatible observables. This results in the lack of a formal joint probability defining objective realism, even if we accept nonlocal or certain faster-than-light interactions. We propose a construction of such realism extending the usual single-copy description to many copies, partially analogous to familiar many worlds. Failure of the standard single-copy description can easily be probed experimentally. The copies should interact weakly at the macroscopic level, leading to effective collapse to a single identical pointer state. Experimental evidence for this conjecture could be obtained by detecting incomplete collapse in sequential measurements or finding deviations from the single-copy Born rule when observing simple quantum systems.

Authors: Diederik Aerts, Massimiliano Sassoli de Bianchi, Sandro Sozzo, Tomas Veloz

Since the inception of quantum mechanics, many physicists have seen in it the possibility, if not the necessity, of bringing cognitive aspects into play, which were instead absent, or unnoticed, in the previous classical theories. In this article, we outline the path that led us to support the hypothesis that our physical reality is fundamentally conceptual-like and cognitivistic-like. However, contrary to the ‘abstract ego hypothesis’ introduced by John von Neumann and further explored, in more recent times, by Henry Stapp, our approach does not rely on the measurement problem as expressing a possible ‘gap in physical causation’, which would point to a reality lying beyond the mind-matter distinction. On the contrary, in our approach the measurement problem is considered to be essentially solved, at least as concerns the origin of quantum probabilities, which we have reasons to believe are epistemic. Our conclusion that conceptuality and cognition would be an integral part of all physical processes comes instead from the observation of the striking similarities between the non-spatial behavior of quantum micro-physical entities and that of human concepts. This gave birth to a new interpretation of quantum mechanics, called the ‘conceptualistic interpretation’, currently under investigation within our group in Brussels.

Authors: Lee Smolin

Because of the non-locality of quantum entanglement, realist approaches to completing quantum mechanics have implications for our conception of space. Quantum gravity also is expected to predict phenomena in which the locality of classical spacetime is modified or disordered. It is then possible that the right quantum theory of gravity will also be a completion of quantum mechanics in which the foundational puzzles in both are addressed together. I review here the results of a program, developed with Roberto Mangabeira Unger, Marina Cortes and other collaborators, which aims to do just that. The results so far include energetic causal set models, time asymmetric extensions of general relativity and relational hidden variables theories, including real ensemble approaches to quantum mechanics. These models share two assumptions: that physics is relational and that time and causality are fundamental.

Authors: Samuel S. Cree, Tamara M. Davis, Timothy C. Ralph, Qingdi Wang, Zhen Zhu, William G. Unruh

The cosmological constant problem arises because the magnitude of vacuum energy density predicted by quantum mechanics is $\sim 120$ orders of magnitude larger than the value implied by cosmological observations of accelerating cosmic expansion. Recently some of the current authors proposed that the stochastic nature of the quantum vacuum can resolve this tension [1]. By treating the fluctuations in the vacuum seriously, and applying a high-energy cutoff at which Quantum Field Theory is believed to break down, a parametric resonance occurs that predicts a slow expansion and acceleration. In this work we more thoroughly examine the implications of this proposal by investigating the resulting dynamics. Firstly, we improve upon the numerical calculations and show that convergence issues with the original code had overshadowed some important effects. Some of the conclusions are thus reversed; however, the premise that parametric resonance can explain a very slowly accelerating expansion remains sound. After improving the resolution and efficiency of the numerical tests, we explore a wider range of cutoff energies, and examine the effects of multiple particle fields. We introduce a simple model using Mathieu’s equation, a prototypical example of parametric resonance, and find that it closely matches numerical results in regimes where its assumptions are valid. Using this model, we extrapolate to find that in a universe with $28$ bosonic fields and a high-energy cutoff $40$ times higher than the Planck energy, the acceleration would be comparable to what is observed.
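Mathieu’s equation, $\ddot{y} + (a - 2q\cos 2t)\,y = 0$, is the prototypical parametric-resonance model the authors invoke. A minimal numerical integration (a sketch with illustrative parameter values, not the paper’s cosmological code) shows the characteristic behaviour: exponential growth inside an instability tongue versus bounded oscillation outside it:

```python
import math

def mathieu_max_amplitude(a, q, t_end=50.0, dt=1e-3):
    """Integrate y'' + (a - 2*q*cos(2*t)) * y = 0 with RK4 and return
    the maximum |y| reached, starting from y = 1, y' = 0."""
    def accel(t, y):
        return -(a - 2.0 * q * math.cos(2.0 * t)) * y

    y, v, t = 1.0, 0.0, 0.0
    max_amp = abs(y)
    for _ in range(int(t_end / dt)):
        # classic RK4 step for the first-order system (y' = v, v' = accel)
        k1y, k1v = v, accel(t, y)
        k2y, k2v = v + 0.5 * dt * k1v, accel(t + 0.5 * dt, y + 0.5 * dt * k1y)
        k3y, k3v = v + 0.5 * dt * k2v, accel(t + 0.5 * dt, y + 0.5 * dt * k2y)
        k4y, k4v = v + dt * k3v, accel(t + dt, y + dt * k3y)
        y += dt * (k1y + 2 * k2y + 2 * k3y + k4y) / 6.0
        v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        t += dt
        max_amp = max(max_amp, abs(y))
    return max_amp

# a = 1 lies inside the first instability tongue: parametric resonance
unstable = mathieu_max_amplitude(a=1.0, q=0.3)
# a = 3 with the same q lies between tongues: bounded oscillation
stable = mathieu_max_amplitude(a=3.0, q=0.3)
assert unstable > 100 and stable < 10
```

Inside the first tongue the Floquet growth rate is roughly $q/2$, so over $t = 50$ the amplitude grows by several orders of magnitude, while the off-resonance solution merely oscillates — the same stable/unstable dichotomy the paper maps onto slow versus fast cosmic expansion.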

Franklin, Alexander (2018) Whence the Effectiveness of Effective Field Theories? The British Journal for the Philosophy of Science.

Author(s): Florian Fröwis, Pavel Sekatski, Wolfgang Dür, Nicolas Gisin, and Nicolas Sangouard

Schrödinger’s thought experiment of a cat in a superposition of being dead and alive.

[Rev. Mod. Phys. 90, 025004] Published Thu May 31, 2018

Walter, Scott A. (2018) Figures of light in the early history of relativity (1905-1914). [Preprint]
El Skaf, Rawad (2018) The function and limit of Galileo’s falling bodies thought experiment: Absolute weight, Specific weight and the Medium’s resistance. [Preprint]
Biener, Zvi (2017) The Certainty, Modality, and Grounding of Newton’s Laws. The Monist, 100 (3). pp. 311-325.
Fraser, James D. (2018) Towards a Realist View of Quantum Field Theory. [Preprint]

Authors: Andrew Coates, Charles Melby-Thompson, Shinji Mukohyama

In the context of Horava gravity, the most promising known scenarios to recover Lorentz invariance at low energy are the possibilities that (1) the renormalization group flow of the system leads to emergent infrared Lorentz invariance, and (2) supersymmetry protects infrared Lorentz invariance. A third scenario proposes that a classically Lorentz invariant matter sector with controlled quantum corrections may simply co-exist with Horava gravity under certain conditions. However, for non-projectable Horava gravity in 3+1 dimensions it is known that, in the absence of additional structures, this mechanism is spoiled by unexpected power-law divergences. We confirm this same result in the projectable version of the theory by employing the recently found gauge-fixing term that renders the shift and graviton propagators regular. We show that the problem persists for all dimensions $D\geq 3$, and that the degree of fine tuning in squared sound speeds between a U(1) gauge field and a scalar field increases with $D$. In particular, this difference in the zero external momentum limit is proportional to $\Lambda^{D-1}$ for $D\geq 3$, where $\Lambda$ is the ultraviolet momentum cutoff for loop integrals, while the power-law divergences are absent for $D=1$ and $D=2$. These results suggest that not only the gravity sector but also the matter sector should exhibit a transition to Lifshitz scaling above some scale, and that there should not be a large separation between the transition scales in the gravity and matter sectors. We close with a discussion of other more promising scenarios, including emergent Lorentz invariance from supersymmetry/strong dynamics, and point out challenges where they exist.

Authors: Nicolas Gisin

It is usual to identify initial conditions of classical dynamical systems with mathematical real numbers. However, almost all real numbers contain an infinite amount of information. Since a finite volume of space cannot contain more than a finite amount of information, I argue that the mathematical real numbers are not physically relevant. Moreover, a better terminology for the so-called real numbers is “random numbers”, as their series of bits are truly random. I propose an alternative classical mechanics, which is empirically equivalent to classical mechanics, but uses only finite-information numbers. This alternative classical mechanics is non-deterministic, despite the use of deterministic equations, in a way similar to quantum theory. Interestingly, both the alternative classical mechanics and quantum theory can be supplemented by additional variables in such a way that the supplemented theory is deterministic. Most physicists straightforwardly supplement classical theory with real numbers to which they attribute physical existence, while most physicists reject Bohmian mechanics as supplemented quantum theory, arguing that Bohmian positions have no physical reality. I argue that it is more economical and natural to accept non-determinism with potentialities as a real mode of existence, both for classical and quantum physics.
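Why the deep digits of a real-valued initial condition matter physically can be seen in any chaotic map: each iteration consumes roughly one more binary digit of the initial condition, so the long-run behaviour depends on digits far beyond any finite measurement. The following logistic-map sketch is an illustration of this sensitivity, not Gisin’s finite-information construction:

```python
def logistic_step(x):
    """One step of the chaotic logistic map x -> 4x(1-x)."""
    return 4.0 * x * (1.0 - x)

def max_separation(x1, x2, warmup=50, window=30):
    """Iterate both orbits past the divergence time, then report the
    largest separation seen during the observation window."""
    for _ in range(warmup):
        x1, x2 = logistic_step(x1), logistic_step(x2)
    sep = 0.0
    for _ in range(window):
        x1, x2 = logistic_step(x1), logistic_step(x2)
        sep = max(sep, abs(x1 - x2))
    return sep

# Two initial conditions agreeing to 12 decimal digits: after ~40 steps the
# orbits depend only on digits the two numbers do not share, so they have
# fully decorrelated.
sep = max_separation(0.123456789012, 0.123456789012 + 1e-12)
assert sep > 0.5
```

If those trailing digits are, as the abstract argues, not physically real, then the map’s long-run outcome is genuinely undetermined — which is exactly the non-determinism of the proposed alternative classical mechanics.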

Authors: Matteo Carlesso, Andrea Vinante, Angelo Bassi

Recently, a non-thermal excess noise, compatible with the theoretical prediction provided by collapse models, was measured in a millikelvin nanomechanical cantilever experiment [Vinante et al., Phys. Rev. Lett. 119, 110401 (2017)]. We propose a feasible implementation of the cantilever experiment able to probe such a noise. The proposed modification, completely within the grasp of current technology and readily implementable also in other types of mechanical non-interferometric experiments, consists in replacing the homogeneous test mass with one composed of layers of different materials. This will enhance the action of a possible collapse noise above that of standard noise sources.

Authors: N. David Mermin, Rüdiger Schack

We review the famous no-hidden-variables theorem in John von Neumann’s 1932 book on the mathematical foundations of quantum mechanics. We describe the notorious gap in von Neumann’s argument, pointed out by Grete Hermann in 1935 and, more famously, by John Bell in 1966. We disagree with recent papers claiming that Hermann and Bell failed to understand what von Neumann was actually doing.

Authors: Jochen Szangolies

In-principle restrictions on the amount of information that can be gathered about a system have been proposed as a foundational principle in several recent reconstructions of the formalism of quantum mechanics. However, it seems unclear precisely why one should be thus restricted. We investigate the notion of paradoxical self-reference as a possible origin of such epistemic horizons by means of a fixed-point theorem in Cartesian closed categories due to F. W. Lawvere that illuminates and unifies the different perspectives on self-reference.
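The diagonal argument underlying Lawvere’s fixed-point theorem can be made concrete: for any map $f\colon A \to Y^A$ with $Y = \{\text{True},\text{False}\}$, the “diagonal flip” $g(a) = \lnot f(a)(a)$ disagrees with every $f(a)$ at the point $a$, so $f$ cannot be surjective. This is only the set-level shadow of the categorical theorem the abstract refers to, sketched here for illustration:

```python
def diagonal_flip(f):
    """Given f: A -> (A -> bool), build g(a) = not f(a)(a).
    Lawvere-style diagonalization: g differs from f(a) at the point a,
    so g lies outside the image of f and f cannot be surjective."""
    return lambda a: not f(a)(a)

# A concrete f on a small domain (an arbitrary illustrative choice)
domain = range(5)
f = lambda a: (lambda b: (a + b) % 2 == 0)
g = diagonal_flip(f)

# g disagrees with every f(a) at the diagonal point a
assert all(g(a) != f(a)(a) for a in domain)
```

Lawvere’s theorem runs this argument in reverse: if such a map *were* surjective (a “point-surjection” in a Cartesian closed category), then every endomap of $Y$ would have a fixed point — and negation has none, which is the self-referential obstruction the paper connects to epistemic horizons.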

Authors: Yakir Aharonov, Lev Vaidman

The possibility of communicating between spatially separated regions, without even a single photon passing between the two parties, is an amazing quantum phenomenon. The possibility of transmitting one value of a bit in such a way, the interaction-free measurement, has been known for a quarter of a century. Protocols for full communication, including the transmission of unknown quantum states, were proposed only a few years ago, but it was shown that in all these protocols the particle leaves a weak trace in the transmission channel, a trace larger than that left by a single particle passing through the channel. This made the claim of counterfactuality of these protocols controversial at best. However, a simple modification of these recent protocols eliminates the trace in the transmission channel and makes all these protocols truly counterfactual.

Authors: Manabendra Nath Bera, Andreas Winter, Maciej Lewenstein

Thermodynamics and information have intricate inter-relations. That information is physical is justified by inter-linking information and thermodynamics through Landauer’s principle. This modern approach to information has recently improved our understanding of thermodynamics, in both the classical and quantum domains. Here we show that thermodynamics is a consequence of information conservation. Our approach applies to the most general situations, where systems and thermal baths can be quantum, of arbitrary size, and can even possess inter-system correlations. The approach does not rely on an a priori predetermined temperature associated with a thermal bath, which is not meaningful in finite-size cases. Hence thermal baths and systems are not treated differently; both are placed on an equal footing. This results in a “temperature”-independent formulation of thermodynamics. We exploit the fact that, for a fixed amount of coarse-grained information, measured by the von Neumann entropy, any system can be transformed to a state that possesses minimal energy without changing its entropy. This state is known as a completely passive state, which assumes the Boltzmann-Gibbs canonical form with an intrinsic temperature. This leads us to introduce the notions of bound and free energy, which we further use to quantify heat and work, respectively. With this guiding principle of information conservation, we develop universal notions of equilibrium, heat and work, and Landauer’s principle, as well as universal fundamental laws of thermodynamics. We show that the maximum efficiency of a quantum engine equipped with finite baths is in general lower than that of an ideal Carnot engine. We also introduce a resource-theoretic framework for intrinsic-temperature-based thermodynamics, within which we address the problem of work extraction and state transformations.
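The “minimal energy at fixed entropy” step can be sketched for the simplest case, a qubit: bisect on the inverse temperature $\beta$ so that the Gibbs state matches a target von Neumann entropy; the resulting $\beta$ plays the role of the intrinsic temperature. This is a toy illustration under that reading of the abstract, not the authors’ general framework:

```python
import math

def entropy(p):
    """Binary von Neumann entropy (in nats) of populations (p, 1-p)."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def intrinsic_beta(target_entropy, gap=1.0):
    """For a qubit with energy levels 0 and `gap`, find beta such that the
    Gibbs state exp(-beta*H)/Z has the target entropy. Among all states of
    that entropy, the Gibbs (completely passive) state has minimal energy."""
    lo, hi = 0.0, 50.0  # beta = 0 gives max entropy ln 2; large beta gives ~0
    for _ in range(200):  # bisection on the monotonic entropy(beta) curve
        mid = 0.5 * (lo + hi)
        p_excited = 1.0 / (1.0 + math.exp(mid * gap))  # upper-level population
        if entropy(p_excited) > target_entropy:
            lo = mid   # entropy still too high -> need colder (larger beta)
        else:
            hi = mid
    return 0.5 * (lo + hi)

beta = intrinsic_beta(0.5)          # target entropy: 0.5 nats (< ln 2)
p = 1.0 / (1.0 + math.exp(beta))    # excited-state population of Gibbs state
assert abs(entropy(p) - 0.5) < 1e-6
```

Energy in excess of this entropy-matched Gibbs state is what the abstract calls free energy (extractable as work); the remainder is bound energy, and the matched $\beta$ defines the intrinsic temperature even for a finite system.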

Myrvold, Wayne C. (2018) Ontology of Relativistic Collapse Theories. [Preprint]
