Weekly Papers on Quantum Foundations (8)

Peacock, Kent A. (2014) Would Superluminal Influences Violate the Principle of Relativity? [Published Article or Volume]
Vervoort, Louis (2014) The Manipulability Account of Causation Applied to Typical Physical Systems. [Published Article or Volume]

Author(s): Mario Krenn, Armin Hochrainer, Mayukh Lahiri, and Anton Zeilinger

Quantum entanglement is one of the most prominent features of quantum mechanics and forms the basis of quantum information technologies. Here we present a novel method for the creation of quantum entanglement in multipartite and high-dimensional systems. The two ingredients are (i) superposition of …
[Phys. Rev. Lett. 118, 080401] Published Thu Feb 23, 2017

Author(s): Keren Li, Guofei Long, Hemant Katiyar, Tao Xin, Guanru Feng, Dawei Lu, and Raymond Laflamme

Superposition, arguably the most fundamental property of quantum mechanics, lies at the heart of quantum information science. However, how to create the superposition of any two unknown pure states remains a daunting challenge. Recently, it was proved that such a quantum protocol does not exist i…
[Phys. Rev. A 95, 022334] Published Thu Feb 23, 2017

Authors: Igor Pikovski, Magdalena Zych, Fabio Costa, Časlav Brukner

In a recent paper (arXiv:1701.04298 [quant-ph]) Toroš, Großardt and Bassi claim that the potential necessary to support a composite particle in a gravitational field must necessarily cancel the relativistic coupling between internal and external degrees of freedom. As such a coupling is responsible for the gravitational redshift measured in numerous experiments, the above statement is clearly incorrect. We identify the simple mistake in the paper responsible for the incorrect claim.

Authors: Oziel de Araujo, Helder Alexander, Marcos Sampaio, Irismar da Paz

Quantum correlations of observables for two-particle states have demonstrated the nonlocal character of quantum mechanics. However, nonlocality can be exhibited even for noncommuting observables of a single-particle system. In this paper we show nonlocality of position-momentum correlations of a single particle in the double-slit experiment, modeled by an initially correlated Gaussian wavepacket. The positivity or negativity of the Wigner function for the state of the particle at the detection screen is related to the $\sigma_{xp}$ covariances. A Bell inequality is constructed from the Wigner function and is violated in both the positive and negative cases. The case of a positive Wigner function is the single-particle analog of the original EPR state.

Pitts, J. Brian (2017) Underconsideration in Space-time and Particle Physics. [Preprint]

Authors: Philipp A Hoehn

We summarise a recent reconstruction of the quantum theory of qubits from rules constraining an observer’s acquisition of information about physical systems. This review of [arXiv:1412.8323, arXiv:1511.01130] is accessible and fairly self-contained, focussing on the main ideas and results and not the technical details. The reconstruction offers an informational explanation for the architecture of the theory and specifically for its correlation structure. In particular, it explains entanglement, monogamy and non-locality compellingly from limited accessible information and complementarity. As a byproduct, it also unravels new ‘conserved informational charges’ from complementarity relations that characterise the unitary group and the set of pure states.

Authors: Paolo Glorioso, Hong Liu

The second law of thermodynamics states that for a thermally isolated system entropy never decreases. Most physical processes we observe in nature involve variations of macroscopic quantities over spatial and temporal scales much larger than microscopic molecular collision scales and thus can be considered as in local equilibrium. For a many-body system in local equilibrium a stronger version of the second law applies which says that the entropy production at each spacetime point should be non-negative. In this paper we provide a proof of the second law for such systems and a first derivation of the local second law. For this purpose we develop a general non-equilibrium effective field theory of slow degrees of freedom from integrating out fast degrees of freedom in a quantum many-body system and consider its classical limit. The key elements of the proof are the presence of a $Z_2$ symmetry, which can be considered as a proxy for local equilibrium and micro-time-reversibility, and a classical remnant of quantum unitarity. The $Z_2$ symmetry leads to a local current from a procedure analogous to that used in the Noether theorem. Unitarity leads to a definite sign of the divergence of the current. We also discuss the origin of an arrow of time, as well as the coincidence of causal and thermodynamical arrows of time. Applied to hydrodynamics, the proof gives a first-principle derivation of the phenomenological entropy current condition and provides a constructive procedure for obtaining the entropy current.
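The local strengthening of the second law described above can be written compactly; this is a sketch in standard hydrodynamic notation, with $s^\mu$ denoting the entropy current:

```latex
\partial_\mu s^\mu(x) \geq 0 \quad \text{at every spacetime point } x .
```

Integrating over space, with the entropy flux vanishing at the boundary, recovers the global second law, $\frac{dS}{dt} = \frac{d}{dt}\int d^3x \, s^0 \geq 0$, which shows why the pointwise condition is the stronger statement.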

Authors: David Edward Bruschi

We propose a time evolution formalism for quantum systems. This formalism falls within the scope of a recently proposed theory of gravitating quantum matter. We find that systems in nonclassical states tend to “decohere” towards the energy eigenstate with highest energy. The decoherence time depends on the amount of coherence, or entanglement, present in the state. The theory implies that time evolution of quantum systems depends on their initial state. The scope, applications and validity of this proposal are also discussed.

Authors: Peter Bierhorst, Emanuel Knill, Scott Glancy, Alan Mink, Stephen Jordan, Andrea Rommal, Yi-Kai Liu, Bradley Christensen, Sae Woo Nam, Lynden K. Shalm

Random numbers are an important resource for applications such as numerical simulation and secure communication. However, it is difficult to certify whether a physical random number generator is truly unpredictable. Here, we exploit the phenomenon of quantum nonlocality in a loophole-free photonic Bell test experiment for the generation of randomness that cannot be predicted within any physical theory that allows one to make independent measurement choices and prohibits superluminal signaling. To certify and quantify the randomness, we describe a new protocol that performs well in an experimental regime characterized by low violation of Bell inequalities. Applying an extractor function to our data, we obtained 256 new random bits, uniform to within 0.001.

Authors: Matthias Lienert, Sören Petrat, Roderich Tumulka

In non-relativistic quantum mechanics of $N$ particles in three spatial dimensions, the wave function $\psi(q_1,\ldots,q_N,t)$ is a function of $3N$ position coordinates and one time coordinate. It is an obvious idea that in a relativistic setting, such functions should be replaced by $\phi((t_1,q_1),\ldots,(t_N,q_N))$, a function of $N$ space-time points called a multi-time wave function because it involves $N$ time variables. Its evolution is determined by $N$ Schrödinger equations, one for each time variable; to ensure that simultaneous solutions to these $N$ equations exist, the $N$ Hamiltonians need to satisfy a consistency condition. This condition is automatically satisfied for non-interacting particles, but it is not obvious how to set up consistent multi-time equations with interaction. For example, interaction potentials (such as the Coulomb potential) make the equations inconsistent, except in very special cases. However, there have been recent successes in setting up consistent multi-time equations involving interaction, in two ways: either involving zero-range ($\delta$ potential) interaction or involving particle creation and annihilation. The latter equations provide a multi-time formulation of a quantum field theory. The wave function in these equations is a multi-time Fock function, i.e., a family of functions consisting of, for every $n=0,1,2,\ldots$, an $n$-particle wave function with $n$ time variables. These wave functions are related to the Tomonaga-Schwinger approach and to quantum field operators, but, as we point out, they have several advantages.
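The consistency condition mentioned in the abstract can be stated explicitly; in units with $\hbar = 1$, the multi-time system reads (a sketch of the standard formulation):

```latex
i \frac{\partial \phi}{\partial t_k} = H_k \, \phi , \qquad k = 1, \ldots, N ,
```

and simultaneous solutions for generic initial data exist only if

```latex
\left[ H_j - i \frac{\partial}{\partial t_j} ,\; H_k - i \frac{\partial}{\partial t_k} \right] = 0 \qquad \text{for all } j \neq k ,
```

which for time-independent Hamiltonians reduces to $[H_j, H_k] = 0$. Free-particle Hamiltonians act on disjoint sets of coordinates and therefore commute, which is why the non-interacting case is automatically consistent, while a Coulomb potential $V(q_j - q_k)$ entangles the two time evolutions and generically breaks the condition.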

Authors: N. Cottet, S. Jezouin, L. Bretheau, P. Campagne-Ibarcq, Q. Ficheux, J. Anders, A. Auffèves, R. Azouit, P. Rouchon, B. Huard

In apparent contradiction to the laws of thermodynamics, Maxwell’s demon is able to cyclically extract work from a system in contact with a thermal bath exploiting the information about its microstate. The resolution of this paradox required the insight that an intimate relationship exists between information and thermodynamics. Here, we realize a Maxwell demon experiment that tracks the state of each constituent both in the classical and quantum regimes. The demon is a microwave cavity that encodes quantum information about a superconducting qubit and converts information into work by powering up a propagating microwave pulse by stimulated emission. Thanks to the high level of control of superconducting circuits, we directly measure the extracted work and quantify the entropy remaining in the demon’s memory. This experiment provides an enlightening illustration of the interplay of thermodynamics with quantum information.
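The thermodynamic budget behind any such demon is set by the Landauer bound: at most $k_B T \ln 2$ of work per bit of information, which is also the minimum cost of erasing the demon's memory. A back-of-the-envelope sketch (the function name and the room-temperature evaluation are illustrative, not from the paper):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_limit(T):
    """Maximum work (in joules) extractable per bit of information
    from a bath at temperature T; equivalently, the minimum heat
    dissipated when erasing one bit of the demon's memory."""
    return k_B * T * math.log(2)

W_room = landauer_limit(300.0)  # roughly 2.9e-21 J per bit at 300 K
```

The tiny size of this number is why the experiment works with single microwave photons and a superconducting qubit: only at such scales is the per-bit work comparable to the system's own energy.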

McCoy, C.D. (2017) Prediction in General Relativity. [Published Article or Volume]
Publication date: Available online 16 February 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Dennis Dieks
According to what has become a standard history of quantum mechanics, in 1932 von Neumann persuaded the physics community that hidden variables are impossible as a matter of principle, after which leading proponents of the Copenhagen interpretation put the situation to good use by arguing that the completeness of quantum mechanics was undeniable. This state of affairs lasted, so the story continues, until Bell in 1966 exposed von Neumann’s proof as obviously wrong. The realization that von Neumann’s proof was fallacious then rehabilitated hidden variables and made serious foundational research possible again. It is often added in recent accounts that von Neumann’s error had been spotted almost immediately by Grete Hermann, but that her discovery was of no effect due to the dominant Copenhagen Zeitgeist. We shall attempt to tell a story that is more historically accurate and less ideologically charged. Most importantly, von Neumann never claimed to have shown the impossibility of hidden variables tout court, but argued that hidden-variable theories must possess a structure that deviates fundamentally from that of quantum mechanics. Both Hermann and Bell appear to have missed this point; moreover, both raised unjustified technical objections to the proof. Von Neumann’s argument was basically that hidden-variables schemes must violate the “quantum principle” that physical quantities are to be represented by operators in a Hilbert space. As a consequence, hidden-variables schemes, though possible in principle, necessarily exhibit a certain kind of contextuality. As we shall illustrate, early reactions to Bohm’s theory are in agreement with this account. Leading physicists pointed out that Bohm’s theory has the strange feature that pre-existing particle properties do not generally reveal themselves in measurements, in accordance with von Neumann’s result. They did not conclude that the “impossible was done” and that von Neumann had been shown wrong.

Publication date: Available online 15 February 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): O.J.E. Maroney
A quantum pre- and post-selection paradox involves making measurements at two separate times on a quantum system, and making inferences about the state of the system at an intermediate time, conditional upon the observed outcomes. The inferences lead to predictions about the results of measurements performed at the intermediate time, which have been well confirmed experimentally, but which nevertheless seem paradoxical when inferences about different intermediate measurements are combined. The three box paradox is the paradigm example of such an effect, where a ball is placed in one of three boxes and is shuffled between the boxes in between two measurements of its location. By conditionalising on the outcomes of those measurements, it is inferred that between the two measurements the ball would have been found with certainty in Box 1 and with certainty in Box 2, had either box been opened on its own. Despite experimental confirmation of the predictions, and much discussion, it has remained unclear what exactly is supposed to be paradoxical, or what specifically is supposed to be quantum, about these effects. In this paper I identify precisely the conditions under which the quantum three box paradox occurs, and show that these conditions are the same as arise in the derivation of the Leggett–Garg Inequality, which is supposed to demonstrate the incompatibility of quantum theory with macroscopic realism. I will argue that, as in Leggett–Garg Inequality violations, the source of the effect actually lies in the disturbance introduced by the intermediate measurement, and that the quantum nature of the effect is that no classical model of measurement disturbance can reproduce the paradox.
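The three-box inferences follow from the standard ABL (Aharonov–Bergmann–Lebowitz) rule applied to the usual pre- and post-selected states. A minimal sketch, assuming the textbook choice of states (the function name is illustrative):

```python
import numpy as np

# Standard three-box states: prepare an equal superposition over the
# three boxes, then post-select on a state with a sign flip on box 3.
pre = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
post = np.array([1.0, 1.0, -1.0]) / np.sqrt(3)

def abl_probability(box):
    """ABL probability that an intermediate opening of `box`
    (a two-outcome measurement: in this box vs. not) finds the ball,
    conditioned on both the pre- and post-selection succeeding."""
    P = np.zeros((3, 3))
    P[box, box] = 1.0          # projector onto the opened box
    Q = np.eye(3) - P          # projector onto the other two boxes
    amp_in = post @ P @ pre    # amplitude: found in the box
    amp_out = post @ Q @ pre   # amplitude: not found in the box
    return abs(amp_in) ** 2 / (abs(amp_in) ** 2 + abs(amp_out) ** 2)

p1 = abl_probability(0)  # = 1.0: certainly found in box 1, if opened alone
p2 = abl_probability(1)  # = 1.0: certainly found in box 2, if opened alone
```

The "not found" amplitudes for boxes 1 and 2 vanish by destructive interference between the remaining two boxes, which is exactly the feature the paper traces back to measurement disturbance rather than to the ball's pre-existing location.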
