Weekly Papers on Quantum Foundations (19)

Publication date: Available online 17 May 2019

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Adam Koberinski

Abstract

In this paper I will focus on the case of the discovery of parity nonconservation in weak interactions during the period 1947–1957, and the lessons this episode provides for successful theory construction in HEP. I aim to (a) summarize the history into a coherent story for philosophers of science, and (b) use the history as a case study for the epistemological evolution of the understanding of weak interactions in HEP. I conclude with some philosophical lessons regarding theory construction in physics.

All most people hear about is quantum computing, but that’s hardly the whole story

— Read more on ScientificAmerican.com


Constraints on Symmetries from Holography


on 2019-5-17 10:00am GMT

Author(s): Daniel Harlow and Hirosi Ooguri

Insights from the AdS/CFT correspondence provide a glimpse of what global kinematical properties of viable quantum theories of gravity might be.


[Phys. Rev. Lett. 122, 191601] Published Fri May 17, 2019

Authors: Marco Piva

We explore the idea that, at small scales, gravity can be described by the standard degrees of freedom of general relativity, plus a scalar particle and a degree of freedom of a new type: the fakeon. This possibility has fundamental implications for understanding the gravitational force at the quantum level, as well as phenomenological consequences in the corresponding classical theory.

Authors: Robert L. Navin

This paper posits the existence of, and finds a candidate for, a variable change that allows quantum mechanics to be interpreted as quantum geometry. The Bohr model of the hydrogen atom is thought of in terms of an indeterministic electron position and a deterministic metric, and the motivation for this paper is to change variables so as to have a deterministic position and momentum for the electron and nucleus, but an indeterministic (quantum) metric that reproduces the physics of the Bohr model. This mapping is achieved by allowing the metric in the Hamiltonian to be different from the metric in the space-time distance element, representing the two metrics with vierbeins, and assuming they are canonically conjugate variables. Effectively, the usual Schrödinger space-time variables are re-interpreted as four of the potentially sixteen parameters of the metric tensor vierbein in the distance element, while the metric tensor vierbein in the Hamiltonian is an operator expressible as first-order derivatives in these variables, or vice versa. I then argue that this reproduces observed quantum physics at the sub-atomic level by demonstrating that the energy spectrum of electron orbitals is exactly the same as in the usual relativistic Bohr model of the hydrogen atom in a certain limit. Next, by introducing a single dimensionless running coupling that appears in the same place as, but in addition to, Planck's constant in the commutator definition, I argue that this allows massive objects to couple to the physical space-time geometry but not massless ones, regardless of the coupling value. This claim is based on a fit to the Schwarzschild metric under a few simple assumptions, yielding an effective theory of how the quantum geometries at nearby space-time points couple to one another. This demonstrates that the coupling constant is related to Newton's gravitational constant.

Authors: Arkady Bolotin

A common way of stating the no-cloning theorem — one of the distinguishing characteristics of quantum theory — is that one cannot make a copy of an arbitrary unknown quantum state. Even though this theorem is an important part of the ongoing discussion of the nature of a quantum state, the role of the theorem in the logical-algebraic approach to quantum theory has not yet been systematically studied. According to the standard point of view (which is in line with the logical tradition), quantum cloning amounts to two classical rules of inference, namely, monotonicity and idempotency of entailment. One can then conclude that the whole of quantum theory should be described through a logic in which these rules do not hold, namely linear logic. However, in accordance with a supervaluational semantics (which allows one to retain all the theorems of classical logic while admitting 'truth-value gaps'), quantum cloning necessitates the permanent loss of the truth values of experimental quantum propositions, which violates the unalterability of the past. The present paper demonstrates this.
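For readers outside quantum information, the theorem under discussion can be stated compactly. The following is the standard linearity argument, included here as background and not drawn from the paper itself:

```latex
% No-cloning: there is no unitary U and fixed blank state |e\rangle with
%   U(|\psi\rangle \otimes |e\rangle) = |\psi\rangle \otimes |\psi\rangle
% for every state |\psi\rangle. Sketch: suppose U clones the basis states,
U(|0\rangle \otimes |e\rangle) = |0\rangle \otimes |0\rangle, \qquad
U(|1\rangle \otimes |e\rangle) = |1\rangle \otimes |1\rangle .
% Linearity then forces, for |+\rangle = (|0\rangle + |1\rangle)/\sqrt{2},
U(|+\rangle \otimes |e\rangle)
  = \tfrac{1}{\sqrt{2}}\bigl(|0\rangle|0\rangle + |1\rangle|1\rangle\bigr)
  \;\neq\; |+\rangle \otimes |+\rangle
  = \tfrac{1}{2}\bigl(|0\rangle + |1\rangle\bigr)\bigl(|0\rangle + |1\rangle\bigr),
% so no single unitary can copy arbitrary unknown states.
```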

Authors: Juerg Froehlich

To begin with, some of the conundrums concerning Quantum Mechanics and its interpretation(s) are recalled. Subsequently, a sketch of the “ETH-Approach to Quantum Mechanics” is presented. This approach yields a logically coherent quantum theory of “events” featured by physical systems and of direct or projective measurements of physical quantities, without the need to invoke “observers”. It enables one to determine the stochastic time evolution of states of physical systems. We also briefly comment on the quantum theory of indirect or weak measurements, which is much easier to understand and more highly developed than the theory of direct (projective) measurements. A relativistic form of the ETH-Approach will be presented in a separate paper.

Authors: Dayou Yang, Andrey Grankin, Lukas M. Sieberer, Denis V. Vasilyev, Peter Zoller

An ideal quantum measurement collapses the wave function of a quantum system to an eigenstate of the measured observable, with the corresponding eigenvalue determining the measurement outcome. For a quantum non-demolition (QND) observable, i.e., one that commutes with the Hamiltonian generating the system's time evolution, repeated measurements yield the same result, corresponding to measurements with minimal disturbance. This concept applies universally to single quantum particles as well as to complex many-body systems. However, while QND measurements of systems with few degrees of freedom have been achieved in seminal quantum optics experiments, it is an open challenge to devise a QND measurement of a complex many-body observable. Here, we describe how a QND measurement of the Hamiltonian of an interacting many-body system can be implemented in a trapped-ion analog quantum simulator. Through a single-shot measurement, the many-body system is prepared in a narrow energy band of (highly excited) energy eigenstates, and potentially even a single eigenstate. Our QND scheme, which can be carried over to other platforms of quantum simulation, provides a novel framework to investigate experimentally fundamental aspects of equilibrium and non-equilibrium statistical physics, including the eigenstate thermalization hypothesis (ETH) and quantum fluctuation relations.
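The QND condition invoked in this abstract can be written in one line; this is the textbook definition, not anything specific to the trapped-ion scheme:

```latex
% A QND observable O commutes with the Hamiltonian H generating the dynamics:
[O, H] = 0
\quad\Longrightarrow\quad
U^{\dagger}(t)\, O\, U(t) = O, \qquad U(t) = e^{-iHt/\hbar}.
% A projective measurement of O leaves the system in an eigenstate of O;
% since that eigenstate is preserved by U(t), a second measurement at a
% later time returns the same eigenvalue. In the scheme above the measured
% observable is O = H itself, for which [H, H] = 0 holds trivially.
```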

Authors: Inge S. Helland

A conceptual variable is any variable defined by a person or by a group of persons. Such variables may be inaccessible, meaning that they cannot be measured with arbitrary accuracy on the physical system under consideration at any given time. An example may be the spin vector of a particle; another example may be the vector (position, momentum). In this paper, a space of inaccessible conceptual variables is defined, and group actions are defined on this space. Accessible functions are then defined on the same space. Assuming this structure, the basic Hilbert space structure of quantum theory is derived: Operators on a Hilbert space corresponding to the accessible variables are introduced; when these operators have a discrete spectrum, a natural model reduction implies a new model in which the values of the accessible variables are the eigenvalues of the operator. The principle behind this model reduction demands that a group action may also be defined on the accessible variables; this is possible if the corresponding functions are permissible, a term that is precisely defined. The following recent principle from statistics is assumed: every model reduction should be to an orbit or to a set of orbits of the group. From this derivation, a new interpretation of quantum theory is briefly discussed: I argue that a state vector may be interpreted as connected to a focused question posed to nature together with a definite answer to this question. Further discussion of these topics is provided in a recent book published by the author of this paper.

Authors: Inge S. Helland

The interpretation of quantum mechanics has been discussed since the theme was first brought up by Einstein and Bohr. This article describes a proposal for a new foundation of quantum theory, partly drawing upon ideas from statistical inference theory. The approach can be said to have an intuitive basis: The quantum states of a physical system are under certain conditions in one-to-one correspondence with the following: 1. Focus on a concrete question to nature and then 2. Give a definite answer to this question. This foundation implies an epistemic interpretation, depending upon the observer, but the objective world is restored when all observers agree on their observations of some variables. The article contains a survey of parts of the author's books on epistemic processes, which give more details about the theory. At the same time, the article extends some of the discussion in the books, and in places makes it more precise.

Publication date: Available online 11 May 2019

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Katsuaki Higashi

Abstract

According to a conventional view, there exists no common cause model of quantum correlations satisfying locality requirements. Indeed, Bell's inequality is derived from some locality requirements and the assumption that a common cause exists, and the violation of the inequality has been experimentally verified. On the other hand, some researchers argued that in the derivation of the inequality, the existence of a common common cause for multiple correlations is implicitly assumed and that this assumption is unreasonably strong. According to their idea, what is necessary for explaining the quantum correlations is a common cause for each correlation. However, Graßhoff et al. showed that when there are three pairs of perfectly correlated events and a common cause of each correlation exists, we cannot construct a common cause model that is consistent with quantum-mechanical predictions and also meets several locality requirements. In this paper, first, as a consequence of the fact shown by Graßhoff et al., we will confirm that there exists no local common cause model when a two-particle system is in any maximally entangled state. After that, based on Hardy's famous argument, we will prove that there exists no local common cause model when a two-particle system is in any non-maximally entangled state. Therefore, it will be concluded that for any entangled state, there exists no local common cause model. It will be revealed that the non-existence of a common cause model satisfying locality is not limited to a particular state like the singlet state.

Publication date: Available online 14 May 2019

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Jonathan F. Schonfeld

Abstract

I argue that the marquee characteristics of the quantum-mechanical double-slit experiment (point detection, random distribution, Born rule) can be explained using Schrödinger's equation alone, if one takes into account that, for any atom in a detector, there is a small but nonzero gap between its excitation energy and the excitation energies of all other relevant atoms in the detector (isolated-levels assumption). To illustrate the point I introduce a toy model of a detector. The form of the model follows common practice in quantum optics and cavity QED. Each detector atom can be resonantly excited by the incoming particle, and then emit a detection signature (e.g., a bright flash of light) or dissipate its energy thermally. Different atoms have slightly different resonant energies per the isolated-levels assumption, and the projectile preferentially excites the atom with the closest energy match. The toy model permits one to easily estimate the probability that any atom is resonantly excited, and also the probability that a detection signature is produced before being overtaken by thermal dissipation. The end-to-end detection probability is the product of these two probabilities, and is proportional to the absolute square of the incoming wavefunction at the atom in question, i.e. the Born rule. I consider how closely a published neutron interference experiment conforms to the picture developed here; I show how this paper's analysis steers clear of creating a scenario with local hidden variables; I show how the analysis steers clear of the irreversibility implicit in the projection postulate; and I discuss possible experimental tests of this paper's ideas. Hopefully, this is a significant step toward realizing the program of solving the measurement problem within unitary quantum mechanics envisioned by Landsman, among others.
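The factorization described in the abstract can be summarized schematically (the symbols below are illustrative notation, not the paper's own):

```latex
% End-to-end detection probability for detector atom k: the product of the
% probability that the projectile resonantly excites atom k and the
% probability that the excited atom emits a signature before dissipating,
P_k^{\mathrm{det}} = P_k^{\mathrm{exc}} \cdot P^{\mathrm{sig}},
% with the claimed result that this is proportional to the absolute square
% of the incoming wavefunction \psi at the atom's location x_k,
P_k^{\mathrm{det}} \propto |\psi(x_k)|^2,
% i.e. the Born rule, recovered from unitary (Schrödinger) dynamics alone.
```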

Ackermann, Matthias (2019) A Comparison of Two Presentations of Quantum Mechanics – Everett’s Relative-State and Rovelli’s Relational Quantum Mechanics. [Preprint]
Chall, Cristin and King, Martin and Mättig, Peter and Stöltzner, Michael (2019) From a Boson to the Standard Model Higgs: A Case Study in Confirmation and Model Dynamics. Synthese. ISSN 0039-7857
De Haro, Sebastian and Butterfield, Jeremy (2018) On symmetry and duality. [Preprint]
Schneider, Mike D. (2018) What’s the problem with the cosmological constant? [Preprint]
Schroeren, David (2019) Symmetry Fundamentalism. [Preprint]
Kiessling, Michael K.-H. (2019) The influence of gravity on the Boltzmann entropy of a closed universe. [Preprint]

Author(s): Eyuri Wakakuwa, Akihito Soeda, and Mio Murao

We prove a trade-off relation between the entanglement cost and classical communication round complexity of a protocol in implementing a class of two-qubit unitary gates by two distant parties, a key subroutine in distributed quantum information processing. The task is analyzed in an information the…

[Phys. Rev. Lett. 122, 190502] Published Thu May 16, 2019

Nature, Published online: 15 May 2019; doi:10.1038/s41586-019-1196-1

An array of superconducting qubits in an open one-dimensional waveguide is precisely controlled to create an artificial quantum cavity–atom system that reaches the strong-coupling regime without substantial decoherence.

As one of the most famous physicists of the 20th century, Richard Feynman was known for a lot. Early in his career, he contributed to the development of the first atomic bomb as a group leader of the Manhattan Project. Hans Bethe, the scientific leader of the project who won a Nobel Prize in Physics in 1967 (two years after Feynman did), has been quoted on what set his protégé apart: “There are two types of genius. Ordinary geniuses do great things, but they leave you room to believe that you could do the same if only you worked hard enough. Then there are magicians, and you can have no idea how they do it. Feynman was a magician.”

In his 1993 biography Genius, James Gleick called Feynman “brash,” “ebullient” and “the most brilliant, iconoclastic and influential physicist of modern times.” Feynman captured the popular imagination when he played the bongo drums and sang about orange juice. He was a fun-loving, charismatic practical joker who toured America on long road trips. His colleague Freeman Dyson described him as “half genius and half buffoon.” At times, his oxygen-sucking arrogance rubbed some the wrong way, and his performative sexism looks very different to modern eyes. Feynman will also be remembered for his teaching: The lectures he delivered to Caltech freshmen and sophomores in 1962 set the gold standard in physics instruction and, when later published as a three-volume set, sold millions of copies worldwide.

What most people outside of the physics community are likely to be least familiar with is the work that counts as Feynman’s crowning scientific achievement. With physicists in the late 1940s struggling to reformulate a relativistic quantum theory describing the interactions of electrically charged particles, Feynman conjured up some Nobel Prize-winning magic. He introduced a visual method to simplify the seemingly impossible calculations needed to describe basic particle interactions. As Gleick put it in Genius, “He took the half-made conceptions of waves and particles in the 1940s and shaped them into tools that ordinary physicists could use and understand.” Through the work of Feynman, Dyson, Julian Schwinger and Sin-Itiro Tomonaga, a new and improved theory of quantum electrodynamics was born.

Feynman’s lines and squiggles, which became known as Feynman diagrams, have since “revolutionized nearly every aspect of theoretical physics,” wrote the historian of science David Kaiser in 2005. “In the same way that computer-enabled computation might today be said to be enabling a genomic revolution, Feynman diagrams helped to transform the way physicists saw the world, and their place in it.”

To learn more about Feynman diagrams and how they’ve changed the way physicists work, watch our new In Theory video:

Decades later, as Natalie Wolchover reported in 2013, “it became apparent that Feynman’s apparatus was a Rube Goldberg machine.” Even the collision of two subatomic particles called gluons to produce four less energetic gluons, an event that happens billions of times a second during collisions at the Large Hadron Collider, she wrote, “involves 220 diagrams, which collectively contribute thousands of terms to the calculation of the scattering amplitude.” Now, a group of physicists and mathematicians is studying a geometric object called an “amplituhedron” that has the potential to further simplify calculations of particle interactions.

Meanwhile, other physicists hope that emerging connections between Feynman diagrams and number theory can help identify patterns in the values generated from more complicated diagrams. As Kevin Hartnett reported in 2016, understanding these patterns could make particle calculations much simpler, but like the amplituhedron approach, this is still a work in progress.

“Feynman diagrams remain a treasured asset in physics because they often provide good approximations to reality,” wrote the Nobel Prize-winning physicist Frank Wilczek three years ago. “They help us bring our powers of visual imagination to bear on worlds we can’t actually see.”

If you liked the fifth and final episode from season two of Quanta’s In Theory video series, you may also enjoy our previous videos on universality, quantum gravity, emergence and turbulence.


Author(s): F. Laloë

We discuss a model of spontaneous collapse of the quantum state that does not require adding any stochastic processes to the standard dynamics. The additional ingredient with respect to the wave function is a position in the configuration space which drives the collapse in a completely deterministic…

[Phys. Rev. A 99, 052111] Published Tue May 14, 2019

Dieks, Dennis (2019) Quantum Reality, Perspectivalism and Covariance. [Preprint]
Henriksson, Andreas (2019) On the ergodic theorem and information loss in statistical mechanics. [Preprint]
Martens, Niels C.M. (2019) Machian Comparativism about Mass. The British Journal for the Philosophy of Science. ISSN 1464-3537
Henriksson, Andreas (2019) On the Gibbs-Liouville theorem in classical mechanics. [Preprint]
Palacios, Patricia (2019) Phase Transitions: A Challenge for Intertheoretic Reduction? [Preprint]
Kastner, Ruth (2019) The “Delayed Choice Quantum Eraser” Neither Erases Nor Delays. [Preprint]
