Weekly Papers on Quantum Foundations (32)

Author(s): S. Bachmann, W. De Roeck, and M. Fraas

The first proof of the quantum adiabatic theorem was given as early as 1928. Today, this theorem is increasingly applied in a many-body context, e.g., in quantum annealing and in studies of topological properties of matter. In this setup, the rate of variation ϵ of local terms is indeed small compar…
[Phys. Rev. Lett. 119, 060201] Published Fri Aug 11, 2017

Authors: Leonard Susskind

These are some thoughts contained in a letter to colleagues, about the close relation between gravity and quantum mechanics, and also about the possibility of seeing quantum gravity in a lab equipped with quantum computers. I expect this will become feasible sometime in the next decade or two.

Authors: Naoki Yamamoto

In the classical gravitational lensing, a light ray is locally deflected in the direction of $-{\boldsymbol \nabla} \phi$ by the gravitational potential $\phi$. We show that the quantum correction due to the helicity of photons leads to the displacement of the trajectory of light in the direction perpendicular to both ${\boldsymbol \nabla} \phi$ and the classical trajectory, and, in particular, to the splitting of the trajectories of right- and left-handed circularly polarized light. We derive the expression for this gravitational quantum Hall effect (GQHE) in terms of the Berry curvature of photons. We also derive the semiclassical equation of motion for gravitons taking into account the Berry curvature, and find that the GQHE of gravitational waves in curved space is twice as large as that of light.
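The Berry-curvature semiclassics underlying such effects is standard in the literature; schematically (a sketch of the generic equations of motion, not necessarily the paper's exact conventions), a photon of helicity $\lambda = \pm 1$ in a weak potential $\phi$ obeys
$$
\dot{\boldsymbol{x}} = \frac{\partial \varepsilon}{\partial \boldsymbol{k}} - \dot{\boldsymbol{k}} \times \boldsymbol{\Omega}_{\boldsymbol{k}}, \qquad
\dot{\boldsymbol{k}} = -|\boldsymbol{k}|\, {\boldsymbol \nabla} \phi, \qquad
\boldsymbol{\Omega}_{\boldsymbol{k}} = \lambda \, \frac{\boldsymbol{k}}{|\boldsymbol{k}|^{3}},
$$
where $\boldsymbol{\Omega}_{\boldsymbol{k}}$ is the Berry curvature. The anomalous velocity term $-\dot{\boldsymbol{k}} \times \boldsymbol{\Omega}_{\boldsymbol{k}}$ is perpendicular to both ${\boldsymbol \nabla} \phi$ and the trajectory, and flips sign with $\lambda$, which is the origin of the splitting of the two circular polarizations; for gravitons, with helicity $\pm 2$, the effect doubles.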

Authors: Gautam Sharma, Arun Kumar Pati

Quantum measurements necessarily disturb the state of a physical system. Once we perform a complete measurement, the system decoheres and loses its coherence; if there is no disturbance, the state retains all of its coherence. It is therefore natural to ask whether there is a trade-off between the disturbance caused to a state and its coherence. We present a coherence-disturbance complementarity relation using the relative entropy of coherence. For bipartite states we prove a complementarity relation between quantum coherence, entanglement, and disturbance. A similar relation also holds for quantum coherence, quantum discord, and disturbance for a bipartite state. We illustrate the trade-off between coherence and disturbance for a single-qubit state under various quantum channels.
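The relative entropy of coherence used here has a standard closed form, $C_r(\rho) = S(\Delta(\rho)) - S(\rho)$, where $\Delta$ deletes off-diagonal elements in a fixed basis. A minimal sketch of the qubit trade-off (the phase-damping parametrization and state are illustrative choices, not taken from the paper):

```python
import numpy as np

def vn_entropy(rho):
    # von Neumann entropy in bits
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def rel_entropy_coherence(rho):
    # C_r(rho) = S(diag(rho)) - S(rho), coherence in the computational basis
    diag = np.diag(np.diag(rho))
    return vn_entropy(diag) - vn_entropy(rho)

# |+><+| : maximally coherent qubit state
plus = np.array([[0.5, 0.5], [0.5, 0.5]])

def dephase(rho, p):
    # phase-damping channel: off-diagonal elements shrink by (1 - p)
    out = rho.copy()
    out[0, 1] *= (1 - p)
    out[1, 0] *= (1 - p)
    return out

for p in (0.0, 0.5, 1.0):
    print(p, round(rel_entropy_coherence(dephase(plus, p)), 3))
# coherence falls monotonically from 1 bit (p = 0) to 0 (p = 1)
```

As the channel disturbs the state more strongly (larger p), the surviving coherence decreases, which is the qualitative content of the complementarity relation.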

Authors: Yuqian Zhou, Yu Cai, Jean-Daniel Bancal, Fei Gao, Valerio Scarani

There is an ongoing search for a physical or operational definition of quantum mechanics. Several informational principles have been proposed, but they are satisfied by theories less restrictive than quantum mechanics. Here, we introduce the principle of “many-box locality”, which is a refined version of the previously proposed “macroscopic locality”. These principles are based on coarse-graining the statistics of several copies of a given box. The set of behaviors satisfying many-box locality for $N$ boxes is denoted $MBL_N$. We study these sets in the bipartite scenario with two binary measurements, in relation with the sets $\mathcal{Q}$ and $\mathcal{Q}_{1+AB}$ of quantum and “almost quantum” correlations. We find that the $MBL_N$ sets are in general not convex. For unbiased marginals, by working in the Fourier space we can prove analytically that $MBL_{N}\subsetneq\mathcal{Q}$ for any finite $N$, while $MBL_{\infty}=\mathcal{Q}$. Then, with suitably developed numerical tools, we find an example of a point that belongs to $MBL_{16}$ but not to $\mathcal{Q}_{1+AB}$. Among the problems that remain open is whether $\mathcal{Q}\subset MBL_{\infty}$.
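The “boxes” in question are conditional distributions $P(ab|xy)$. A standard textbook example of a behavior outside $\mathcal{Q}$ is the PR box, whose CHSH value of 4 exceeds the Tsirelson bound $2\sqrt{2}$; a minimal sketch (the simulation is an illustration of the box formalism, not the paper's coarse-graining construction):

```python
import numpy as np

rng = np.random.default_rng(0)

def pr_box_outputs(x, y, n):
    # PR box: outputs a, b in {0,1}, uniform marginals, a XOR b = x AND y
    a = rng.integers(0, 2, size=n)
    b = a ^ (x & y)
    return a, b

def chsh(n=100_000):
    # CHSH value E(0,0) + E(0,1) + E(1,0) - E(1,1), with E = <(-1)^(a XOR b)>
    total = 0.0
    for x in (0, 1):
        for y in (0, 1):
            a, b = pr_box_outputs(x, y, n)
            e = np.mean((-1.0) ** (a ^ b))
            total += e if (x, y) != (1, 1) else -e
    return total

print(chsh())  # 4.0, above the Tsirelson bound 2*sqrt(2) ~ 2.83
```

Many-box locality then asks whether the coarse-grained statistics of $N$ such boxes measured jointly admit a local model, which singles out the quantum set as $N \to \infty$.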

Author(s): M. Lanzagorta and T. Crowder

We show that the paradox published by Saldanha and Vedral [Phys. Rev. A 87, 042102 (2013)] is based on an improper interpretation of the localization probability density and a miscalculation of the wave functions. This Comment shows that there is no paradox as stated.
[Phys. Rev. A 96, 026101] Published Thu Aug 10, 2017

Authors: R. E. Kastner, S. Kauffman

It is suggested that the apparently disparate cosmological phenomena attributed to so-called ‘dark matter’ and ‘dark energy’ arise from the same fundamental physical process: the emergence, from the quantum level, of spacetime itself. This creation of spacetime results in metric expansion around mass points in addition to the usual curvature due to stress-energy sources of the gravitational field. A recent modification of Einstein’s theory of general relativity by Chadwick, Hodgkinson, and McDonald incorporating spacetime expansion around mass points, which accounts well for the observed galactic rotation curves, is adduced in support of the proposal. Recent observational evidence corroborates a prediction of the model that the apparent amount of ‘dark matter’ increases with the age of the universe. In addition, the proposal leads to the same result for the small but nonvanishing cosmological constant, related to ‘dark energy’, as that of the causet model of Sorkin et al.

Gao, Shan (2017) Failure of psychophysical supervenience in Everett’s theory. [Preprint]
List, Christian (2017) Levels: descriptive, explanatory, and ontological. [Preprint]

Authors: Carlo Rovelli

A new phenomenon, recently studied in theoretical physics, may have considerable interest for astronomers: the explosive decay of old primordial black holes via quantum tunnelling. Models predict radio and gamma bursts with a characteristic frequency-distance relation making them identifiable. Their detection would be of major theoretical importance.

Author(s): Apoorva Patel and Parveen Kumar

Projective measurement is used as a fundamental axiom in quantum mechanics, even though it is discontinuous and cannot predict which measured operator eigenstate will be observed in which experimental run. The probabilistic Born rule gives it an ensemble interpretation, predicting proportions of var…
[Phys. Rev. A 96, 022108] Published Mon Aug 07, 2017

Author(s): Arun Sehrawat

The Born rule provides a probability vector (distribution) with a quantum state for a measurement setting. For two settings, we have a pair of vectors from the same quantum state. Each pair forms a combined-probability vector that obeys certain quantum constraints, which are triangle inequalities in…
[Phys. Rev. A 96, 022111] Published Mon Aug 07, 2017

Authors: Angel J. Gallego, Roman Orus

In this paper we consider some well-known facts in syntax from a physics perspective, which allows us to establish some remarkable equivalences. Specifically, we observe that the operation MERGE put forward by N. Chomsky in 1995 can be interpreted as a physical information coarse-graining. Thus, MERGE in linguistics entails information renormalization in physics, according to different time scales. We make this point mathematically formal in terms of language models, i.e., probability distributions over word sequences, widely used in natural language processing as well as other domains. In this setting, MERGE corresponds to a 3-index probability tensor implementing a coarse-graining, akin to a probabilistic context-free grammar. The probability vectors of meaningful sentences are naturally given by tensor networks (TN) that are mostly loop-free, such as Tree Tensor Networks and Matrix Product States. These structures have short-ranged correlations in the syntactic distance by construction and, because of the peculiarities of human language, they are extremely efficient to manipulate computationally. We also propose how to obtain such language models from probability distributions of certain TN quantum states, which we show to be efficiently preparable by a quantum computer. Moreover, using tools from quantum information and entanglement theory, we use these quantum states to prove classical lower bounds on the perplexity of the probability distribution for a set of words in a sentence. Implications of these results are discussed in the areas of theoretical and computational linguistics, artificial intelligence, programming languages, RNA and protein sequencing, quantum many-body systems, and beyond.
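The “3-index probability tensor” picture is easy to make concrete: a MERGE tensor $M[c,a,b] = P(c \mid a, b)$ maps distributions over two children onto a distribution over the merged phrase, exactly like a probabilistic context-free grammar rule. A toy sketch (the vocabulary size and random tensors are hypothetical placeholders, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

V = 4  # toy vocabulary / phrase-label size (hypothetical)

def random_stochastic(shape):
    # conditional distribution, normalized over the first axis
    t = rng.random(shape)
    return t / t.sum(axis=0, keepdims=True)

# 3-index MERGE tensor: M[c, a, b] = P(phrase c | left child a, right child b)
M = random_stochastic((V, V, V))

# distributions over the two children being merged
p_left = random_stochastic((V,))
p_right = random_stochastic((V,))

# coarse-grain: contract the children into a single phrase distribution
p_phrase = np.einsum('cab,a,b->c', M, p_left, p_right)

print(p_phrase.sum())  # 1.0: the contraction preserves normalization
```

Composing such contractions along a parse tree yields exactly the loop-free tensor networks (Tree Tensor Networks) the abstract describes.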


This paper considers the importance of unification in the context of developing scientific theories. I argue that unifying hypotheses are not valuable simply because they are supported by multiple lines of evidence. Instead, they can be valuable because they guide experimental research in different domains in such a way that the results from those experiments inform the scope of the theory being developed. I support this characterization by appealing to the early development of quantum theory. I then draw some comparisons with discussions of robustness reasoning.
