# Weekly Papers on Quantum Foundations (45)

Authors: Y. S. Kim

Two-photon states possess the symmetry needed for Dirac’s construction of the two-oscillator system, which produces the Lie algebra of the O(3,2) space-time symmetry. This O(3,2) group can be contracted to the inhomogeneous Lorentz group, which, according to Dirac, serves as the basic space-time symmetry for quantum mechanics in the Lorentz-covariant world. Since the harmonic oscillator serves as the language of Heisenberg’s uncertainty relations, it is right to say that the symmetry of the Lorentz-covariant world, with Einstein’s $E = mc^2$, is derivable from Heisenberg’s uncertainty relations.

Photon interference and bunching are widely studied quantum effects that have also been proposed for high precision measurements. Here we construct a theoretical description of photon-interferometry on rotating platforms, specifically exploring the relation between non-inertial motion, relativity, and quantum mechanics. On the basis of this, we then propose an experiment where photon entanglement can be revealed or concealed solely by controlling the rotational motion of an interferometer, thus providing a route towards studies at the boundary between quantum mechanics and relativity.

9:43 AM | quant-ph updates on arXiv.org

We discuss and review in this chapter the developing field of research of quantum simulation of gauge theories with ultracold atoms.

9:43 AM | quant-ph updates on arXiv.org

Authors: Cesar Gomez

For any quantum algorithm given by a path in the space of unitary operators, we define the computational complexity as the typical computational time associated with the path. This time is defined using a quantum time estimator associated with the path. This quantum time estimator is fully characterized by the Lyapunov generator of the path and the corresponding quantum Fisher information. The computational metric associated with this definition of computational complexity leads to a natural characterization of cost factors on the Lie algebra generators. Operator complexity growth in time is analyzed from this perspective, leading to a simple characterization of the Lyapunov exponent in the case of chaotic Hamiltonians. The connection between complexity and entropy is expressed using the relation between the quantum Fisher information about quantum time estimation and the von Neumann entropy. This relation suggests a natural bound on computational complexity that generalizes the standard time–energy quantum uncertainty relation. The connection between the Lyapunov exponent and the modular Hamiltonian is briefly discussed. In the case of theories with holographic duals, and for reduced density matrices defined by tracing over a bounded region of the bulk, quantum estimation theory is crucial for estimating quantum mechanically the geometry of the tracing region. It is suggested that the corresponding quantum Fisher information associated with this estimation problem is at the root of the holographic bulk geometry.

9:43 AM | quant-ph updates on arXiv.org

Authors: Maximilian Schlosshauer

Quantum decoherence plays a pivotal role in the dynamical description of the quantum-to-classical transition and is the main impediment to the realization of devices for quantum information processing. This paper gives an overview of the theory and experimental observation of the decoherence mechanism. We introduce the essential concepts and the mathematical formalism of decoherence, focusing on the picture of the decoherence process as a continuous monitoring of a quantum system by its environment. We review several classes of decoherence models and discuss the description of the decoherence dynamics in terms of master equations. We survey methods for avoiding and mitigating decoherence and give an overview of several experiments that have studied decoherence processes. We also comment on the role decoherence may play in interpretations of quantum mechanics and in addressing foundational questions.

With the long-term goal of studying quantum gravity in the lab, we propose holographic teleportation protocols that can be readily executed in table-top experiments. These protocols exhibit similar behavior to that seen in recent traversable wormhole constructions: information that is scrambled into one half of an entangled system will, following a weak coupling between the two halves, unscramble into the other half. We introduce the concept of “teleportation by size” to capture how the physics of operator-size growth naturally leads to information transmission. The transmission of a signal through a semi-classical holographic wormhole corresponds to a rather special property of the operator-size distribution we call “size winding”. For more general setups (which may not have a clean emergent geometry), we argue that imperfect size winding is a generalization of the traversable wormhole phenomenon. For example, a form of signalling continues to function at high temperature and at large times for generic chaotic systems, even though it does not correspond to a signal going through a geometrical wormhole, but rather to an interference effect involving macroscopically different emergent geometries. Finally, we outline implementations feasible with current technology in two experimental platforms: Rydberg atom arrays and trapped ions.

9:43 AM | quant-ph updates on arXiv.org

Authors: Dean Alvin L. Pablico, Eric A. Galapon

We consider the quantum traversal time of an incident wave packet across a potential well using the theory of quantum time of arrival (TOA) operators. This is done by constructing the corresponding TOA-operator across a potential well via quantization. The expectation value of the potential well TOA-operator is compared to the free particle case for the same incident wave packet. The comparison yields a closed-form expression of the quantum well traversal time which explicitly shows the classical contributions of the positive and negative momentum components of the incident wave packet and a purely quantum mechanical contribution significantly dependent on the well depth. An incident Gaussian wave packet is then used as an example. It is shown that for shallow potential wells, the quantum well traversal time approaches the classical traversal time across the well region when the incident wave packet is spatially broad and approaches the expected quantum free particle traversal time when the wave packet is localized. For deep potential wells, the quantum traversal time oscillates from positive to negative, implying that the wave packet can be advanced or delayed.

9:43 AM | quant-ph updates on arXiv.org

The strong exponential-time hypothesis (SETH) is a commonly used conjecture in the field of complexity theory. It states that the satisfiability of CNF formulas cannot be decided significantly faster than by exhaustive search. This hypothesis and its variants gave rise to a fruitful field of research, fine-grained complexity, which obtains (mostly tight) conditional lower bounds for many problems in P whose unconditional lower bounds are hard to find. In this work, we introduce a framework of Quantum Strong Exponential-Time Hypotheses, as quantum analogues to SETH.
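To make concrete what "exhaustive search" means here, the following is a minimal sketch of the brute-force $2^n$ baseline that SETH conjectures cannot be substantially beaten (illustrative only; not from the paper):

```python
from itertools import product

def brute_force_sat(n, clauses):
    """Exhaustive-search CNF satisfiability: try all 2^n assignments.

    `clauses` is a list of clauses; each clause is a list of literals,
    where literal +i (1-indexed) stands for x_i and -i for NOT x_i.
    SETH asserts that, as the clause width k grows, no algorithm solves
    k-SAT in time 2^((1 - eps) n) for any fixed eps > 0.
    """
    for assignment in product([False, True], repeat=n):
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return assignment  # satisfying assignment found
    return None  # unsatisfiable

# (x1 OR x2) AND (NOT x1 OR x2) AND (x1 OR NOT x2): satisfied by x1 = x2 = True
print(brute_force_sat(2, [[1, 2], [-1, 2], [1, -2]]))  # → (True, True)
```

The loop over `product([False, True], repeat=n)` is exactly the exponential enumeration whose (conjectured) optimality underpins the fine-grained lower bounds discussed above.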

Using the QSETH framework, we are able to translate quantum query lower bounds on black-box problems to conditional quantum time lower bounds for many problems in BQP. As an example, we illustrate the use of the QSETH by providing a conditional quantum time lower bound of $\Omega(n^{1.5})$ for the Edit Distance problem. We also show that the $n^2$ SETH-based lower bound for a recent scheme for Proofs of Useful Work, based on the Orthogonal Vectors problem, holds for quantum computation assuming QSETH, maintaining a quadratic gap between verifier and prover.
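For reference, the classical benchmark against which such quantum bounds are measured is the standard $O(n^2)$ dynamic program for Edit Distance (a textbook implementation, not from the paper):

```python
def edit_distance(a, b):
    """Levenshtein edit distance via the classical O(m*n) dynamic program.

    dp[i][j] = minimum number of insertions, deletions, and substitutions
    turning a[:i] into b[:j]. This quadratic algorithm is the one whose
    near-optimality SETH-style conjectures (and their quantum analogues)
    address.
    """
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i                          # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j                          # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution / match
    return dp[m][n]

print(edit_distance("kitten", "sitting"))  # classic example: 3
```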

9:43 AM | ScienceDirect Publication: Physics Reports

Rapid solidification as non-ergodic phenomenon

Publication date: 20 July 2019

Source: Physics Reports, Volume 818

Author(s): P.K. Galenko, D. Jou

##### Abstract

Rapid solidification is a relevant physical phenomenon in materials science, whose theoretical analysis requires going beyond the limits of local-equilibrium statistical physics and thermodynamics and, in particular, taking account of ergodicity breaking and of a generalized formulation of thermodynamics. The ergodicity breaking is related to time-symmetry breaking and to the presence of certain fluxes and gradient flows, which make the time average of microscopic variables differ from the average over a chosen statistical ensemble. In fast processes this is due, for instance, to the fact that the system does not have enough time to explore the who…

Publication date: Available online 15 November 2019

Source: Physics Reports

Author(s): Angelo Carollo, Davide Valenti, Bernardo Spagnolo

##### Abstract

In this article we provide a review of geometrical methods employed in the analysis of quantum phase transitions and non-equilibrium dissipative phase transitions. After a pedagogical introduction to geometric phases and geometric information in the characterisation of quantum phase transitions, we describe recent developments of geometrical approaches based on the mixed-state generalisation of the Berry phase, i.e. the Uhlmann geometric phase, for the investigation of non-equilibrium steady-state quantum phase transitions (NESS-QPTs). Equilibrium phase transitions fall invariably into two markedly non-overlapping categories: classical phase transitions and quantum phase transitions, whereas in NESS-QPTs this distinction may fade away. The approach described in this review, among other things, can quantitatively assess the quantum character of such critical phenomena. This framework is applied to a paradigmatic class of lattice Fermion systems with local reservoirs, characterised by Gaussian non-equilibrium steady states. The relations between the behaviour of the geometric phase curvature, the divergence of the correlation length, the character of the criticality and the gap – either Hamiltonian or dissipative – are reviewed.

Authors: A. S. Lemos, G. C. Luna, E. Maciel, F. Dahia

There are theoretical frameworks, such as the large extra dimension models, which predict the strengthening of the gravitational field at short distances. Here we obtain new empirical constraints on deviations from standard gravity at the atomic length scale from analyses of recent and accurate data of hydrogen spectroscopy. The new bounds, extracted from the $1S$-$3S$ transition, are compared with previous limits given by antiprotonic helium spectroscopy. Independent constraints are also determined by investigating the effects of gravitational spin-orbit coupling on the atomic spectrum. We show that the analysis of the influence of that interaction, which is responsible for the spin-precession phenomenon, on the fine structure of the states can be employed as a test of a post-Newtonian potential in the atomic domain. The constraints obtained here from the $2P_{1/2}$-$2P_{3/2}$ transition in hydrogen are tighter than previous bounds determined from measurements of the spin precession in electron-nucleus scattering.

Authors: Robert Brandenberger (McGill University)

Cosmological inflation is not the only early universe scenario consistent with current observational data. I will discuss the criteria for a successful early universe cosmology and compare several of the proposed scenarios (inflation, bouncing cosmologies, and the {\it emergent} scenario), focusing on how future observational data will be able to distinguish between them. I will argue that we need to go beyond effective field theory in order to understand the early universe, and that principles of superstring theory will yield a nonsingular cosmology.

Authors: Alejandro Perez, Daniel Sudarsky

In a recent work we have argued that noisy energy-momentum diffusion due to space-time discreteness at the Planck scale (naturally expected to arise from quantum gravity) can be responsible for the generation of a cosmological constant during the electro-weak phase transition era of the cosmic evolution. Simple dimensional analysis and an effectively Brownian description of the propagation of fundamental particles on a granular background yields a cosmological constant of the order of magnitude of the observed value, without fine tuning. While the energy diffusion is negligible for matter in standard astrophysical configurations (from ordinary stars to neutron stars), here we argue that a similar diffusion mechanism could, nonetheless, be important for black holes. If such effects are taken into account, two observational puzzles might be solved by a single mechanism: the `$H_0$ tension’ and the relatively low rotational spin of the black holes detected via gravitational wave astronomy.

Authors: Qingdi Wang, William G. Unruh

It is argued in a recent letter \cite{PhysRevLett.123.131302} that the effect of a large cosmological constant can be naturally hidden in Planck scale curvature fluctuations. We point out that there are problems with the authors’ arguments. The hiding actually does not work in the way proposed in \cite{PhysRevLett.123.131302}. In particular, it cannot be achieved if the cosmological constant is positive. Fortunately, it works for a negative cosmological constant in a different way. In addition, the sign of the cosmological constant just needs to be negative to make the average spatial curvature $\langle R\rangle$ small.

9:43 AM | gr-qc updates on arXiv.org

Authors: Raphael Bousso, Marija Tomasevic

Under semiclassical evolution, black holes retain a smooth horizon but fail to return information. Yet, the Ryu-Takayanagi prescription computes the boundary entropy expected from unitary CFT evolution. We demonstrate this in a novel setting with an asymptotic bulk detector, eliminating an assumption about the entanglement wedge of auxiliary systems.

We consider three interpretations of this result. (i) At face value, information is lost in the bulk but not in the CFT. This conflicts with the AdS/CFT dictionary. (ii) No unique QFT state (pure or mixed) governs all detector responses to the bulk Hawking radiation. This conflicts with the existence of an S-matrix. (iii) Nonlocal couplings to the black hole interior cause asymptotic detectors to respond as though the radiation was pure, even though it is naively thermal. This invalidates the standard interpretation of the semiclassical state, including its smoothness at the horizon.

We conclude that unitary boundary evolution requires asymptotic bulk detectors to become unambiguously pure at late times. We ask whether the RT prescription can still reproduce the boundary entropy in this bulk scenario. We find that this requires a substantial failure of semiclassical gravity in a low-curvature region, such as a firewall that purifies the Hawking radiation.

Finally, we allow that the dual to semiclassical gravity may be an ensemble of unitary theories. This appears to relax the tensions we found: the ensemble average of out-states would be mixed, but the ensemble average of final entropies would vanish.

9:43 AM | gr-qc updates on arXiv.org

Flatly Foliated Relativity (FFR) is a new theory which conceptually lies between Special Relativity (SR) and General Relativity (GR), in which spacetime is foliated by flat Euclidean spaces. While GR is based on the idea that “matter curves spacetime”, FFR is based on the idea that “matter curves spacetime, but not space”. This idea, inspired by the observed spatial flatness of our local universe, is realized by considering the same action as used in GR, but restricting it only to metrics which are foliated by flat spatial slices. FFR can be thought of as describing gravity without gravitational waves.

In FFR, a positive cosmological constant implies several interesting properties which do not follow in GR: the metric equations are elliptic on each Euclidean slice, there exists a unique vacuum solution among those spherically symmetric at infinity, and there exists a geometric way to define the arrow of time. Furthermore, as gravitational waves do not exist in FFR, there are simple analogs to the positive mass theorem and Penrose-type inequalities.

Importantly, given that gravitational waves have a negligible effect on the curvature of spacetime, and that the universe appears to be locally flat, FFR may be a good approximation of GR. Moreover, FFR still admits many notable features of GR including the big bang, an accelerating expansion of the universe, and the Schwarzschild spacetime. Lastly, FFR is already known to have an existence theory for some simplified cases, which provokes an interesting discussion regarding the possibility of a more general existence theory, which may be relevant to understanding existence of solutions to GR.

Gravitational Waves (GWs) were observed for the first time in 2015, one century after Einstein predicted their existence. There is now growing interest in extending the detection bandwidth to low frequency. The scientific potential of multi-frequency GW astronomy is enormous, as it would make it possible to obtain a more complete picture of cosmic events and mechanisms. This is a unique and entirely new opportunity for the future of astronomy, the success of which depends upon the decisions being made on existing and new infrastructures. The prospect of combining observations from the future space-based instrument LISA together with third generation ground based detectors will open the way towards multi-band GW astronomy, but will leave the infrasound (0.1 Hz to 10 Hz) band uncovered. GW detectors based on matter wave interferometry promise to fill such a sensitivity gap. We propose the European Laboratory for Gravitation and Atom-interferometric Research (ELGAR), an underground infrastructure based on the latest progress in atomic physics, to study space-time and gravitation with the primary goal of detecting GWs in the infrasound band. ELGAR will directly inherit from large research facilities now being built in Europe for the study of large scale atom interferometry and will drive new pan-European synergies from top research centers developing quantum sensors. ELGAR will measure GW radiation in the infrasound band with a peak strain sensitivity of $4.1 \times 10^{-22}/\sqrt{\text{Hz}}$ at 1.7 Hz. The antenna will have an impact on diverse fundamental and applied research fields beyond GW astronomy, including gravitation, general relativity, and geology.

Friday, November 15, 2019, 6:00 PM | Francesco Mori, Satya N. Majumdar, and Grégory Schehr | PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.

Author(s): Francesco Mori, Satya N. Majumdar, and Grégory Schehr

We present an exact solution for the probability density function $P(\tau = t_{\min} - t_{\max} \mid T)$ of the time difference between the minimum and the maximum of a one-dimensional Brownian motion of duration $T$. We then generalize our results to a Brownian bridge, i.e., a periodic Brownian motion of period $T$. We demo…

[Phys. Rev. Lett. 123, 200201] Published Fri Nov 15, 2019
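The quantity studied here is easy to explore numerically. Below is a crude Monte Carlo sketch using a discretized random walk; it merely samples $\tau = t_{\min} - t_{\max}$ and does not reproduce the paper's exact density:

```python
import random

def tau_min_minus_max(T=1.0, steps=1000):
    """One sample of tau = t_min - t_max for a discretized Brownian path.

    The path is approximated by `steps` Gaussian increments of variance dt.
    By construction t_min, t_max lie in [0, T], so tau lies in [-T, T],
    and by the x -> -x symmetry of Brownian motion tau is symmetrically
    distributed around 0.
    """
    dt = T / steps
    x = 0.0
    x_min = x_max = 0.0
    t_min = t_max = 0.0
    for k in range(1, steps + 1):
        x += random.gauss(0.0, dt ** 0.5)
        if x < x_min:
            x_min, t_min = x, k * dt
        if x > x_max:
            x_max, t_max = x, k * dt
    return t_min - t_max

random.seed(0)
samples = [tau_min_minus_max() for _ in range(2000)]
# tau is always bounded: |tau| <= T = 1
print(min(samples) >= -1.0 and max(samples) <= 1.0)  # → True
```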

Thursday, November 14, 2019, 8:00 AM | Latest Results for Synthese

### Abstract

Proposed derivations of the Born rule for Everettian theory are controversial. I argue that they are unnecessary but may provide justification for a simplified version of the Principal Principle. It’s also unnecessary to replace Everett’s idea that a subject splits in measurement contexts with the idea that subjects have linear histories which partition (Deutsch in Int J Theor Phys 24:1–41, 1985; The Beginning of Infinity. Allen Lane, London, 2011; Saunders and Wallace in Br J Philos Sci 59:293–305, 2008; Saunders, in: Saunders, Barrett, Kent, Wallace (eds) Many worlds? Everett, quantum theory, and reality, Oxford University Press, Oxford, pp 181–205, 2010; Wallace in The emergent multiverse, Oxford University Press, Oxford, 2012, Chapter 7; Wilson in Br J Philos Sci 64:709–737, 2013; The nature of contingency: quantum physics as modal realism, Oxford University Press, Oxford, forthcoming). Linear histories were introduced to provide a concept of pre-measurement uncertainty and I explain why pre-measurement uncertainty for splitting subjects is after all coherent, though not necessary because Everett’s original fission interpretation of branching can arguably be rendered coherent without it, via reference to Vaidman (Int Stud Philos Sci 12:245–66, 1998), Tappenden (Br J Philos Sci 62:99–123, 2011), Sebens and Carroll (Br J Philos Sci 69:25–74, 2018) and McQueen and Vaidman (Stud Hist Philos Mod Phys 66:14–23, 2019). A deterministic and probabilistic quantum mechanics can be made intelligible by replacing the standard collapse postulate with a no-collapse postulate which identifies objective probability with relative branch weight, supplemented by the simplified Principal Principle and some revisionary metaphysics.

Thursday, November 14, 2019, 8:00 AM | Latest Results for Foundations of Physics

### Abstract

We point out a fundamental problem that hinders the quantization of general relativity: quantum mechanics is formulated in terms of systems, typically limited in space but infinitely extended in time, while general relativity is formulated in terms of events, limited both in space and in time. Many of the problems faced while connecting the two theories stem from the difficulty in shoe-horning one formulation into the other. A solution is not presented, but a list of desiderata for a quantum theory based on events is laid out.

Thursday, November 14, 2019, 1:49 AM | Philsci-Archive: No conditions. Results ordered -Date Deposited.
Azhar, Feraz (2019) Effective field theories as a novel probe of fine-tuning of cosmic inflation. [Preprint]
Thursday, November 14, 2019, 1:45 AM | Philsci-Archive: No conditions. Results ordered -Date Deposited.
Hubert, Mario (2019) Typicality and Atypicality: Unifying Probabilities and Really Statistical Explanations. [Preprint]
Tuesday, November 12, 2019, 8:00 AM | Latest Results for Foundations of Physics

### Abstract

The classical limit is fundamental in quantum mechanics. It means that quantum predictions must converge to classical ones as the macroscopic scale is approached. Yet, how and why quantum phenomena vanish at the macroscopic scale is difficult to explain. In this paper, quantum predictions for Greenberger–Horne–Zeilinger states with an arbitrary number $q$ of qubits are shown to become indistinguishable from the ones of a classical model as $q$ increases, even in the absence of loopholes. Provided that two reasonable assumptions are accepted, this result leads to a simple way to explain the classical limit and the vanishing of observable quantum phenomena at the macroscopic scale.
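The quantum prediction underlying such comparisons can be checked numerically. A minimal sketch of the standard GHZ correlation function follows, verifying the textbook identity $\langle \sigma_{\theta_1} \cdots \sigma_{\theta_q} \rangle = \cos(\theta_1 + \cdots + \theta_q)$ for measurements in the XY plane (illustrative only; the paper's specific classical model is not reproduced here):

```python
import numpy as np

def ghz_correlation(angles):
    """Expectation value for a q-qubit GHZ state, each qubit measured
    along sigma(theta) = cos(theta) X + sin(theta) Y in the XY plane.

    For |GHZ> = (|0...0> + |1...1>)/sqrt(2), the quantum prediction is
    cos(theta_1 + ... + theta_q), checked here by direct linear algebra.
    """
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
    q = len(angles)
    psi = np.zeros(2 ** q, dtype=complex)
    psi[0] = psi[-1] = 1 / np.sqrt(2)          # (|0...0> + |1...1>)/sqrt(2)
    op = np.array([[1]], dtype=complex)
    for th in angles:                          # tensor product of local observables
        op = np.kron(op, np.cos(th) * X + np.sin(th) * Y)
    return float(np.real(psi.conj() @ op @ psi))

angles = [0.3, 0.5, 0.7]
print(round(ghz_correlation(angles), 6), round(float(np.cos(sum(angles))), 6))
```

The two printed numbers agree, which is exactly the multipartite interference term whose experimental visibility is at stake in the classical-limit discussion above.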

Monday, November 11, 2019, 9:21 AM | Philsci-Archive: No conditions. Results ordered -Date Deposited.
Oldofredi, Andrea and Esfeld, Michael (2019) Observability, Unobservability and the Copenhagen Interpretation in Dirac’s Methodology of Physics. Quanta, 8 (1). pp. 68-87.
Monday, November 11, 2019, 8:00 AM | Todd R. Gingrich | Nature Physics – Issue – nature.com science feeds

Nature Physics, Published online: 11 November 2019; doi:10.1038/s41567-019-0702-6

A new class of inequalities known as thermodynamic uncertainty relations provides quantitative tools for the description of physical systems out of equilibrium. A perspective is offered on these results and their future developments.

Monday, November 11, 2019, 8:00 AM | Peter J. Love | Nature Physics – Issue – nature.com science feeds

Nature Physics, Published online: 11 November 2019; doi:10.1038/s41567-019-0709-z

Finding ground states of given Hamiltonians is crucial for quantum simulation — a promising application of quantum computers. An algorithm now finds these states using minimal resources, making it implementable in near-term noisy devices.