Weekly Papers on Quantum Foundations (44)

The black hole information puzzle and the quantum de Finetti theorem. (arXiv:2110.14653v1 [hep-th])

9:00 AM|Renato Renner, Jinzhao Wang|quant-ph updates on arXiv.org

The black hole information puzzle arises from a discrepancy between conclusions drawn from general relativity and quantum theory about the nature of the radiation emitted by black holes. According to Hawking’s original argument, the radiation is thermal and its entropy thus increases monotonically as the black hole evaporates. Conversely, due to the reversibility of time evolution according to quantum theory, the radiation entropy should decrease in the final stages of evaporation, as predicted by the Page curve. This behaviour has been confirmed by new calculations based on the replica trick, which also exhibited its geometrical origin: spacetime wormholes that form between the replicas. Here we analyse the discrepancy between these and Hawking’s original conclusions from a quantum information theory viewpoint, using in particular the quantum de Finetti theorem. The theorem implies the existence of extra information, $W$, which plays the role of a reference. The entropy obtained via the replica trick can then be identified as the entropy $S(R|W)$ of the radiation conditioned on the reference $W$, whereas Hawking’s original result corresponds to the non-conditional entropy $S(R)$. The entropy $S(R|W)$, which mathematically is an ensemble average, gains its physical meaning in a many-black-holes scenario. Our analysis hints at an interpretation of the replica wormholes as the geometrical representation of the correlation between the black holes, which is mediated by $W$. It also suggests an extension of the widely used random unitary model of black holes, which we support with some new non-trivial checks.
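The gap between the two entropies in this abstract is easy to see in a toy calculation. The sketch below (an illustration of the general concavity inequality $S(R) \ge S(R|W)$, not of the paper's replica computation; the two-state ensemble is made up for the example) compares the entropy of an averaged state with the ensemble average of the members' entropies:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# Hypothetical ensemble: two pure qubit states, each prepared with
# probability 1/2. W labels which member was prepared; R is the radiation.
psi0 = np.array([1.0, 0.0])
psi1 = np.array([1.0, 1.0]) / np.sqrt(2)
members = [np.outer(p, p.conj()) for p in (psi0, psi1)]
probs = [0.5, 0.5]

# S(R|W): ensemble average of the members' entropies (zero for pure states).
s_conditional = sum(p * von_neumann_entropy(r) for p, r in zip(probs, members))

# S(R): entropy of the averaged, unconditional state.
rho_avg = sum(p * r for p, r in zip(probs, members))
s_unconditional = von_neumann_entropy(rho_avg)

# Concavity of the von Neumann entropy guarantees S(R) >= S(R|W).
print(s_conditional, s_unconditional)
```

Here the conditional entropy vanishes while the unconditional one is strictly positive, mirroring (in miniature) why the replica-trick entropy can lie below Hawking's.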

A convergent inflation hierarchy for quantum causal structures. (arXiv:2110.14659v1 [quant-ph])

9:00 AM|Laurens T. Ligthart, Mariami Gachechiladze, David Gross|quant-ph updates on arXiv.org

A causal structure is a description of the functional dependencies between random variables. A distribution is compatible with a given causal structure if it can be realized by a process respecting these dependencies. Deciding whether a distribution is compatible with a structure is a practically and fundamentally relevant, yet very difficult problem. Only recently has a general class of algorithms been proposed: These so-called inflation techniques associate to any causal structure a hierarchy of increasingly strict compatibility tests, where each test can be formulated as a computationally efficient convex optimization problem. Remarkably, it has been shown that in the classical case, this hierarchy is complete in the sense that each non-compatible distribution will be detected at some level of the hierarchy. An inflation hierarchy has also been formulated for causal structures that allow for the observed classical random variables to arise from measurements on quantum states – however, no proof of completeness of this quantum inflation hierarchy has been supplied. In this paper, we construct a first version of the quantum inflation hierarchy that is provably convergent. From a technical point of view, convergence proofs are built on de Finetti Theorems, which show that certain symmetries (which can be imposed in convex optimization problems) imply independence of random variables (which is not directly a convex constraint). A main technical ingredient to our proof is a Quantum de Finetti Theorem that holds for general tensor products of $C^*$-algebras, generalizing previous work that was restricted to minimal tensor products.

Holographic spacetime, black holes and quantum error correcting codes: A review. (arXiv:2110.14669v1 [hep-th])

9:00 AM|Tanay Kibe, Prabha Mandayam, Ayan Mukhopadhyay|quant-ph updates on arXiv.org

This article reviews the progress in our understanding of the reconstruction of the bulk spacetime in the holographic correspondence from the dual field theory, including an account of how these developments have led to the reproduction of the Page curve of the Hawking radiation from black holes. We review quantum error correction and relevant recovery maps with toy examples based on tensor networks, and discuss how it provides the desired framework for bulk reconstruction in which apparent inconsistencies with properties of the operator algebra in the dual field theory are naturally resolved. We emphasize the importance of understanding the modular flow in the dual field theory. We discuss how the state-dependence of reconstruction of black hole microstates can be formulated in the framework of quantum error correction with inputs from extremal surfaces, along with a quantification of the complexity of encoding of bulk operators. Finally, we motivate and discuss a class of tractable microstate models of black holes which can illuminate how the black hole complementarity principle can emerge operationally without encountering information paradoxes, and provide new insights into the generation of desirable features of encoding into the Hawking radiation.

Quantum Computational Complexity — From Quantum Information to Black Holes and Back. (arXiv:2110.14672v1 [hep-th])

9:00 AM|Shira Chapman, Giuseppe Policastro|quant-ph updates on arXiv.org

Quantum computational complexity estimates the difficulty of constructing quantum states from elementary operations, a problem of prime importance for quantum computation. Surprisingly, this quantity can also serve to study a completely different physical problem – that of information processing inside black holes. Quantum computational complexity was suggested as a new entry in the holographic dictionary, which extends the connection between geometry and information and resolves the puzzle of why black hole interiors keep growing for a very long time. In this pedagogical review, we present the geometric approach to complexity advocated by Nielsen and show how it can be used to define complexity for generic quantum systems; in particular, we focus on Gaussian states in QFT, both pure and mixed, and on certain classes of CFT states. We then present the conjectured relation to gravitational quantities within the holographic correspondence and discuss several examples in which different versions of the conjectures have been tested. We highlight the relation between complexity, chaos and scrambling in chaotic systems. We conclude with a discussion of open problems and future directions. This article was written for the special issue of EPJ-C Frontiers in Holographic Duality.

On Penrose’s theory of objective gravitationally induced wave function reduction. (arXiv:2110.14772v1 [gr-qc])

9:00 AM|Ricardo Gallego Torromé|quant-ph updates on arXiv.org

The formal structure of Penrose’s gravitationally induced reduction of the wave function mechanism is analyzed. It is shown that pushing Penrose’s argument forward leads to the interpretation of quantum coherence in microscopic systems as an observable signature of a violation of general covariance. We discuss potential avenues to avoid this conclusion, among them emergent quantum mechanics and super-determinism.

Fundamental limits of quantum error mitigation. (arXiv:2109.04457v3 [quant-ph] UPDATED)

9:00 AM|Ryuji Takagi, Suguru Endo, Shintaro Minagawa, Mile Gu|quant-ph updates on arXiv.org

The inevitable accumulation of errors in near-future quantum devices represents a key obstacle in delivering practical quantum advantage. This motivated the development of various quantum error-mitigation protocols, each representing a method to extract useful computational output by combining measurement data from multiple samplings of the available imperfect quantum device. What are the ultimate performance limits universally imposed on such protocols? Here, we derive a fundamental bound on the sampling overhead that applies to a general class of error-mitigation protocols, assuming only the laws of quantum mechanics. We use it to show that (1) the sampling overhead to mitigate local depolarizing noise for layered circuits — such as the ones used for variational quantum algorithms — must scale exponentially with circuit depth, and (2) the probabilistic error cancellation method is optimal among all strategies in mitigating a certain class of noise. Our results thus provide a means to identify when a given quantum error-mitigation strategy is optimal and when there is potential room for improvement.
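The exponential depth scaling in point (1) can be made concrete with the textbook sampling cost of probabilistic error cancellation (this is the standard single-qubit depolarizing-channel result, not a calculation from the paper; the noise rate and circuit dimensions below are assumed for illustration):

```python
def pec_gamma_depolarizing(p):
    """Quasi-probability 1-norm for inverting a single-qubit depolarizing
    channel D_p(rho) = (1-p)*rho + p*I/2 by probabilistic error cancellation.
    Standard result: gamma = (1 + p/2) / (1 - p) > 1 for p > 0."""
    return (1 + p / 2) / (1 - p)

def sampling_overhead(p, n_gates):
    """Multiplicative sampling-cost factor gamma^(2*n_gates): the number of
    circuit runs needed for a fixed estimator variance grows by this factor."""
    return pec_gamma_depolarizing(p) ** (2 * n_gates)

# Illustrative (assumed) numbers: 1% depolarizing noise per gate,
# a width-10 layered circuit, overhead as a function of depth.
p, width = 0.01, 10
for depth in (10, 50, 100):
    print(depth, sampling_overhead(p, width * depth))
```

Because the per-gate factor gamma exceeds 1, the overhead multiplies across layers, which is exactly the exponential-in-depth behaviour the bound in the abstract constrains.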

Change in Hamiltonian General Relativity with Spinors. (arXiv:2110.15266v1 [gr-qc])

9:00 AM|physics.hist-ph updates on arXiv.org

Authors: J. Brian Pitts

In Hamiltonian GR, change has seemed to be missing, defined only asymptotically, or otherwise obscured at best. By construing change as essential time dependence, can one find change locally in Hamiltonian GR with spinors?

This paper is motivated by tendencies in space-time philosophy to slight fermionic/spinorial matter, in Hamiltonian GR to misplace changes of time coordinate, and in treatments of the Einstein-Dirac equation to include a gratuitous local Lorentz gauge symmetry. Spatial dependence is dropped in most of the paper. To include all and only the coordinate freedom, the Einstein-Dirac equation is investigated using the Schwinger time gauge and Kibble-Deser symmetric triad condition as a $3+1$ version of the DeWitt-Ogievetsky-Polubarinov nonlinear group realization formalism that dispenses with a tetrad and local Lorentz gauge freedom. Change is the lack of a time-like stronger-than-Killing field for which the Lie derivative of the metric-spinor complex vanishes. An appropriate $3+1$-friendly form of the Rosenfeld-Anderson-Bergmann-Castellani gauge generator $G$, a tuned sum of first-class constraints, changes the canonical Lagrangian by a total derivative and implements changes of time coordinate for solutions.

Unifying gravitational waves and dark energy. (arXiv:2110.14689v1 [gr-qc])

9:00 AM|gr-qc updates on arXiv.org

Authors: Alice Garoffolo, Omar Contigiani

We present a unifying treatment for metric and scalar perturbations across different energy regimes in scalar-tensor theories of gravity. To do so, we introduce two connected symmetry-breaking patterns: one due to the acquisition of nontrivial vacuum expectation values by the fields and the other due to the distinction between background and perturbations that live on top of it. We show that the geometric optics approximation commonly used to enforce this separation is not self-consistent for high-frequency perturbations, since gauge transformations mix the tensor and scalar sectors at different orders. We derive the equations of motion for the perturbations and describe the behavior of the solutions in the low- and high-frequency limits. We conclude by describing this phenomenology in the context of two screening mechanisms, chameleon and symmetron, and show that scalar waves in every frequency range are screened, hence not detectable.

Wormholes & Holography: An Introduction. (arXiv:2110.14958v1 [hep-th])

9:00 AM|gr-qc updates on arXiv.org

Authors: Arnab Kundu

Wormholes are intriguing classical solutions in General Relativity that have fascinated theoretical physicists for decades. In recent years, gravitational wormhole geometries have found a new life in many theoretical ideas, especially in Holography. This is an introductory and pedagogical review of wormholes and their recent applications in Gauge-Gravity duality and related ideas.

Can the displacemon device test objective collapse models? (arXiv:2110.15180v1 [quant-ph])

9:00 AM|gr-qc updates on arXiv.org

Authors: Lydia A. Kanari-Naish, Jack Clarke, Michael R. Vanner, Edward A. Laird

Testing the limits of the applicability of quantum mechanics will deepen our understanding of the universe and may shed light on the interplay between quantum mechanics and gravity. At present there is a wide range of approaches for such macroscopic tests spanning matter-wave interferometry of large molecules to precision measurements of heating rates in the motion of micro-scale cantilevers. The “displacemon” is a proposed electromechanical device consisting of a mechanical resonator flux coupled to a superconducting qubit, which could be used to generate and observe quantum interference between centre-of-mass trajectories in the motion of a resonator. In the original proposal, the mechanical resonator was a carbon nanotube, containing $10^6$ nucleons. Such a superposition would be massive by comparison to the present state-of-the-art, but still small compared with the mass scales on which we might feasibly test objective collapse models. Here, instead of a carbon nanotube, we propose using an aluminium mechanical resonator on two larger mass scales, one inspired by the Marshall-Simon-Penrose-Bouwmeester moving-mirror proposal, and one set by the Planck mass. For such a device, we examine the experimental requirements needed to perform a more macroscopic quantum test and thus feasibly detect the decoherence effects predicted by two objective collapse models: Diósi-Penrose and continuous spontaneous localization. Our protocol for testing these two theories takes advantage of the displacemon architecture by analyzing the measurement statistics of a superconducting qubit. We find that with improvements to the fabrication and vibration sensitivities of these electromechanical devices, the displacemon interferometer provides a new route to feasibly test decoherence mechanisms beyond standard quantum theory.

A “black hole theorem,” and its implications. (arXiv:2110.10690v1 [hep-th] CROSS LISTED)

9:00 AM|gr-qc updates on arXiv.org

Authors: Steven B. Giddings

A “black hole theorem” is stated, exhibiting the basic conflict of the information problem. This is formulated in a more general context than that of quantum field theory on a background, and is based on describing a black hole as a quantum subsystem of a larger system, including its environment. As with the Coleman-Mandula theorem, the most important point is probably the loophole in the “theorem,” and what this tells us about the fundamental structure of quantum gravity. This “theorem” in particular connects to the general question of how to define quantum subsystems in quantum gravity. If black holes do behave as quantum subsystems, at least to a good approximation, evolve unitarily, and do not leave remnants, the “theorem” implies the presence of interactions between a black hole and its environment that go beyond a description based on local quantum fields. These can be parameterized in a principled way, and with motivated additional assumptions indicate possible observational signatures, which can be investigated by electromagnetic or gravitational wave observations of black holes.

Experimental Validation of Fully Quantum Fluctuation Theorems Using Dynamic Bayesian Networks

Friday, October 29, 2021, 6:00 PM|Kaonan Micadei, John P. S. Peterson, Alexandre M. Souza, Roberto S. Sarthour, Ivan S. Oliveira, Gabriel T. Landi, Roberto M. Serra, and Eric Lutz|PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.

Author(s): Kaonan Micadei, John P. S. Peterson, Alexandre M. Souza, Roberto S. Sarthour, Ivan S. Oliveira, Gabriel T. Landi, Roberto M. Serra, and Eric Lutz

Fluctuation theorems are fundamental extensions of the second law of thermodynamics for small systems. Their general validity arbitrarily far from equilibrium makes them invaluable in nonequilibrium physics. So far, experimental studies of quantum fluctuation relations do not account for quantum cor…

[Phys. Rev. Lett. 127, 180603] Published Fri Oct 29, 2021

Direct Characterization of Quantum Measurements Using Weak Values

Thursday, October 28, 2021, 6:00 PM|Liang Xu, Huichao Xu, Tao Jiang, Feixiang Xu, Kaimin Zheng, Ben Wang, Aonan Zhang, and Lijian Zhang|PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.

Author(s): Liang Xu, Huichao Xu, Tao Jiang, Feixiang Xu, Kaimin Zheng, Ben Wang, Aonan Zhang, and Lijian Zhang

The time-symmetric formalism endows the weak measurement and its outcome, the weak value, with many unique features. In particular, it allows a direct tomography of quantum states without resorting to complicated reconstruction algorithms and provides an operational meaning to wave functions and den…

[Phys. Rev. Lett. 127, 180401] Published Thu Oct 28, 2021
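The central quantity behind this direct-tomography approach is the weak value $A_w = \langle\phi|A|\psi\rangle / \langle\phi|\psi\rangle$ for a pre-selected state $|\psi\rangle$ and post-selected state $|\phi\rangle$. A minimal numerical sketch (the observable and states here are chosen purely for illustration, not taken from the paper) shows how near-orthogonal post-selection yields anomalous weak values outside the observable's spectrum:

```python
import numpy as np

def weak_value(phi, A, psi):
    """Weak value <phi|A|psi> / <phi|psi>; can lie outside A's spectrum."""
    return (phi.conj() @ A @ psi) / (phi.conj() @ psi)

# Pauli X as the weakly measured observable (eigenvalues +1 and -1).
X = np.array([[0, 1], [1, 0]], dtype=complex)

eps = 0.1
psi = np.array([1.0, 0.0], dtype=complex)                  # pre-selection |0>
phi = np.array([np.sin(eps), np.cos(eps)], dtype=complex)  # nearly orthogonal post-selection

Aw = weak_value(phi, X, psi)
print(Aw.real)   # ~ cot(eps), far outside X's eigenvalue range [-1, 1]
```

The amplification as the pre- and post-selected states approach orthogonality is what gives weak values their "unique features" and their use as directly measurable tomographic quantities.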

The role of representational conventions in assessing the empirical significance of symmetries

Thursday, October 28, 2021, 3:53 PM|Philsci-Archive: No conditions. Results ordered -Date Deposited.

Gomes, Henrique (2021) The role of representational conventions in assessing the empirical significance of symmetries. [Preprint]

Gauge-invariance and the empirical significance of symmetries

Thursday, October 28, 2021, 3:53 PM|Philsci-Archive: No conditions. Results ordered -Date Deposited.

Gomes, Henrique (2020) Gauge-invariance and the empirical significance of symmetries. [Preprint]

Scientific Realism and Empirical Confirmation: a Puzzle

Thursday, October 28, 2021, 3:52 PM|Philsci-Archive: No conditions. Results ordered -Date Deposited.

Allzén, Simon (2021) Scientific Realism and Empirical Confirmation: a Puzzle. Studies in History and Philosophy of Science Part A, 90. pp. 153-159. ISSN 0039-3681

Quantum States: An Analysis via the Orthogonality Relation

Thursday, October 28, 2021, 3:52 PM|Philsci-Archive: No conditions. Results ordered -Date Deposited.

Zhong, Shengyang (2021) Quantum States: An Analysis via the Orthogonality Relation. Synthese. ISSN 1573-0964

MOND and Methodology

Thursday, October 28, 2021, 3:51 PM|Philsci-Archive: No conditions. Results ordered -Date Deposited.

Merritt, David (2021) MOND and Methodology. Karl Popper’s Science and Philosophy. pp. 69-96.

The Cost of Closure: Logical Realism, Anti-Exceptionalism, and Theoretical Equivalence

Tuesday, October 26, 2021, 4:12 PM|Philsci-Archive: No conditions. Results ordered -Date Deposited.

McSweeney, Michaela (2021) The Cost of Closure: Logical Realism, Anti-Exceptionalism, and Theoretical Equivalence. Synthese. ISSN 1573-0964

Quantum Conditional Probabilities and New Measures of Quantum Information

Tuesday, October 26, 2021, 4:11 PM|Philsci-Archive: No conditions. Results ordered -Date Deposited.

Barandes, Jacob A. and Kagan, David (2021) Quantum Conditional Probabilities and New Measures of Quantum Information. [Preprint]


Measured distribution of cloud chamber tracks from radioactive decay: a new empirical approach to investigating the quantum measurement problem

Jonathan Schonfeld


Using publicly available video of a cloud chamber with a very small radioactive source, I measure the spatial distribution of where tracks start, and consider possible implications. This is directly relevant to the quantum measurement problem and its possible resolution, and appears never to have been done before. The raw data are relatively uncontrolled, leading to caveats that should guide future, more tailored experiments. Results suggest a modification to Born’s rule at very small wavefunction amplitude. Track distributions from decays in cloud chambers represent a previously unappreciated way to probe the foundations of quantum mechanics, and a novel case of wavefunctions with macroscopic signatures.
