# Weekly Papers on Quantum Foundations (27)

Reply to the comment on “Quantum principle of relativity”. (arXiv:2206.15247v1 [quant-ph])

We refute the criticisms raised by Del Santo and Horvat against our paper “Quantum principle of relativity”: most of their counterarguments can be dismissed, and the rest provides further evidence for our claims.

Stochastic Bohmian and Scaled Trajectories. (arXiv:2206.15260v1 [quant-ph])

In this review we deal with open (dissipative and stochastic) quantum systems within the Bohmian mechanics framework, which has the advantage of providing a clear picture of quantum phenomena in terms of trajectories, originally in configuration space. The gradual decoherence process is studied from linear and nonlinear Schrödinger equations through Bohmian trajectories, as well as by using the so-called quantum-classical transition differential equation through scaled trajectories. This transition is governed by a continuous parameter, the transition parameter, covering these two extreme open dynamical regimes. Thus, two sources of decoherence of different natures are considered. Several examples are presented and discussed in order to illustrate the corresponding theory behind each case, namely: the so-called Brownian-Bohmian motion leading to quantum diffusion coefficients, dissipative diffraction in time, dissipative tunnelling through a parabolic barrier in the presence of an electric field, and stochastic early arrivals for the same type of barrier. In order to simplify the notation and physical discussion, the theoretical developments are carried out in one dimension throughout this work. One of the main goals is to analyze the gradual decoherence process in these open dynamical regimes in terms of trajectories, leading to a more intuitive way of understanding the underlying physics and to new insights.

Quantum mechanics? It’s all fun and games until someone loses an $i$. (arXiv:2206.15343v1 [quant-ph])

QBism regards quantum mechanics as an addition to probability theory. The addition provides an extra normative rule for decision-making agents concerned with gambling across experimental contexts, somewhat in analogy to the double-slit experiment. This establishes the meaning of the Born Rule from a QBist perspective. Moreover, it suggests that the best way to formulate the Born Rule for foundational discussions is with respect to an informationally complete reference device. Recent work [DeBrota, Fuchs, and Stacey, Phys. Rev. Res. 2, 013074 (2020)] has demonstrated that reference devices employing symmetric informationally complete POVMs (or SICs) achieve a minimal quantumness: they witness the irreducible difference between classical and quantum. In this paper, we attempt to answer the analogous question for real-vector-space quantum theory. While standard quantum mechanics seems to allow SICs to exist in all finite dimensions, in quantum theory over the real numbers it is known that SICs do not exist in most dimensions. We therefore attempt to identify the optimal reference device in the first real dimension without a SIC (i.e., $d=4$) in hopes of better understanding the essential role of complex numbers in quantum mechanics. In contrast to their complex counterparts, the expressions that result in a QBist understanding of real-vector-space quantum theory are surprisingly complex.
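As context for the abstract above: the SIC-based formulation of the Born Rule it refers to (the QBist “urgleichung” of the cited DeBrota–Fuchs–Stacey paper) can be written as a deformed law of total probability. In dimension $d$, with $p(i)$ the probabilities an agent assigns to the $d^2$ outcomes of a counterfactual SIC reference measurement and $r(j|i)$ the conditional probabilities for the ground measurement, it reads

```latex
q(j) \;=\; \sum_{i=1}^{d^2} \left[ (d+1)\, p(i) - \frac{1}{d} \right] r(j|i),
```

which reduces to the classical law of total probability only when the deformation coefficients $(d+1)$ and $1/d$ are replaced by $1$ and $0$. The paper's question is what replaces this expression when no SIC exists, as in real-vector-space dimension $4$.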

Thermalization of Gauge Theories from their Entanglement Spectrum. (arXiv:2107.11416v2 [quant-ph] UPDATED)

Using dual theories embedded into a larger unphysical Hilbert space along entanglement cuts, we study the Entanglement Structure of $\mathbf{Z}_2$ lattice gauge theory in $(2+1)$ spacetime dimensions. We demonstrate Li and Haldane’s conjecture, and show consistency of the Entanglement Hamiltonian with the Bisognano-Wichmann theorem. Studying non-equilibrium dynamics after a quench, we provide an extensive description of thermalization in $\mathbf{Z}_2$ gauge theory which proceeds in a characteristic sequence: Maximization of the Schmidt rank and spreading of level repulsion at early times, self-similar evolution with scaling coefficients $\alpha = 0.8 \pm 0.2$ and $\beta = 0.0 \pm 0.1$ at intermediate times, and finally thermal saturation of the von Neumann entropy.

Platonic Bell inequalities for all dimensions. (arXiv:2112.03887v2 [quant-ph] UPDATED)

In this paper we study the Platonic Bell inequalities for all possible dimensions. There are five Platonic solids in three dimensions, but there are also solids with Platonic properties (also known as regular polyhedra) in four and higher dimensions. The concept of Platonic Bell inequalities in three-dimensional Euclidean space was introduced by Tavakoli and Gisin [Quantum 4, 293 (2020)]. For any three-dimensional Platonic solid, an arrangement of projective measurements is associated in which the measurement directions point toward the vertices of the solid. For the higher-dimensional regular polyhedra, we use the correspondence of the vertices to the measurements in the abstract Tsirelson space. We give a remarkably simple formula for the quantum violation of all the Platonic Bell inequalities, which we prove to attain the maximum possible quantum violation of the Bell inequalities, i.e., the Tsirelson bound. To construct Bell inequalities with a large number of settings, it is crucial to compute the local bound efficiently. In general, the computation time required to compute the local bound grows exponentially with the number of measurement settings. We find a method to compute the local bound exactly for any bipartite two-outcome Bell inequality, where the dependence becomes polynomial, with degree equal to the rank of the Bell matrix. To show that this algorithm can be used in practice, we compute the local bound of a 300-setting Platonic Bell inequality based on the halved dodecaplex. In addition, we use a diagonal modification of the original Platonic Bell matrix to increase the ratio of the quantum to the local bound. In this way, we obtain a four-dimensional 60-setting Platonic Bell inequality based on the halved tetraplex for which the quantum violation exceeds the $\sqrt 2$ ratio.
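To make the local-bound problem mentioned above concrete, here is a minimal brute-force sketch (not the polynomial-in-rank algorithm of the paper) for a bipartite two-outcome correlation Bell inequality with Bell matrix $M$. It uses the standard fact that for a fixed deterministic strategy of Alice, Bob's optimal ±1 assignments can be read off directly, so only Alice's $2^m$ strategies need enumeration; this is the exponential scaling the paper's method improves on.

```python
import itertools

import numpy as np


def local_bound(M):
    """Brute-force local bound of a correlation Bell expression
    sum_ij M[i, j] * a_i * b_j with deterministic outcomes a_i, b_j in {-1, +1}.

    For each fixed Alice strategy a, Bob's optimum is b_j = sign((a @ M)_j),
    so the inner maximization collapses to sum_j |(a @ M)_j|.  Only the 2^m
    Alice strategies are enumerated, giving O(2^m * m * n) time.
    """
    m = M.shape[0]
    best = -np.inf
    for a in itertools.product((-1, 1), repeat=m):
        best = max(best, np.abs(np.asarray(a) @ M).sum())
    return best


# Example: the CHSH Bell matrix [[1, 1], [1, -1]] has local bound 2,
# while its Tsirelson (quantum) bound is 2*sqrt(2).
print(local_bound(np.array([[1, 1], [1, -1]])))  # 2.0
```

For the 300-setting inequalities discussed in the abstract this enumeration is hopeless, which is precisely why an exact method polynomial in the number of settings (for fixed Bell-matrix rank) matters.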

A comment on Bell’s Theorem Logical Consistency. (arXiv:2202.09639v2 [quant-ph] UPDATED)

Authors: Marian Kupczynski

In their recent paper, Lambare and Franco correctly claim that Bell's deterministic model and inequalities may be derived using only local causality, perfect correlations and measurement independence, without talking about joint probabilities. However, measurement independence, as we explain, should not be called no-conspiracy or freedom of choice. Measurement independence should be called noncontextuality, because it allows implementing random variables describing incompatible random experiments on a unique probability space, on which they are jointly distributed. Using the precise terminology proposed by Dzhafarov and Kujala in the Contextuality by Default approach, such an implementation defines a probabilistic coupling, which we explain in this paper. The frequentist proof of the inequalities fails if this probabilistic coupling and the joint probabilities do not exist. We also construct a probabilistic coupling for their counterexample to prove that there is no contradiction with Fine's theorem. Nobody questions the logical consistency of Bell's theorem, and nobody claims that Fine disproved it. Various metaphysical assumptions, such as local realism, classicality or counterfactual definiteness, may motivate the choice of a probabilistic model. However, once a model is chosen, its meaning and its implications may only be discussed rigorously in a probabilistic framework. Bell inequalities are violated in various Bell tests; for us this proves that hidden variables depend on the settings, confirming the contextual character of quantum observables and the active role played by measuring instruments. Bell was a realist, so he thought that he had to choose between nonlocality and super-determinism. Of two bad choices he chose nonlocality. Today he would probably choose contextuality.

Law of Total Probability in Quantum Theory and Its Application in Wigner’s Friend Scenario. (arXiv:2204.12285v2 [quant-ph] UPDATED)

It is well known that the law of total probability does not hold in general in quantum theory. However, recent arguments about some of the fundamental assumptions in quantum theory, based on the extended Wigner's friend scenario, show a need to clarify how the law of total probability should be formulated in quantum theory and under what conditions it still holds. In this work, the definition of conditional probability in quantum theory is extended to POVM measurements. A rule for assigning two-time conditional probabilities is proposed for incompatible POVM operators, which leads to a more general and precise formulation of the law of total probability. Sufficient conditions under which the law of total probability holds are identified. Applying the theory developed here to analyze several quantum no-go theorems related to the extended Wigner's friend scenario reveals logical loopholes in these no-go theorems. The loopholes exist as a consequence of taking for granted the validity of the law of total probability without verifying the sufficient conditions. Consequently, the contradictions in these no-go theorems only reconfirm the invalidity of the law of total probability in quantum theory, rather than invalidating the physical statements that the no-go theorems attempt to refute.
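The textbook failure mode the abstract alludes to can be made explicit (this derivation is standard background, not the paper's extended POVM rule). Classically, for outcomes $a$ of an intermediate measurement,

```latex
P(b) \;=\; \sum_a P(b\mid a)\, P(a).
```

Quantum mechanically, if $A$ is a projective measurement with eigenbasis $\{|a\rangle\}$ that is *not* actually performed on the state $|\psi\rangle$ before measuring $B$, then

```latex
P(b) \;=\; \Big|\sum_a \langle b|a\rangle\langle a|\psi\rangle\Big|^2
      \;=\; \underbrace{\sum_a \big|\langle b|a\rangle\langle a|\psi\rangle\big|^2}_{\text{classical law of total probability}}
      \;+\; \sum_{a\neq a'} \langle b|a\rangle\langle a|\psi\rangle\,\langle\psi|a'\rangle\langle a'|b\rangle ,
```

and the interference cross terms spoil the classical law whenever $A$ and $B$ are incompatible. The paper's contribution is a two-time conditional-probability rule generalizing this analysis from projective to POVM measurements.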

Interaction between macroscopic quantum systems and gravity. (arXiv:2206.07574v3 [gr-qc] UPDATED)

We review experiments and theoretical models concerning the possible mutual interplay between the gravitational field and materials in the superconducting state or other macroscopic quantum states. More generally, we focus on the possibility for quantum macrosystems in a coherent state to produce local alterations of the gravitational field in which they are immersed. This fully interdisciplinary research field has witnessed conspicuous progress in recent decades, with hundreds of published papers, and yet several questions remain completely open.

Space and time transformations with a minimal length. (arXiv:2206.15422v1 [gr-qc])

Authors: Pasquale Bosso

Phenomenological studies of quantum gravity have proposed a modification of the commutator between position and momentum in quantum mechanics so as to introduce a minimal uncertainty in position. Such a minimal uncertainty, and the consequent minimal measurable length, have important consequences for the dynamics of quantum systems. In the present work, we show that these consequences go beyond dynamics, reaching the definitions of quantities such as energy, momentum, and the Hamiltonian itself. Furthermore, since the Hamiltonian, defined as the generator of time evolution, turns out to be bounded, a minimal length implies a minimal time.
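The abstract does not specify which deformation is used, but the form most commonly adopted in this minimal-length literature (Kempf–Mangano–Mann) illustrates how a modified commutator produces a minimal position uncertainty:

```latex
[\hat{x}, \hat{p}] \;=\; i\hbar\left(1 + \beta\,\hat{p}^{\,2}\right)
\;\;\Longrightarrow\;\;
\Delta x \;\geq\; \frac{\hbar}{2}\left(\frac{1}{\Delta p} + \beta\,\Delta p\right),
```

where the right-hand side (for states with $\langle \hat p\rangle = 0$) is minimized at $\Delta p = 1/\sqrt{\beta}$, giving the minimal measurable length $\Delta x_{\min} = \hbar\sqrt{\beta}$. The paper's point is that the same deformation also forces a rethinking of how energy, momentum, and the Hamiltonian are defined, with a bounded Hamiltonian entailing a minimal time.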

The intrinsic pathology of self-interacting vector fields. (arXiv:2205.07784v3 [gr-qc] UPDATED)

Authors: Andrew Coates, Fethi M. Ramazanoğlu

We show that self-interacting vector field theories exhibit unphysical behaviour even when they are not coupled to any external field. This means any theory featuring such vectors is in danger of being unphysical, an alarming prospect for many proposals in cosmology, gravity, high energy physics and beyond. The problem arises when vector fields with healthy configurations naturally reach a point where time evolution is mathematically ill-defined. We develop tools to easily identify this issue, and provide a simple and unifying framework to investigate it.

Newton’s Third Rule and the Experimental Argument for Universal Gravity

Domski, Mary (2022) Newton’s Third Rule and the Experimental Argument for Universal Gravity. Routledge Focus on Philosophy . Routledge, New York. ISBN 978-1-032-02036-5

Hilbert-Style Axiomatic Completion: On von Neumann and Hidden Variables in Quantum Mechanics

Mitsch, Chris (2022) Hilbert-Style Axiomatic Completion: On von Neumann and Hidden Variables in Quantum Mechanics. [Preprint]

Complete Physical Characterization of Quantum Nondemolition Measurements via Tomography

Author(s): L. Pereira, J. J. García-Ripoll, and T. Ramos

We introduce a self-consistent tomography for arbitrary quantum nondemolition (QND) detectors. Based on this, we build a complete physical characterization of the detector, including the measurement processes and a quantification of the fidelity, ideality, and backaction of the measurement. This fra…

[Phys. Rev. Lett. 129, 010402] Published Wed Jun 29, 2022

Humeanism and the Measurement Problem

Gao, Shan (2022) Humeanism and the Measurement Problem. [Preprint]

Consciousness and Complexity: Neurobiological Naturalism and Integrated Information Theory

Ellia, Francesco and Chis-Ciure, Robert (2022) Consciousness and Complexity: Neurobiological Naturalism and Integrated Information Theory. [Preprint]