# Weekly Papers on Quantum Foundations (25)

On Causal State Updates in Quantum Field Theory. (arXiv:2106.09027v1 [quant-ph])

In relativistic Quantum Field Theory (QFT) ideal measurements of certain observables are physically impossible without violating causality. This prompts two questions: i) can a given observable be ideally measured in QFT, and ii) if not, in what sense can it be measured? Here we formulate a necessary and sufficient condition that any measurement, and more generally any state update (quantum operation), must satisfy to respect causality. Our focus is scalar QFT, although our results should be applicable to observables in fermionic QFT. We argue that for unitary "kicks" and operations involving 1-parameter families of Kraus operators, e.g. Gaussian measurements, the only causal observables are smeared fields and the identity – the basic observables in QFT. We provide examples with more complicated operators such as products of smeared fields, and show that the associated state updates are acausal, and hence impossible. Despite this, one can still recover expectation values of such operators, and we show how to do this using only causal measurements of smeared fields.

Contextuality without incompatibility. (arXiv:2106.09045v1 [quant-ph])

The existence of incompatible measurements is often believed to be a feature of quantum theory which signals its inconsistency with any classical worldview. To prove the failure of classicality in the sense of Kochen-Specker noncontextuality, one does indeed require sets of incompatible measurements. However, a more broadly applicable and more permissive notion of classicality is the existence of a generalized-noncontextual ontological model. In particular, this notion can imply constraints on the representation of outcomes even within a single nonprojective measurement. We leverage this fact to demonstrate that measurement incompatibility is neither necessary nor sufficient for proofs of the failure of generalized noncontextuality. Furthermore, we show that every proof of the failure of generalized noncontextuality in a prepare-measure scenario can be converted into a proof of the failure of generalized noncontextuality in a corresponding scenario with no incompatible measurements.

Quantum Entanglement of Free Particles. (arXiv:2106.09356v1 [quant-ph])

The Schrödinger equation is solved for many free particles and their quantum entanglement is studied via correlation analysis. By converting the Schrödinger equation into the Madelung hydrodynamic-like form, quantum mechanics is extended to open quantum systems through the addition of Ohmic friction forces. The dissipative evolution confirms that correlations decay over time, but a new integral of motion is discovered that is suitable for storing everlasting quantum information.

How to administer an antidote to Schrödinger's cat. (arXiv:2106.09705v1 [quant-ph])

In his 1935 Gedankenexperiment, Erwin Schrödinger imagined a poisonous substance which has a 50% probability of being released, based on the decay of a radioactive atom. As such, the life of the cat and the state of the poison become entangled, and the fate of the cat is determined upon opening the box. We present an experimental technique that keeps the cat alive in every case. This method relies on the time-resolved Hong-Ou-Mandel effect: two long, identical photons impinging on a beam splitter always bunch in either of the outputs. Interpreting the first photon detection as the state of the poison, the second photon is identified as the state of the cat. Even after the collapse of the first photon's state, we show that their fates remain intertwined through quantum interference. We demonstrate this by a sudden phase change between the inputs, administered conditionally on the outcome of the first detection, which steers the second photon to a pre-defined output and ensures that the cat is always observed alive.

Granular: “Stochastic space-time and quantum theory”. (arXiv:1601.07171v13 [quant-ph] UPDATED)

In an earlier paper, a stochastic model was presented for the Planck-scale nature of space-time. From it, many features of quantum mechanics and relativity were derived. But as mathematical points have no extent, the stochastic manifold cannot be tessellated with points (if the points are independently mobile) and so a granular model is required. As grains have orientations as well as positions, spinors (or quaternions) are required to describe them, resulting in phenomena as described by the Dirac equation. We treat both space and time stochastically and thus require a new interpretation of time to prevent an object from being in multiple places at the same time. As the grains do have a definite volume, a mechanism is required to create and annihilate grains (without leaving gaps in space-time) as the universe, or parts thereof, expands or contracts. Making the time coordinate complex provides such a mechanism. From geometric considerations alone, both the General Relativity field equations (the master equations of Relativity) and the Schrödinger equation (the master equation of quantum mechanics) are produced. Finally, to preserve the constancy of the volume element even internal to a mass, we propose a rolled-up fifth dimension which is non-zero only in the presence of mass or energy.

Adiabatic theorem revisited: the unexpectedly good performance of adiabatic passage. (arXiv:2010.05093v2 [quant-ph] UPDATED)

Adiabatic passage employs a slowly varying time-dependent Hamiltonian to control the evolution of a quantum system along the Hamiltonian eigenstates. For processes of finite duration, the exact time evolving state may deviate from the adiabatic eigenstate at intermediate times, but in numerous applications it is observed that this deviation reaches a maximum and then decreases significantly towards the end of the process. We provide a straightforward theoretical explanation for this welcome but often unappreciated fact. Our analysis emphasizes a separate adiabaticity criterion for high fidelity state-to-state transfer and it points to new effective shortcut strategies for near adiabatic dynamics.

Genuine Multipartite Entanglement in Time. (arXiv:2011.09340v4 [quant-ph] UPDATED)

While spatial quantum correlations have been studied in great detail, much less is known about the genuine quantum correlations that can be exhibited by temporal processes. Employing the quantum comb formalism, processes in time can be mapped onto quantum states, with the crucial difference that temporal correlations have to satisfy causal ordering, while their spatial counterpart is not constrained in the same way. Here, we exploit this equivalence and use the tools of multipartite entanglement theory to provide a comprehensive picture of the structure of correlations that (causally ordered) temporal quantum processes can display. First, focusing on the case of a process that is probed at two points in time — which can equivalently be described by a tripartite quantum state — we provide necessary as well as sufficient conditions for the presence of bipartite entanglement in different splittings. Next, we connect these scenarios to the previously studied concepts of quantum memory, entanglement breaking superchannels, and quantum steering, thus providing both a physical interpretation for entanglement in temporal quantum processes, and a determination of the resources required for its creation. Additionally, we construct explicit examples of W-type and GHZ-type genuinely multipartite entangled two-time processes and prove that genuine multipartite entanglement in temporal processes can be an emergent phenomenon. Finally, we show that genuinely entangled processes across multiple times exist for any number of probing times.

Capacity of Entanglement in Local Operators. (arXiv:2106.00228v2 [hep-th] UPDATED)

We study the time evolution of the excess value of the capacity of entanglement between a locally excited state and the ground state in free, massless fermionic theory and free Yang-Mills theory in four spacetime dimensions. The capacity has a non-trivial time evolution, is sensitive to the partial entanglement structure, and shows a universal peak at early times. We define a quantity, the normalized "Page time", which measures the timescale at which the capacity reaches its peak. This quantity turns out to be a characteristic property of the inserted operator. This firmly establishes the capacity as a valuable measure of the entanglement structure of an operator, especially at early times, similar in spirit to the Rényi entropies at late times. Interestingly, the time evolution of the capacity closely resembles its evolution in the microcanonical and canonical ensembles of the replica wormhole model in the context of the black hole information paradox.

Fluctuation Theorems with Retrodiction rather than Reverse Processes. (arXiv:2106.08589v1 [cond-mat.stat-mech] CROSS LISTED)

Irreversibility is usually captured by a comparison between the process that happens and a corresponding “reverse process”. In the last decades, this comparison has been extensively studied through fluctuation relations. Here we revisit fluctuation relations from the standpoint, suggested decades ago by Watanabe, that the comparison should involve the prediction and the retrodiction on the unique process, rather than two processes. We prove that Bayesian retrodiction underlies every fluctuation relation involving state variables. The retrodictive narrative also brings to the fore the possibility of deriving fluctuation relations based on various statistical divergences, and clarifies some of the traditional assumptions as arising from the choice of a reference prior.

Quantum Gravity Microstates from Fredholm Determinants. (arXiv:2106.09048v1 [hep-th])

Authors: Clifford V. Johnson

A large class of two dimensional quantum gravity theories of Jackiw-Teitelboim form have a description in terms of random matrix models. Such models, treated fully non-perturbatively, can give an explicit and tractable description of the underlying "microstate" degrees of freedom. They play a prominent role in regimes where the smooth geometrical picture of the physics is inadequate. This is shown using a natural tool for extracting the detailed microstate physics, a Fredholm determinant $\det(\mathbf{1}-\mathbf{K})$. Its associated kernel $K(E,E^\prime)$ can be defined explicitly for a wide variety of JT gravity theories. To illustrate the methods, the statistics of the first several energy levels of a non-perturbative definition of JT gravity are constructed explicitly using numerical methods, and the full quenched free energy $F_Q(T)$ of the system is computed for the first time. These results are also of relevance to quantum properties of black holes in higher dimensions.
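The abstract does not spell out the numerics; as a hedged illustration of how a Fredholm determinant $\det(\mathbf{1}-\mathbf{K})$ can be evaluated in practice, here is a minimal Nyström-quadrature sketch. The sine kernel below is a standard stand-in from random matrix level statistics, not the paper's actual JT-gravity kernel $K(E,E')$:

```python
import numpy as np

def fredholm_det(kernel, a, b, n=40):
    """Nystrom approximation of det(1 - K) on [a, b]:
    discretize the kernel on Gauss-Legendre nodes and take
    a finite-dimensional determinant."""
    x, w = np.polynomial.legendre.leggauss(n)
    # map nodes and weights from [-1, 1] to [a, b]
    x = 0.5 * (b - a) * (x + 1.0) + a
    w = 0.5 * (b - a) * w
    sw = np.sqrt(w)
    K = kernel(x[:, None], x[None, :])
    # symmetrized discretization: 1 - sqrt(w_i) K_ij sqrt(w_j)
    M = np.eye(n) - sw[:, None] * K * sw[None, :]
    return float(np.linalg.det(M))

# Sine kernel: det(1 - K) on [0, s] gives the probability of an
# eigenvalue-free gap of length s in random matrix bulk statistics.
sine_kernel = lambda x, y: np.sinc(x - y)   # sin(pi z) / (pi z)

for s in (0.5, 1.0, 2.0):
    print(s, fredholm_det(sine_kernel, 0.0, s))
```

The gap probability starts near 1 for small intervals and decreases monotonically as the interval grows, mirroring how level-statistics quantities are extracted from such determinants.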

On Planetary Orbits in Entropic Gravity. (arXiv:2106.09155v1 [gr-qc])

Authors: G. Pérez-Cuéllar, M. Sabido

Starting from an entropy that includes volumetric, area, and length terms as well as logarithmic contributions, we derive the corresponding modified Newtonian gravity and the expression for planetary orbits. We calculate the shift of the perihelion of Mercury to find bounds on the parameters associated with the modified Newtonian gravity. We compare the parameter associated with the volumetric contribution in the entropy-area relationship with the value derived from galactic rotation curves and the value obtained from the cosmological constant.

Graviton Self-Energy from Gravitons in Cosmology. (arXiv:2103.08547v2 [gr-qc] UPDATED)

Authors: L. Tan (Florida), N. C. Tsamis (Crete), R. P. Woodard (Florida)

Although matter contributions to the graviton self-energy $-i[\mbox{}^{\mu\nu} \Sigma^{\rho\sigma}](x;x')$ must be separately conserved on $x^{\mu}$ and ${x'}^{\mu}$, graviton contributions obey the weaker constraint of the Ward identity, which involves a divergence on both coordinates. On a general homogeneous and isotropic background this leads to just four structure functions for matter contributions but nine structure functions for graviton contributions. We propose a convenient parameterization for these nine structure functions. We also apply the formalism to two explicit one-loop computations of $-i[\mbox{}^{\mu\nu} \Sigma^{\rho\sigma}](x;x')$ on a de Sitter background: one for the contribution from a massless, minimally coupled scalar and the other for the contribution from gravitons in the simplest gauge. We also specialize the linearized, quantum-corrected Einstein equation to the graviton mode function and to the gravitational response to a point mass.

Quantum Computing for Inflationary, Dark Energy and Dark Matter Cosmology. (arXiv:2105.13849v2 [quant-ph] UPDATED)

Cosmology is in an era of rapid discovery, especially in areas related to dark energy, dark matter and inflation. Quantum cosmology treats cosmology quantum mechanically and is important when quantum effects need to be accounted for, especially in the very early Universe. Quantum computing is an emerging method of computation that excels at simulating quantum systems. It may offer advantages when simulating quantum cosmology, especially because the Euclidean action of gravity is unbounded from below, which makes Monte Carlo simulation problematic. In this paper we present several examples of the application of quantum computing to cosmology. These include a dark energy model related to Kaluza-Klein theory, dark matter models in which the dark sector is described by a self-interacting gauge field or a conformal scalar field, and an inflationary model with a slow-roll potential. We implement the quantum computations in the IBM Qiskit software framework and show how to apply the Variational Quantum Eigensolver (VQE) and Evolution of Hamiltonian (EOH) algorithms to solve the Wheeler-DeWitt equation, which can be used to describe the cosmology in the mini-superspace approximation. We find excellent agreement with classical computing results and describe the accuracy of the different quantum algorithms. Finally, we discuss how these methods can be scaled to larger problems beyond the mini-superspace approximation, where the quantum computer may exceed the performance of classical computation.

No Future in Black Holes. (arXiv:2106.03715v2 [hep-th] UPDATED)

Authors: Malcolm J. Perry

The black hole information paradox has been with us for some time. We outline the nature of the paradox. We then propose a resolution based on an examination of the properties of quantum gravity under circumstances that give rise to a classical singularity. We show that the gravitational wavefunction vanishes as one gets close to the classical singularity. This results in a future boundary condition inside the black hole that allows for quantum information to be recovered in the evaporation process.

Dynamical measurements of deviations from Newton’s $1/r^2$ law. (arXiv:2106.08611v1 [hep-ph] CROSS LISTED)

In a previous work (arXiv:1609.05654v2), an experimental setup aiming at the measurement of deviations from the Newtonian $1/r^2$ distance dependence of gravitational interactions was proposed. The theoretical idea behind this setup was to study the trajectories of a "Satellite" with a mass $m_{\rm S} \sim {\cal O}(10^{-9})$ $\mathrm{g}$ around a "Planet" with mass $m_{\rm P} \in [10^{-7},10^{-5} ]$ $\mathrm{g}$, looking for precession of the orbit. The observation of such a feature induced by gravitational interactions would be an unambiguous indication of a gravitational potential with terms different from $1/r$ and, thus, a powerful tool to detect deviations from Newton's $1/r^2$ law. In this paper we optimize the proposed setup in order to achieve maximal sensitivity to *Beyond-Newtonian* corrections. We study in detail possible background sources that could induce precession and quantify their impact on the achievable sensitivity. We conclude that a dynamical measurement of deviations from Newtonian gravity can test Yukawa-like corrections to the $1/r$ potential with strength as low as $\alpha \sim 10^{-2}$ for distances as small as $\lambda \sim 10 \, \mu\mathrm{m}$.
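The Yukawa-corrected potential has the standard form $V(r) = -(G M m / r)\,(1 + \alpha\, e^{-r/\lambda})$; the benchmark values $\alpha \sim 10^{-2}$ and $\lambda \sim 10\,\mu\mathrm{m}$ come from the abstract, while the code below is only an illustrative evaluation (in units where $GMm = 1$) of how quickly the correction dies off away from $\lambda$:

```python
import numpy as np

def yukawa_potential(r, GMm=1.0, alpha=1e-2, lam=10e-6):
    """Newtonian potential with a Yukawa-like correction:
    V(r) = -(GMm/r) * (1 + alpha * exp(-r/lam))."""
    return -(GMm / r) * (1.0 + alpha * np.exp(-r / lam))

def fractional_deviation(r, alpha=1e-2, lam=10e-6):
    """Relative deviation of V(r) from the pure 1/r potential,
    which equals alpha * exp(-r/lam)."""
    newton = -1.0 / r
    return yukawa_potential(r, alpha=alpha, lam=lam) / newton - 1.0

# The correction is ~alpha at r << lam and exponentially small beyond it
for r in (5e-6, 10e-6, 50e-6):
    print(r, fractional_deviation(r))
```

This makes the experimental challenge concrete: at separations of a few $\lambda$ the signal is already exponentially suppressed below $\alpha$, which is why the setup targets the $\sim 10\,\mu\mathrm{m}$ regime.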

Truthlikeness for probabilistic laws

Abstract

Truthlikeness is a property of a theory or a proposition that represents its closeness to the truth. We start by summarizing Niiniluoto's (Truthlikeness, Reidel, Dordrecht, 1987) proposal of truthlikeness for deterministic laws (DL), which defines truthlikeness as a function of accuracy, and García-Lapeña's (Br J Philos Sci, Forthcoming, 2021) expanded version, which defines truthlikeness for DL as a function of two factors, accuracy and nomicity. Then, we move to develop an appropriate definition of truthlikeness for probabilistic laws (PL) based on Niiniluoto's (Truthlikeness, Reidel, Dordrecht, 1987) suggestion to use the Kullback–Leibler divergence to define the distance between a probability law $X$ and the true probability law $T$. We argue that the Kullback–Leibler divergence seems to be the best of the available probability distances to measure accuracy between PL. However, as in the case of DL, we argue that accuracy represents a necessary but not sufficient condition, as two PL may be equally accurate and still one may imply more true or truthlike consequences, behaviours or true facts about the system than the other. The final proposal defines truthlikeness for PL as a function of two factors, p-accuracy and p-nomicity, in intimate connexion with García-Lapeña's proposal for DL.
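For discrete probability laws the Kullback–Leibler divergence mentioned above has a simple closed form, $D_{\mathrm{KL}}(T \Vert X) = \sum_i T_i \log (T_i / X_i)$. The toy distributions below are made up for illustration and are not from the paper; the sketch only shows how the divergence ranks a more accurate candidate law below a less accurate one:

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i) for discrete
    distributions; terms with p_i = 0 contribute nothing."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

t = [0.5, 0.3, 0.2]   # hypothetical "true" probabilistic law T
x = [0.4, 0.4, 0.2]   # candidate law X, close to T
y = [0.1, 0.1, 0.8]   # candidate law Y, far from T

print(kl_divergence(t, x), kl_divergence(t, y))
```

The divergence is zero exactly when the candidate matches the true law, and the closer candidate receives the smaller value, which is the accuracy ordering the proposal builds on.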

Introduction: Individuality, Distinguishability, and (Non‑)Entanglement

Friebe, Cord and Salimkhani, Kian and Wachter, Tina (2021) Introduction: Individuality, Distinguishability, and (Non‑)Entanglement. Journal for General Philosophy of Science. ISSN 0925-4560

Humeanism in Light of Quantum Gravity

Cinti, Enrico and Sanchioni, Marco (2021) Humeanism in Light of Quantum Gravity. [Preprint]

Does Neuroplasticity Support the Hypothesis of Multiple Realizability?

Maimon, Amber and Hemmo, Meir (2020) Does Neuroplasticity Support the Hypothesis of Multiple Realizability? [Preprint]