# Weekly Papers on Quantum Foundations (38)

This is a list of this week’s papers on quantum foundations published in various journals or uploaded to preprint servers such as arxiv.org and PhilSci Archive.

Are weak measurements really necessary for Leggett-Garg type measurements? (arXiv:1509.05136v1 [quant-ph])

on 2015-9-18 6:55am GMT

Authors: N.D. Hari Dass

Leggett-Garg inequalities are an important milestone in our quest to bridge the classical-quantum divide. An experimental investigation of these inequalities requires so-called non-invasive measurements (NIM). It has become popular to invoke weak measurements as the means of realising NIMs to a very good approximation, because of their allegedly low disturbance of the systems under measurement. In this note, this is shown to be a myth: simple error estimates show that, for comparable levels of statistical error, even strong (projective) measurements can be used. In fact, resource-wise, strong measurements turn out to be preferable.

Keldysh formalism for multiple parallel worlds. (arXiv:1509.04253v1 [quant-ph])

on 2015-9-15 1:48am GMT

We present a compact and self-contained review of the recently developed Keldysh formalism for multiple parallel worlds. The formalism has been applied to the consistent quantum evaluation of flows of informational quantities, in particular Renyi and Shannon entropy flows. We start with the formulation of the standard and extended Keldysh technique in a single world, in a form convenient for our presentation. We then explain the use of Keldysh contours encompassing multiple parallel worlds. In the end, we briefly summarize the concrete results obtained with the method.

A Quantum Paradox of Choice: More Freedom Makes Summoning a Quantum State Harder. (arXiv:1509.04226v1 [quant-ph])

on 2015-9-15 1:48am GMT

Authors: Emily Adlam, Adrian Kent (Centre for Quantum Information and Foundations, DAMTP, University of Cambridge)

The properties of quantum information in space-time can be investigated by studying operational tasks. In one such task, summoning, an unknown quantum state is supplied at one point, and a call is made at another for it to be returned at a third. Hayden and May recently proved necessary and sufficient conditions for guaranteeing successful return of a summoned state for finite sets of call and return points, given a guarantee of at most one summons. We prove necessary and sufficient conditions when there may be several possible summonses and complying with any one of them constitutes success. We show there is a “quantum paradox of choice” in summoning: the extra freedom in completing the task makes it strictly harder. This intriguing result has practical applications for distributed quantum computing and cryptography, as well as implications for our understanding of relativistic quantum information and its localization in space-time.

Generalized Cauchy-Schwarz inequality and Uncertainty Relation. (arXiv:1509.03701v1 [quant-ph])

on 2015-9-15 1:48am GMT

Authors: Vishnu M. Bannur

A generalized Cauchy-Schwarz inequality is derived and applied to uncertainty relation in quantum mechanics. We see a modification in the uncertainty relation and minimum uncertainty wave packet.
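For context, the standard (non-generalized) Cauchy-Schwarz route to the textbook uncertainty relation, which the paper's generalization modifies, can be sketched as follows. This is the familiar Robertson derivation, not the paper's generalized inequality:

```latex
% Cauchy-Schwarz for the deviation operators
% \delta A = A - \langle A\rangle, \; \delta B = B - \langle B\rangle:
\langle (\delta A)^2 \rangle \, \langle (\delta B)^2 \rangle
  \;\ge\; \left| \langle \delta A \, \delta B \rangle \right|^2 .
% Keeping only the antisymmetric (commutator) part of the right-hand side,
\left| \langle \delta A \, \delta B \rangle \right|^2
  \;\ge\; \tfrac{1}{4} \left| \langle [A,B] \rangle \right|^2 ,
% which yields the Robertson uncertainty relation
\Delta A \, \Delta B \;\ge\; \tfrac{1}{2} \left| \langle [A,B] \rangle \right| ,
% and, for X and P with [X,P] = i\hbar, the familiar
% \Delta X \, \Delta P \ge \hbar/2.
```

A generalized Cauchy-Schwarz inequality would tighten or modify the first line above, which then propagates to the bound and to the minimum-uncertainty wave packet.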

Thermodynamical cost of some interpretations of quantum theory. (arXiv:1509.03641v1 [quant-ph])

on 2015-9-15 1:48am GMT

The interpretation of quantum theory is one of the longest-standing debates in physics. Type-I interpretations see quantum probabilities as determined by intrinsic properties of the world. Type-II interpretations see quantum probabilities as not directly dealing with intrinsic properties of the world but with relational experiences between an observer and the world. It is usually believed that a decision between these two types cannot be made on purely physical grounds but requires an act of metaphysical judgement. Here we show that, although the problem is undecidable within the framework of quantum theory, it is decidable, under some assumptions, within the framework of thermodynamics. We prove that type-I interpretations are incompatible with the following assumptions: (i) the decision of which measurement is performed on a quantum system can be made independently of the system, (ii) a quantum system has limited memory, and (iii) Landauer’s principle is valid. We consider an ideal experiment in which an individual quantum system is submitted to a sequence of quantum projective measurements that leave the system in pure quantum states. We show that in any type-I interpretation satisfying (i)-(iii) the system must reset its internal state, which implies that a minimum amount of heat per measurement has to be dissipated into the system’s environment. We calculate a lower bound to the heat dissipated per measurement, assuming that the measurements are chosen from a set of size $2^n$, and show that this lower bound becomes infinite in the limit of $n$ tending to infinity. This leads to the conclusion that either type-I interpretations are untenable or at least one of the assumptions (i)-(iii) has to be abandoned.
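A back-of-the-envelope version of the scaling step, assuming only Landauer's principle (erasing one bit dissipates at least $k_B T \ln 2$ of heat); this is a schematic reconstruction for the reader, not the paper's actual bound:

```latex
% If the system's internal state must track which of 2^n possible
% measurements was performed, it stores n bits of record.
% Landauer's principle then gives, for the reset after each measurement,
Q_{\mathrm{per\ measurement}} \;\ge\; n \, k_B T \ln 2 ,
% which diverges as n \to \infty, matching the abstract's conclusion
% that the heat cost is unbounded for type-I interpretations
% satisfying assumptions (i)-(iii).
```

The point of the sketch is only the linear-in-$n$ scaling: any finite-memory system forced to erase a record that grows with the size of the measurement set incurs an unbounded thermodynamic cost.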

Cyclic multiverses. (arXiv:1509.04074v1 [gr-qc])

on 2015-9-15 1:47am GMT

Starting from the idea of regularizing singularities via the variability of the fundamental constants in cosmology, we first study cyclic universe models. We find two models of oscillating mass density and pressure regularized by a varying gravitational constant $G$. We then extend this idea to a multiverse containing cyclic individual universes with either growing or decreasing entropy, while leaving the net entropy constant. To illustrate the key idea, we consider a doubleverse with the same geometrical evolution of the two “parallel” universes but with different physical evolution (physical coupling constants $c(t)$ and $G(t)$). An interesting point is that there is a possibility to exchange the universes at the point of maximum expansion, a fact already noticed in quantum cosmology. A similar scenario is also possible within the framework of Brans-Dicke theory.

Multiverse as an ensemble of stable and unstable Universes. (arXiv:1509.03830v1 [gr-qc])

on 2015-9-15 1:47am GMT

Authors: K. Urbanowski

Calculations performed within the Standard Model suggest that the electroweak vacuum is unstable if the mass of the Higgs particle is around 125–126 GeV. Recent LHC results indicate that the mass of the Higgs boson is around 125.7 GeV, so it is possible that the vacuum in our Universe is unstable. This makes it reasonable to analyze the properties of Universes with unstable vacua. We analyze an ensemble of Universes with unstable vacua, treated as an ensemble of unstable systems from the point of view of the quantum theory of unstable states, and we try to explain why universes with an unstable vacuum need not decay.

Restoring particle phenomenology

Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-9-14 1:25pm GMT

Publication date: August 2015
Source:Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 51
Author(s): Giovanni Valente
No-go theorems are known in the literature to the effect that, in relativistic quantum field theory, particle localizability in the strict sense violates relativistic causality. In order to account for particle phenomenology without particle ontology, Halvorson and Clifton (2002) proposed an approximate localization scheme. In a recent paper, Arageorgis and Stergiou (2013) proved a no-go result that suggests that, even within such a scheme, there would arise act–outcome correlations over the entire spacetime, thereby violating relativistic causality. Here, we show that this conclusion is untenable. In particular, we argue that one can recover particle phenomenology without having to give up relativistic causality.