Causal Decision Theory and EPR correlations

on 2014-9-13 12:00am GMT

Abstract

The paper argues that on three out of eight possible hypotheses about the EPR experiment we can construct novel and realistic decision problems on which (a) Causal Decision Theory and Evidential Decision Theory conflict and (b) Causal Decision Theory and the EPR statistics conflict. We infer that anyone who fully accepts any of these three hypotheses has strong reasons to reject Causal Decision Theory. Finally, we extend the original construction to show that anyone who gives any of the three hypotheses *any* non-zero credence has strong reasons to reject Causal Decision Theory. However, we concede that no version of the Many Worlds Interpretation (Vaidman, in Zalta, E.N. (ed.), *Stanford Encyclopaedia of Philosophy* 2014) gives rise to the conflicts that we point out.

Comment on Ashtekar: Generalization of Wigner's principle

on 2014-9-12 7:48pm GMT

Publication date: Available online 11 September 2014

**Source:** Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Bryan W. Roberts

Ashtekar's generalization of Curie's principle and Kabir's principle in this volume shows that these principles are robust, obtaining in a variety of modifications of quantum theory. In this note, I illustrate how Wigner's principle can be similarly generalized.

Entanglement entropy as a witness of the Aharonov-Bohm effect in QFT. (arXiv:1409.3269v1 [hep-th])

on 2014-9-12 1:44am GMT

We study the dependence of the entanglement entropy on a magnetic flux, and show that the former quantity witnesses an Aharonov-Bohm-like effect. In particular, we consider free charged scalar and Dirac fields living on a two-dimensional cylinder and study how the entanglement entropy for a strip-like region on the surface of the cylinder is affected by a magnetic field enclosed by it.

QBism: A Critical Appraisal. (arXiv:1409.3312v1 [quant-ph])

physics.hist-ph updates on arXiv.org

on 2014-9-12 1:44am GMT

By insisting that the formal apparatus of quantum mechanics is a probability calculus, QBism opens the door to a deeper understanding of what quantum mechanics is trying to tell us. By insisting on a subjectivist Bayesian interpretation of probability in the context of quantum foundations, it closes this door again. To find the proper balance between subject and object, one must turn to Niels Bohr, the alleged “obscurity” of whose views casts a poor light on the current state of foundational research.

The Schrödinger-Newton equation and its foundations. (arXiv:1407.4370v2 [quant-ph] UPDATED)

on 2014-9-12 1:41am GMT

The necessity of quantising the gravitational field is still subject to an open debate. In this paper we compare the approach of quantum gravity with that of a fundamentally semi-classical theory of gravity, in the weak-field non-relativistic limit. We show that, while in the former case the Schrödinger equation stays linear, in the latter case one ends up with the so-called Schrödinger-Newton equation, which involves a nonlinear, non-local gravitational contribution. We further discuss that the Schrödinger-Newton equation does not describe the collapse of the wave-function, although it was initially proposed for exactly this purpose. Together with the standard collapse postulate, fundamentally semi-classical gravity gives rise to superluminal signalling. A consistent fundamentally semi-classical theory of gravity can therefore only be achieved together with a suitable prescription of the wave-function collapse. We further discuss how collapse models avoid such superluminal signalling and compare the nonlinearities appearing in these models with those in the Schrödinger-Newton equation.

on 2014-9-12 1:41am GMT

We introduce a ‘uniform tension-reduction’ (UTR) model, which allows one to represent the probabilities associated with an arbitrary measurement situation, and use it to explain the emergence of quantum probabilities (the Born rule) as ‘uniform’ fluctuations on this measurement situation. The model exploits the geometry of simplexes to represent the states, in such a way that the measurement probabilities can be derived as the ‘Lebesgue measure’ of suitably defined convex subregions of the simplexes. We consider a very simple and evocative physical realization of the abstract model, using a material point particle which is acted upon by elastic membranes, which by breaking and collapsing produce the different possible outcomes. This easy-to-visualize mechanical realization allows one to gain considerable insight into the possible hidden structure of an arbitrary measurement process. We also show that the UTR model can be further generalized into a ‘general tension-reduction’ (GTR) model, describing conditions of lack of knowledge generated by ‘non-uniform’ fluctuations. In this broader framework, particularly suitable for describing experiments in cognitive science, we define and motivate a notion of ‘universal measurement’, describing the most general possible condition of lack of knowledge in a measurement, emphasizing that the uniform fluctuations characterizing quantum measurements can also be understood as an average over all possible forms of non-uniform fluctuations which can be actualized in a measurement context. This means that the Born rule of quantum mechanics can be understood as a first-order approximation of a more general non-uniform theory, thus explaining part of the great success of quantum probability in the description of different domains of reality. This is the first part of a two-part article.
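As an illustrative sketch (our own, not taken from the paper): the simplest instance of the tension-reduction picture is a two-outcome measurement on a uniformly breakable elastic band of unit length. The state fixes a point p1 on the band, the band breaks at a uniformly chosen point, and the side on which the break falls selects the outcome; the outcome frequencies then converge to the Born probabilities. All function names and parameters below are hypothetical.

```python
import random

def utr_outcome(p1: float, rng: random.Random) -> int:
    """One run of the elastic-band picture for a two-outcome measurement.

    The state is a point at position p1 on a uniformly breakable band of
    unit length; the band breaks at a uniformly chosen point, and the
    side on which the break falls selects the outcome.
    """
    return 0 if rng.random() < p1 else 1

def estimate_probabilities(p1: float, n: int = 100_000, seed: int = 0):
    """Monte-Carlo estimate of the two outcome probabilities."""
    rng = random.Random(seed)
    hits0 = sum(utr_outcome(p1, rng) == 0 for _ in range(n))
    return hits0 / n, 1 - hits0 / n

if __name__ == "__main__":
    p0, p1 = estimate_probabilities(0.3)
    print(p0, p1)  # p0 converges to the Born probability 0.3
```

The ‘uniform’ break point is what makes this a UTR-type model; the GTR generalization described above would correspond to drawing the break point from a non-uniform distribution instead.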

on 2014-9-12 1:41am GMT

The role of measurement-induced disturbance in weak measurements is of central importance for the interpretation of the weak value. Uncontrolled disturbance can interfere with the postselection process and make the weak value dependent on the details of the measurement process. Here we develop the concept of a generalized weak measurement for classical and quantum mechanics. The two cases appear remarkably similar, but we point out some important differences. A priori it is not clear what the correct notion of disturbance should be in the context of weak measurements. We consider three different notions and get three different results: (1) For a ‘strong’ definition of disturbance, we find that weak measurements are disturbing. (2) For a weaker definition we find that a general class of weak measurements is non-disturbing, but that one gets weak values which depend on the measurement process. (3) Finally, with respect to an operational definition of the ‘degree of disturbance’, we find that the AAV weak measurements are the least disturbing, but that the disturbance is still non-zero.

on 2014-9-12 1:41am GMT

We examine the results of the paper “Precision metrology using weak measurements”, [Zhang, Datta, and Walmsley, arXiv:1310.5302] from a quantum state discrimination point of view. The Heisenberg scaling of the photon number for the precision of the interaction parameter between coherent light and a spin one-half particle (or pseudo-spin) has a simple interpretation in terms of the interaction rotating the quantum state to an orthogonal one. In order to achieve this scaling, the information must be extracted from the spin rather than from the coherent state of light, limiting the applications of the method to phenomena such as cross-phase modulation. We next investigate the effect of dephasing noise, and show a rapid degradation of precision, in agreement with general results in the literature concerning Heisenberg scaling metrology. We also demonstrate that a von Neumann-type measurement interaction can display a similar effect.

The PBR theorem seen from the eyes of a Bohmian. (arXiv:1409.3478v1 [quant-ph])

on 2014-9-12 1:41am GMT

The aim of this paper is to present an analysis of the new theorem by Pusey, Barrett and Rudolph (PBR) concerning ontic and epistemic hidden variables in quantum mechanics [Nature Phys. 8, 476 (2012)]. This is a kind of review and defense of my previous critical analysis, done in the context of Bohmian mechanics. It is also the occasion for me to review some of the fundamental aspects of Bohmian theory rarely discussed in the literature. The paper is submitted for the issue ‘PBR Theorem and Beyond’ of the International Journal of Quantum Foundations (IJQF).

PhilSci-Archive

on 2014-9-11 11:09pm GMT

Bacciagaluppi, Guido (2014) Did Bohr Understand EPR? [Preprint]

Group field theories for all loop quantum gravity. (arXiv:1409.3150v1 [gr-qc])

on 2014-9-11 2:01am GMT

Group field theories represent a second-quantized reformulation of the loop quantum gravity state space and a completion of the spin foam formalism. States of the canonical theory, in the traditional continuum setting, have support on graphs of arbitrary valence. On the other hand, group field theories have usually been defined in a simplicial context, thus dealing with a restricted set of graphs. In this paper, we generalize the combinatorics of group field theories to cover the whole loop quantum gravity state space. As an explicit example, we describe the GFT formulation of the KKL spin foam model, as well as a particular modified version. We show that the use of tensor model tools allows for the most effective construction. In order to clarify the mathematical basis of our construction and of the formalisms with which we deal, we also give an exhaustive description of the combinatorial structures entering spin foam models and group field theories, both at the level of the boundary states and of the quantum amplitudes.

Information loss, made worse by quantum gravity. (arXiv:1409.3157v1 [gr-qc])

on 2014-9-11 2:01am GMT

Quantum gravity is often expected to solve both the singularity problem and the information-loss problem of black holes. This article presents an example from loop quantum gravity in which the singularity problem is solved in such a way that the information-loss problem is made worse. Quantum effects in this scenario, in contrast to previous non-singular models, do not eliminate the event horizon and introduce a new Cauchy horizon where determinism breaks down. Although infinities are avoided, for all practical purposes the core of the black hole plays the role of a naked singularity. Recent developments in loop quantum gravity indicate that this aggravated information loss problem is likely to be the generic outcome, putting strong conceptual pressure on the theory.

on 2014-9-11 1:59am GMT

A candidate for a realistic relativistic quantum theory is the hypersurface Bohm-Dirac model. Its formulation uses a foliation of spacetime into space-like hypersurfaces. This structure may well arise from the universal wave function itself, entailing the relativistic character of the theory despite the appearance of a preferred foliation. However, to apply the theory and to make contact with the usual quantum formalism one needs a framework for the description of subsystems. The presence of spin together with the foliation renders the subsystem description more complicated as compared to the non-relativistic case with spin. In this paper, we provide such a framework in terms of an appropriate conditional density matrix and an effective wave function as well as clarify their relation, thereby generalizing previous subsystem descriptions in the non-relativistic case.

on 2014-9-11 1:59am GMT

We address two major conceptual developments introduced by Aharonov and collaborators through a *quantum phase space* approach: the concept of *modular variables*, devised to explain the phenomena of quantum dynamical non-locality, and the *two-state formalism* for quantum mechanics, a retrocausal time-symmetric interpretation of quantum physics which led to the discovery of *weak values*. We propose that a quantum phase space structure underlies these profound physical insights in a unifying manner. For this, we briefly review the Weyl-Wigner and the coherent state formalisms, as well as the inherent symplectic structures of quantum projective spaces, in order to gain a deeper understanding of the weak value concept.

We also review Schwinger's finite quantum kinematics so that we may apply this discrete formalism to understand Aharonov's modular variable concept in a manner different from what has been proposed before in the literature. We discuss why we believe that this is indeed the correct kinematic framework for the modular variable concept and how it may shine some light on the physical distinction between quantum dynamical non-locality and the kinematic non-locality generally associated with entangled quantum systems.

A subquantum arrow of time. (arXiv:1409.3131v1 [quant-ph])

on 2014-9-11 1:59am GMT

The outcome of a single quantum experiment is unpredictable, except in a pure-state limit. The definite process that takes place in the apparatus may either be intrinsically random or be explainable from a deeper theory. While the first scenario is the standard lore, the latter implies that quantum mechanics is emergent. In that case, it is likely that one has to reconsider radiation by accelerated charges as a physical effect, which thus must be compensated by an energy input. Stochastic electrodynamics, for example, asserts that the vacuum energy arises from classical fluctuations with energy $\frac{1}{2}\hbar\omega$ per mode. In such theories the stability of the hydrogen ground state will arise from energy input from fluctuations and output by radiation, hence due to an energy throughput. That flux of energy constitutes an arrow of time, which we call the “subquantum arrow of time”. It is related to the stability of matter and it is more fundamental than, e.g., the thermodynamic and cosmological arrows.

on 2014-9-11 12:00am GMT

Abstract

The paper has two aims: (1) it sets out to show that it is well motivated to seek an account of quantum non-locality in the framework of ontic structural realism (OSR), which integrates the notions of holism and non-separability that have been employed since the 1980s to achieve such an account. However, recent research shows that OSR on its own cannot provide such an account. Against this background, the paper argues that by applying OSR to the primitive ontology theories of quantum physics, one can accomplish that task. In particular, Bohmian mechanics offers the best prospect for doing so. (2) In general, the paper seeks to bring OSR and the primitive ontology theories of quantum physics together: on the one hand, in order to be applicable to quantum mechanics, OSR has to consider what the quantum ontology of matter distributed in space-time is. On the other hand, as regards the primitive ontology theories, OSR provides the conceptual tools for these theories to answer the question of what the ontological status of the wave-function is.

Short-time quantum propagator and Bohmian trajectories

ScienceDirect Publication: Physics Letters A

on 2014-9-10 10:54pm GMT

Publication date: 6 December 2013

**Source:** Physics Letters A, Volume 377, Issue 42

Author(s): Maurice de Gosson, Basil Hiley

We begin by giving correct expressions for the short-time action, following the work of Makri and Miller. We use these estimates to derive an accurate expression modulo Δt² for the quantum propagator, and we show that the quantum potential is negligible modulo Δt² for a point source, thus justifying an unfortunately largely ignored observation made by Holland twenty years ago. We finally prove that this implies that the quantum motion is classical for very short times.

Emergent Gravity requires (kinematic) non-locality. (arXiv:1409.2509v1 [hep-th])

on 2014-9-10 1:04am GMT

This work refines arguments forbidding non-linear dynamical gravity from appearing in the low energy effective description of field theories with local kinematics, even for those with instantaneous long-range interactions. Specifically, we note that gravitational theories with universal coupling to energy — an intrinsically non-linear phenomenon — are characterized by Hamiltonians that are pure boundary terms on shell. In order for this to be the low energy effective description of a field theory with local kinematics, all bulk dynamics must be frozen and thus irrelevant to the construction. The result applies to theories defined either on a lattice or in the continuum, and requires neither Lorentz-invariance nor translation-invariance.

Twistor Origin of the Superstring. (arXiv:1409.2510v1 [hep-th])

on 2014-9-10 1:04am GMT

After introducing a d=10 pure spinor $\lambda^\alpha$, the Virasoro constraint $\partial x^m \partial x_m =0$ can be replaced by the twistor-like constraint $\partial x^m (\gamma_m \lambda)_\alpha=0$. Quantizing this twistor-like constraint leads to the pure spinor formalism for the superstring where the fermionic superspace variables $\theta^\alpha$ and their conjugate momenta come from the ghosts and antighosts of the twistor-like constraint.

New Foundations for Physical Geometry: The Theory of Linear Structures, by Tim Maudlin

Taylor and Francis: Australasian Journal of Philosophy: Table of Contents

on 2014-9-10 1:03am GMT

Australasian Journal of Philosophy, Ahead of Print.

on 2014-9-10 1:03am GMT

This work develops analytic methods to quantitatively demarcate quantum reality from its subset of classical phenomena, as well as from the superset of general probabilistic theories. Regarding quantum nonlocality, we discuss how to determine the quantum limit of Bell-type linear inequalities. In contrast to semidefinite programming approaches, our method allows for the consideration of inequalities with abstract weights, by means of leveraging the Hermiticity of quantum states. Recognizing that classical correlations correspond to measurements made on separable states, we also introduce a practical method for obtaining sufficient separability criteria. We specifically vet the candidacy of driven and undriven superradiance as schemes for entanglement generation. We conclude by reviewing current approaches to quantum contextuality, emphasizing the operational distinction between nonlocal and contextual quantum statistics. We utilize our abstractly-weighted linear quantum bounds to explicitly demonstrate a set of conditional probability distributions which are simultaneously compatible with quantum contextuality while being incompatible with quantum nonlocality. It is noted that this novel statistical regime implies an experimentally-testable target for the Consistent Histories theory of quantum gravity.

Quanta of Geometry. (arXiv:1409.2471v1 [hep-th])

on 2014-9-09 2:22am GMT

In the construction of spectral manifolds in noncommutative geometry, a higher-degree Heisenberg commutation relation involving the Dirac operator and the Feynman slash of real scalar fields naturally appears and implies, by equality with the index formula, the quantization of the volume. We first show that this condition implies that the manifold decomposes into disconnected spheres, which will represent quanta of geometry. We then refine the condition by involving the real structure and two types of geometric quanta, and show that connected manifolds with large quantized volume are then obtained as solutions. When this condition is adopted in the gravitational action it leads to the quantization of the four-volume, with the cosmological constant obtained as an integration constant. Restricting the condition to a three-dimensional hypersurface implies quantization of the three-volume and the possible appearance of mimetic dark matter. When restricting to a two-dimensional hypersurface, under appropriate boundary conditions, this results in the quantization of area and has many interesting applications to black hole physics.

physics.hist-ph updates on arXiv.org

on 2014-9-09 2:22am GMT

Christopher Fuchs and Rüdiger Schack have developed a way of understanding science, which, among other things, resolves many of the conceptual puzzles of quantum mechanics that have vexed people for the past nine decades. They call it QBism. I speculate on how John Bell might have reacted to QBism, and I explain the many ways in which QBism differs importantly from the orthodox ways of thinking about quantum mechanics associated with the term “Copenhagen interpretation.”

physics.hist-ph updates on arXiv.org

on 2014-9-09 2:22am GMT

We present an alternative to the Copenhagen interpretation of the formalism of nonrelativistic quantum mechanics. The basic difference is that the new interpretation is formulated in the language of epistemological realism. It involves a change in some basic physical concepts. Elementary particles are considered as extended objects and nonlocal effects are included. The role of the new concepts in the problem of measurement and of the Einstein-Podolsky-Rosen correlations is described. Experiments to distinguish the proposed interpretation from the Copenhagen one are pointed out.

on 2014-9-09 2:21am GMT

We develop a general, non-probabilistic model of prediction which is suitable for assessing the (un)predictability of individual physical events. We use this model to provide, for the first time, a rigorous proof of the unpredictability of a class of individual quantum measurement outcomes, a well-known quantum attribute postulated or claimed for a long time. We prove that quantum indeterminism – formally modelled as value indefiniteness – is incompatible with the supposition of predictability: value indefinite observables are unpredictable. The proof makes essential use of a strengthened form of the Kochen-Specker theorem proven previously to identify value indefinite observables. As a result, quantum unpredictability, like the Kochen-Specker theorem, relies on three assumptions: compatibility with quantum mechanical predictions, non-contextuality, and the value definiteness of observables corresponding to the preparation basis of a quantum state. Finally, quantum unpredictability is used to prove that quantum randomness is “maximally incomputable”, and to discuss a real model of hypercomputation whose computational power has yet to be determined. The paper ends with a further open problem.

on 2014-9-09 2:21am GMT

The ontological model framework for an operational theory has generated much interest in recent years. The debate concerning the reality of quantum states has been made more precise in this framework. With the introduction of a generalized notion of contextuality in this framework, it has been shown that the completely mixed state of a qubit is *preparation contextual*. Interestingly, this new idea of preparation contextuality has been used to demonstrate nonlocality of some $\psi$-epistemic models without any use of Bell's inequality. In particular, nonlocality of a non-maximally $\psi$-epistemic model has been demonstrated from preparation contextuality of a maximally mixed qubit and Schrödinger's steerability of the maximally entangled state of two qubits [Phys. Rev. Lett. 110, 120401 (2013)]. In this paper, we show that any mixed state is preparation contextual. We then show that nonlocality of any bipartite pure entangled state with Schmidt rank two follows from preparation contextuality and steerability, provided we impose a certain condition on the epistemicity of the underlying ontological model. More interestingly, if the pure entangled state is of Schmidt rank greater than two, its nonlocality follows without any further condition on the epistemicity. Thus our result establishes a stronger connection between nonlocality and preparation contextuality by revealing the nonlocality of any bipartite pure entangled state without any use of a Bell-type inequality.

A Process Algebra Approach to Quantum Mechanics. (arXiv:1409.2146v1 [quant-ph])

on 2014-9-09 2:21am GMT

The process approach to NRQM offers a fourth framework for the quantization of physical systems. Unlike the standard approaches (Schrödinger-Heisenberg, Feynman, Wigner-Groenewold-Moyal), the process approach is not merely equivalent to NRQM and is not merely a re-interpretation. The process approach provides a dynamical completion of NRQM. Standard NRQM arises as an asymptotic quotient by means of a set-valued process covering map, which links the process algebra to the usual space of wave functions and operators on Hilbert space. The process approach offers an emergentist, discrete, finite, quasi-non-local and quasi-non-contextual realist interpretation which appears to resolve many of the paradoxes and is free of divergences. Nevertheless, it retains the computational power of NRQM and possesses an emergent probability structure which agrees with NRQM in the asymptotic quotient. The paper describes the process algebra, the process covering map for single systems and the configuration process covering map for multiple systems. It demonstrates the link to NRQM through a toy model. Applications of the process algebra to various quantum mechanical situations – superpositions, two-slit experiments, entanglement, Schrödinger's cat – are presented along with an approach to the paradoxes and the issue of classicality.

Unitary evolution and the distinguishability of quantum states. (arXiv:1409.2244v1 [quant-ph])

on 2014-9-09 2:21am GMT

The study of quantum systems evolving from initial states to distinguishable, orthogonal final states is important for information processing applications such as quantum computing and quantum metrology. However, for most unitary evolutions and initial states the system does not evolve to an orthogonal quantum state. Here, we ask what proportion of quantum states evolves to nearly orthogonal states as a function of the dimensionality of the Hilbert space of the system, and numerically study the evolution of quantum states in low-dimensional Hilbert spaces. We find that, as well as the speed of dynamical evolution, the level of maximum distinguishability depends critically on the Hamiltonian of the system.
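A minimal numerical sketch of the kind of study described above (our own illustration; the dimensions, time grid, and random-matrix ensemble are assumptions, not the paper's actual setup): evolve a random initial state under a random Hermitian Hamiltonian and track the survival amplitude |⟨ψ(0)|ψ(t)⟩|, whose minimum over the grid measures how close the state ever comes to an orthogonal one.

```python
import numpy as np

def survival_amplitudes(dim: int, t_max: float = 50.0,
                        steps: int = 500, seed: int = 0) -> np.ndarray:
    """|<psi(0)|psi(t)>| on a time grid, for a random initial state
    evolved under a random Hermitian (GUE-like) Hamiltonian."""
    rng = np.random.default_rng(seed)
    a = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    h = (a + a.conj().T) / 2                    # Hermitian Hamiltonian
    evals, evecs = np.linalg.eigh(h)
    psi0 = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    psi0 /= np.linalg.norm(psi0)                # normalized initial state
    c = evecs.conj().T @ psi0                   # expansion in energy eigenbasis
    ts = np.linspace(0.0, t_max, steps)
    # survival amplitude: sum_k |c_k|^2 exp(-i E_k t)
    phases = np.exp(-1j * np.outer(ts, evals))  # shape (steps, dim)
    return np.abs(phases @ (np.abs(c) ** 2))

if __name__ == "__main__":
    for dim in (2, 4, 8):
        amps = survival_amplitudes(dim)
        print(dim, amps.min())  # how close the state gets to orthogonality
```

A state evolves to a nearly orthogonal one when the minimum amplitude approaches zero; sweeping over many random seeds for each dimension would give the proportion the abstract refers to.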

on 2014-9-09 2:21am GMT

The purpose of this article is to present standard quantum mechanics from an ontological point of view called physical realism: it states that the goal of physics is to study entities of the natural world, existing independently from any particular observer’s perception, and obeying universal and intelligible rules. Though the compatibility of physical realism and quantum mechanics has been much debated, we claim here that both are perfectly compatible, provided that what is meant by physical properties is — slightly but profoundly — modified: contrary to the ordinary, classical ontology, physical properties must be attributed jointly to the system, and to the context in which it is embedded. This intrinsically bipartite nature of physical reality sheds new light on counter-intuitive features of quantum mechanics such as non-locality or the quantum-classical boundary.

The missing piece of the puzzle: the discovery of the Higgs boson

on 2014-9-09 12:00am GMT

Abstract

On July 4, 2012 the CMS and ATLAS collaborations at the Large Hadron Collider jointly announced the discovery of a new elementary particle, which resembled the Higgs boson, the last remaining undiscovered piece of the standard model of elementary particles. Both groups claimed to have observed a five-standard-deviation (five sigma) effect above background, the gold standard for discovery in high-energy physics. In this essay I will briefly discuss how the CMS collaboration performed the experiment and analyzed the data. I will also show the experimental results.

The Shaky Game +25, or: on locavoracity

on 2014-9-09 12:00am GMT

Abstract

Taking Arthur Fine's *The Shaky Game* as my inspiration, and the recent 25th anniversary of the publication of that work as the occasion to exercise that inspiration, I sketch an alternative to the “Naturalism” prevalent among philosophers of physics. Naturalism is a methodology eventuating in a metaphysics. The methodology is to seek the deep framework assumptions that make the best sense of science; the metaphysics is furnished by those assumptions and supported by their own support of science. The alternative presented here, which I call “Locavoracity,” shares Naturalism's commitment to making sense of science, but alters Naturalism's methodology. The Locavore's sense-making projects are piecemeal, rather than sweeping. The Locavore's hypothesis is that the collection of local sense-making projects fails to issue a single overarching unifying framework deserving of the title “*the* metaphysics that makes the best sense of science.” I muster some examples supporting the Locavore hypothesis from the interpretation of quantum field theories.

Canonical Quantum Gravity on Noncommutative Spacetime. (arXiv:1409.1751v1 [gr-qc])

on 2014-9-08 7:34am GMT

In this paper canonical quantum gravity on noncommutative space-time is considered. The corresponding generalized classical theory is formulated by using the Moyal star product, which enables the representation of field quantities depending on noncommuting coordinates by generalized quantities depending on usual coordinates. But not only the classical theory has to be generalized in analogy to other field theories. Besides, the necessity arises to replace the commutator between the gravitational field operator and its canonically conjugate quantity by a corresponding generalized expression on noncommutative space-time. Accordingly, the transition to the quantum theory also has to be performed in a generalized way, and leads to extended representations of the quantum-theoretical operators. If the generalized representations of the operators are inserted into the generalized constraints, one obtains the corresponding generalized quantum constraints, including the Hamiltonian constraint as dynamical constraint. After considering quantum geometrodynamics under incorporation of a coupling to matter fields, the theory is transferred to the Ashtekar formalism. The holonomy representation of the gravitational field as it is used in loop quantum gravity opens the possibility to calculate the corresponding generalized area operator.

Holomorphy in the Standard Model Effective Field Theory. (arXiv:1409.0868v1 [hep-ph] CROSS LISTED)

on 2014-9-08 7:34am GMT

The anomalous dimensions of dimension-six operators in the Standard Model Effective Field Theory (SMEFT) respect holomorphy to a large extent. The holomorphy conditions are reminiscent of supersymmetry, even though the SMEFT is not a supersymmetric theory.

Is the quantum state real? A review of $\psi$-ontology theorems. (arXiv:1409.1570v1 [quant-ph])

on 2014-9-08 3:08am GMT

Towards the end of 2011, Pusey, Barrett and Rudolph (PBR) derived a theorem that aimed to show that the quantum state must be ontic (a state of reality) in a broad class of realist approaches to quantum theory. This result attracted a lot of attention and controversy. The aim of this review article is to review the background to the PBR Theorem, to provide a clear presentation of the theorem itself, and to review related work that has appeared since the publication of the PBR paper. In particular, this review:

– Explains what it means for the quantum state to be ontic or epistemic (a state of knowledge).

– Reviews arguments for and against an ontic interpretation of the quantum state as they existed prior to the PBR Theorem.

– Explains why proving the reality of the quantum state is a very strong constraint on realist theories in that it would imply many of the known no-go theorems, such as Bell’s Theorem and the need for an exponentially large ontic state space.

– Provides a comprehensive presentation of the PBR Theorem itself, along with subsequent improvements and criticisms of its assumptions.

– Reviews two other arguments for the reality of the quantum state: the first due to Hardy and the second due to Colbeck and Renner, and explains why their assumptions are less compelling than those of the PBR Theorem.

– Reviews subsequent work aimed at ruling out stronger notions of what it means for the quantum state to be epistemic and points out open questions in this area.

The overall aim is not only to provide the background needed for the novice in this area to understand the current status, but also to discuss often overlooked subtleties that should be of interest to the experts.

Weak Measurements via Quantum Erasure. (arXiv:1409.1575v1 [quant-ph])

on 2014-9-08 3:08am GMT

Weak measurement is increasingly acknowledged as an important theoretical and experimental tool. Until now, however, it was not known how to perform an efficient weak non-local measurement of a general operator. We propose a novel scheme for performing non-local weak measurement which is based on the principle of quantum erasure. This method is then demonstrated within a few gedanken experiments, and also applied to the case of measuring sequential weak values. Comparison with other protocols for extracting non-local weak values reveals several advantages of the suggested scheme. In addition to its practical merits, this scheme sheds new light on fundamental topics such as causality, non-locality, measurement and uncertainty.

on 2014-9-08 3:08am GMT

A longstanding challenge in the foundations of quantum mechanics is the verification of alternative collapse theories despite their mathematical similarity to decoherence. To this end, we suggest a novel method based on dynamical decoupling. Experimental observation of nonzero saturation of the decoupling error in the limit of fast decoupling operations can provide evidence for alternative quantum theories. As part of the analysis we prove that unbounded Hamiltonians can always be decoupled, and provide novel dilations of Lindbladians.
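The contrast the abstract draws can be made concrete with a toy case of ordinary dynamical decoupling: Hamiltonian dephasing of a qubit is refocused exactly by pulses, whereas the paper's point is that genuine collapse dynamics would leave a residual error. A minimal sketch (our own illustration, not the paper's construction):

```python
import cmath

# A single qubit dephasing under H = eps * sigma_z is refocused by
# inserting an X pulse halfway through the evolution (a spin echo).
# Collapse-type dynamics, by contrast, would not cancel this way.

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def u_z(phi):
    # exp(-i * phi * sigma_z / 2): accumulate a relative phase phi
    return [[cmath.exp(-1j * phi / 2), 0], [0, cmath.exp(1j * phi / 2)]]

X = [[0, 1], [1, 0]]

phi = 1.7                              # total dephasing angle eps * t
psi0 = [1 / 2 ** 0.5, 1 / 2 ** 0.5]    # (|0> + |1>) / sqrt(2)

# Free evolution: the superposition picks up a relative phase.
free = apply(u_z(phi), psi0)

# Decoupled evolution: half the phase, X pulse, other half, X pulse.
U = mat_mul(X, mat_mul(u_z(phi / 2), mat_mul(X, u_z(phi / 2))))
echo = apply(U, psi0)

# Overlap with the initial state: 1 means perfect refocusing.
fid_free = abs(sum(a.conjugate() * b for a, b in zip(psi0, free))) ** 2
fid_echo = abs(sum(a.conjugate() * b for a, b in zip(psi0, echo))) ** 2
```

Here `fid_echo` returns to 1 while `fid_free` does not; the paper's proposal is to look for the error floor that remains when the noise is not of this removable Hamiltonian type.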

Measurement and self-adjoint operators. (arXiv:1405.7224v3 [quant-ph] UPDATED)

on 2014-9-08 3:08am GMT

The approximations of classical mechanics resulting from quantum mechanics are richer than a correspondence of classical dynamical variables with self-adjoint Hilbert space operators. The assertion that classical dynamical variables correspond to self-adjoint Hilbert space operators is disputable and sets unnatural limits on quantum mechanics. Well-known examples of classical dynamical variables not associated with self-adjoint Hilbert space operators are discussed as a motivation for realizations of quantum field theory that lack Hermitian field operators but exhibit interaction.

Quantum Entanglement, Bohmian Mechanics, and Humean Supervenience

Taylor and Francis: Australasian Journal of Philosophy: Table of Contents

on 2014-9-07 4:04am GMT

Australasian Journal of Philosophy, Volume 92, Issue 3, pages 567-583, September 2014.

Ontological aspects of the Casimir Effect

on 2014-9-06 7:38pm GMT

Publication date: Available online 6 September 2014

**Source:** Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): William M.R. Simpson

The role of the vacuum, in the Casimir Effect, is a matter of some dispute: the Casimir force has been variously described as a phenomenon resulting “from the alteration, by the boundaries, of the zero-point electromagnetic energy” (Bordag, Mohideen, & Mostepanenko, 2001), or a “van der Waals force between the metal plates” that can be “computed without reference to zero point energies” (Jaffe, 2005). Neither of these descriptions is grounded in a consistently quantum mechanical treatment of matter interacting with the electromagnetic field. However, the Casimir Effect has been canonically described within the framework of macroscopic quantum electrodynamics (Philbin, 2010). On this general account, the force is seen to arise due to the coupling of fluctuating currents to the zero-point radiation, and it is in this restricted sense that the phenomenon requires the existence of zero-point fields. The conflicting descriptions of the Casimir Effect, on the other hand, appear to arise from ontologies in which an unwarranted metaphysical priority is assigned either to the matter or the fields, and this may have a direct bearing on the problem of the cosmological constant.
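Whatever ontology one favours, the magnitude of the effect itself is fixed by the standard idealized result for perfectly conducting parallel plates, P = π²ħc/(240 d⁴). A quick calculation (textbook formula; the numbers are only to set the scale):

```python
import math

# Idealized Casimir pressure between perfectly conducting parallel
# plates, P = pi^2 * hbar * c / (240 * d^4), attractive.
hbar = 1.054571817e-34   # J s
c = 2.99792458e8         # m / s

def casimir_pressure(d):
    """Magnitude of the Casimir pressure (Pa) at plate separation d (m)."""
    return math.pi ** 2 * hbar * c / (240 * d ** 4)

p = casimir_pressure(1e-6)   # plates one micrometre apart
```

At a separation of 1 μm the pressure is of order a millipascal, which is why the dispute above is settled by careful experiments at sub-micron separations.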

on 2014-9-06 4:31pm GMT

Publication date: 10 July 2014

**Source:** Physics Reports, Volume 540, Issue 2

Author(s): M. Belloni , R.W. Robinett

The infinite square well and the attractive Dirac delta function potentials are arguably two of the most widely used models of one-dimensional bound-state systems in quantum mechanics. These models frequently appear in the research literature and are staples in the teaching of quantum theory on all levels. We review the history, mathematical properties, and visualization of these models, their many variations, and their applications to physical systems.
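The bound-state spectra of the two models reviewed above are elementary enough to write down directly. A sketch in units with ħ = m = 1 (a common convention; the values are illustrative):

```python
import math

# Bound states of the two canonical one-dimensional models.

def infinite_well_energy(n, L):
    """E_n = n^2 pi^2 / (2 L^2): the n-th level of the infinite well of width L."""
    return n ** 2 * math.pi ** 2 / (2 * L ** 2)

def delta_well_energy(alpha):
    """The single bound state of V(x) = -alpha * delta(x): E = -alpha^2 / 2."""
    return -alpha ** 2 / 2

levels = [infinite_well_energy(n, L=1.0) for n in (1, 2, 3)]
ground = delta_well_energy(alpha=1.0)
```

The quadratic spacing of the well levels (E₂/E₁ = 4, E₃/E₁ = 9) versus the solitary level of the delta well is what makes the pair so useful pedagogically.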

Decoherence in generalized measurement and the quantum Zeno paradox

on 2014-9-06 4:31pm GMT

Publication date: 1 July 2014

**Source:** Physics Reports, Volume 540, Issue 1

Author(s): Gerhard Mack , Sascha Wallentowitz , Peter E. Toschek

In the development of quantum mechanics, the evolution of a quantum system was a controversial item. The duality of unitary evolution and state reduction proposed by John von Neumann was widely felt to be unsatisfactory. Among the various attempts to reconcile the two incompatible modes of dynamics, the model of decoherence has turned out rather convincing. While the debate has been carried on mainly by reasoning about the consequences of gedanken experiments, technical progress has made available techniques for addressing real experiments, even on an individual quantum object. In particular, the impeded evolution of an atom under continuous or reiterated measurement, predicted long ago, has been demonstrated. The procedure of such an experiment, as with many a more conventional one, includes sequences of alternating time intervals of preparation and detection, known as "pump–probe" or "drive–probe" measurements. We discuss this procedure in the context of the decoherence model. The emergence of pointer states of the meter is outlined. We show the compatibility of this approach with photon counting, and emphasize the importance of information transfer in the course of measurement. Qualitative conditions that have so far been considered necessary and sufficient criteria for the "quantum Zeno paradox" are quantified.
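The "impeded evolution under reiterated measurement" mentioned above has a standard toy form: a qubit driven through a total Rabi angle π, but projectively measured N times along the way, survives in its initial state with probability cos²ᴺ(π/2N). A minimal sketch (our illustration of the textbook Zeno calculation, not the authors' model):

```python
import math

# Survival probability of a driven qubit under N interleaved projective
# measurements: each segment rotates by pi/(2N), so each measurement
# projects back onto |0> with probability cos^2(pi/(2N)).

def survival(N):
    return math.cos(math.pi / (2 * N)) ** (2 * N)

probs = {N: survival(N) for N in (1, 10, 100)}
```

With a single final measurement (N = 1) the qubit has fully flipped; as N grows the survival probability climbs toward 1, which is the Zeno freezing of the evolution.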

The angular momentum controversy: What’s it all about and does it matter?

on 2014-9-06 4:31pm GMT

Publication date: 20 August 2014

**Source:** Physics Reports, Volume 541, Issue 3

Author(s): Elliot Leader , Cédric Lorcé

The general question, crucial to an understanding of the internal structure of the nucleon, of how to split the total angular momentum of a photon or gluon into spin and orbital contributions is one of the most important and interesting challenges faced by gauge theories like Quantum Electrodynamics and Quantum Chromodynamics. This is particularly challenging since all QED textbooks state that such a splitting cannot be done for a photon (and a fortiori for a gluon) in a gauge-invariant way, yet experimentalists around the world are engaged in measuring what they believe is the gluon spin! This question has been a subject of intense debate and controversy, ever since, in 2008, it was claimed that such a gauge-invariant split was, in fact, possible. We explain in what sense this claim is true and how it turns out that one of the main problems is that such a decomposition is not unique and therefore raises the question of what is the most natural or physical choice. The essential requirement of measurability does not solve the ambiguities and leads us to the conclusion that the choice of a particular decomposition is essentially a matter of taste and convenience. In this review, we provide a pedagogical introduction to the question of angular momentum decomposition in a gauge theory, present the main relevant decompositions and discuss in detail several aspects of the controversies regarding the question of gauge invariance, frame dependence, uniqueness and measurability. We stress the physical implications of the recent developments and collect into a separate section all the sum rules and relations which we think experimentally relevant. We hope that such a review will make the matter amenable to a broader community and will help to clarify the present situation.

On reduction of the wave-packet, decoherence, irreversibility and the second law of thermodynamics

on 2014-9-06 4:31pm GMT

Publication date: 30 August 2014

**Source:** Physics Reports, Volume 541, Issue 4

Author(s): H. Narnhofer , W.F. Wreszinski

We prove a quantum version of the second law of thermodynamics: the (quantum) Boltzmann entropy increases if the initial (zero time) density matrix decoheres, a condition generally satisfied in Nature. It is illustrated by a model of wave-packet reduction, the Coleman–Hepp model, along the framework introduced by Sewell (2005) in his approach to the quantum measurement problem. Further models illustrate the monotonic-versus-non-monotonic behavior of the quantum Boltzmann entropy in time. As a last closely related topic, decoherence, which was shown by Narnhofer and Thirring (1999) to enforce macroscopic purity in the case of quantum K systems, is analyzed within a different class of quantum chaotic systems, viz. the quantum Anosov models as defined by Emch, Narnhofer, Sewell and Thirring (1994). A review of the concept of quantum Boltzmann entropy, as well as of some of the rigorous approaches to the quantum measurement problem within the framework of Schrödinger dynamics, is given, together with an overview of the C* algebra approach, which encompasses the relevant notions and definitions in a comprehensive way.
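The basic mechanism behind "entropy increases if the density matrix decoheres" can be seen already for a single qubit: deleting the off-diagonal elements of a pure-state density matrix raises its von Neumann entropy. A minimal sketch (the paper's quantum Boltzmann entropy is a coarser notion; this only illustrates the simplest case):

```python
import math

# Von Neumann entropy of a 2x2 density matrix [[p00, c], [c*, p11]]
# with |c| = offdiag, via its eigenvalues.

def entropy_2x2(p00, p11, offdiag):
    s = math.sqrt((p00 - p11) ** 2 / 4 + offdiag ** 2)
    lam = [(p00 + p11) / 2 + s, (p00 + p11) / 2 - s]
    return -sum(l * math.log(l) for l in lam if l > 1e-15)

# Pure state (|0> + |1>)/sqrt(2): rho = [[1/2, 1/2], [1/2, 1/2]].
S_pure = entropy_2x2(0.5, 0.5, 0.5)        # zero: a pure state
S_decohered = entropy_2x2(0.5, 0.5, 0.0)   # log 2: maximally mixed
```

Decohering the initial density matrix takes the entropy from 0 to log 2, the qubit analogue of the monotonic behaviour the paper analyses.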

The maximum average rate of state change. (arXiv:1109.4994v16 [quant-ph] UPDATED)

on 2014-9-06 1:40pm GMT

For an isolated quantum system, average energy above the ground state and average momentum determine the minimum time and distance needed to transition through $N$ distinct states. Other moments of energy and momentum provide similar bounds. For $N\gg1$, we can equate average energy and momentum with the maximum average rates of distinct state change in time and space.
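The kind of bound described here is in the spirit of the Margolus–Levitin theorem (the paper's precise statement may differ). In that form, for average energy $E$ above the ground state, the minimum time to reach an orthogonal state chains into a bound on the rate of distinct state change:

```latex
\tau_{\perp} \;\ge\; \frac{h}{4E}
\qquad\Longrightarrow\qquad
T \;\ge\; \frac{N h}{4E}
\quad\text{and so}\quad
\frac{N}{T} \;\le\; \frac{4E}{h}.
```

For $N \gg 1$ the last inequality is what lets average energy be read as a maximum average rate of distinct state change in time, with momentum playing the analogous role for change in space.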

on 2014-9-06 1:40pm GMT

I discuss the problems of probability and the future in the Everett-Wheeler understanding of quantum theory. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. I construct a lattice of tensed propositions, with truth values in the interval $[0,1]$, and derive logical properties of the truth values given by the usual quantum-mechanical formula for the probability of histories.
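The "usual quantum-mechanical formula for the probability of histories" referred to here assigns, to a chain of projective outcomes with unitary evolution in between, p(a₁, a₂) = ‖P_{a₂} U P_{a₁} U |ψ⟩‖². A minimal real 2×2 sketch (our illustration; the projectors are complete, so the four history probabilities sum to 1):

```python
import math

# Probability of a two-step history: evolve by U, project onto outcome
# a1, evolve by U again, project onto outcome a2.

def apply(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def norm2(v):
    return sum(x * x for x in v)

theta = 0.7
U = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta), math.cos(theta)]]       # a real rotation
P = [[[1, 0], [0, 0]], [[0, 0], [0, 1]]]       # projectors onto |0>, |1>

psi = [1.0, 0.0]
probs = {(a1, a2): norm2(apply(P[a2], apply(U, apply(P[a1], apply(U, psi)))))
         for a1 in (0, 1) for a2 in (0, 1)}
total = sum(probs.values())
```

These are the numbers that the proposed temporal logic would reinterpret as context-dependent truth values in [0, 1] for future-tense propositions.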

Anomalous weak values are proofs of contextuality. (arXiv:1409.1535v1 [quant-ph])

on 2014-9-06 1:40pm GMT

The average result of a weak measurement of some observable $A$ can, under post-selection of the measured quantum system, exceed the largest eigenvalue of $A$. The nature of weak measurements, as well as the presence of post-selection and hence possible contribution of measurement-disturbance, has led to a long-running debate about whether or not this is surprising. Here, it is shown that such “anomalous weak values” are non-classical in a precise sense: a sufficiently weak measurement of one constitutes a proof of contextuality. This clarifies, for example, which features must be present (and in an experiment, verified) to demonstrate an effect with no satisfying classical explanation.
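The anomaly in question is easy to exhibit numerically: the weak value A_w = ⟨φ|A|ψ⟩ / ⟨φ|ψ⟩ can lie far outside the eigenvalue range of A when the pre- and post-selected states are nearly orthogonal. A sketch with states of our own choosing (real 2×2 arithmetic suffices):

```python
import math

# Weak value of A with pre-selected state |psi> and post-selected
# state |phi>:  A_w = <phi| A |psi> / <phi|psi>.

def weak_value(pre, post, A):
    num = sum(post[i] * sum(A[i][j] * pre[j] for j in range(2))
              for i in range(2))
    den = sum(post[i] * pre[i] for i in range(2))
    return num / den

sigma_z = [[1, 0], [0, -1]]                # eigenvalues +1 and -1
pre = [1 / math.sqrt(2), 1 / math.sqrt(2)]  # (|0> + |1>)/sqrt(2)
a = math.radians(40)
post = [math.cos(a), -math.sin(a)]          # nearly orthogonal to pre

wv = weak_value(pre, post, sigma_z)         # anomalously large
```

Here σ_z has eigenvalues ±1 yet the weak value exceeds 10; the paper's claim is that a sufficiently weak measurement yielding such a value constitutes a proof of contextuality.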

on 2014-9-06 1:40pm GMT

The construction of a continuum limit for the dynamics of loop quantum gravity is unavoidable to complete the theory. We explain that such a construction is equivalent to obtaining the continuum physical Hilbert space, which encodes the solutions of the theory. We present iterative coarse graining methods to construct physical states in a truncation scheme and explain in which sense this scheme represents a renormalization flow. We comment on the role of diffeomorphism symmetry as an indicator for the continuum limit.
