It is often assumed that the only effect of the Ghirardi-Rimini-Weber (`GRW’) dynamical collapse mechanism on the `tails’ of the wavefunction (that is, the components of superpositions on which the collapse is not centred) is to reduce their weight. In consequence, the tails are often thought to behave exactly as the various branches in the Everett interpretation do, except for their much lower weight. These assumptions are demonstrably inaccurate: the collapse mechanism has substantial and detectable effects within the tails. The relevance of this misconception for dynamical-collapse theories, however, is debatable.

By investigating the Feynman path integral we prove that elementary quantum particle dynamics are directly associated with single compact (cyclic) world-line parameters, playing the role of the particles’ internal clocks, implicit in ordinary undulatory mechanics and indirectly observed, for instance, in Time Crystals. This allows us to formulate a novel, purely four-dimensional stringy description of elementary particles as possible physics beyond quantum mechanics. The novelty of this approach is that quantum mechanics originates from a non-trivial compact structure of Minkowskian space-time. Our result is further evidence in support of Elementary Cycles Theory (ECT), which in previous papers has been proven consistent with known physics from theoretical particle physics to condensed matter. Here we provide additional conceptual arguments in support of this novel unified scenario of quantum and relativistic physics, which is potentially deterministic and fully falsifiable, having no fine-tunable parameters. The first evidence of such new physics, characterized by ultra-fast cyclic time dynamics, will be observed by probing quantum phenomena with experimental time accuracy of the order of 10^{-21} sec. Considerations about the emergence of the arrow of time from the realm of pure, zero-temperature quantum physics governed by intrinsic time periodicity are also provided. Concerning Einstein’s dictum “God does not play dice”, we conclude that, all in all, “God” would have no fun playing quantum dice.

Bell inequalities may only be derived if hidden variables do not depend on the experimental settings. The stochastic independence of hidden and setting variables is called freedom of choice, free will, measurement independence (MI), or no conspiracy. By embedding the Bell causal structure in a larger causal network, the authors correctly prove that one can explain and quantify possible violations of MI without invoking super-determinism. They assume the independence of the variables that causally determine the settings and investigate how these might become correlated with hidden variables (e.g., when cosmic photons enter the laboratory). Using their extended causal networks they derive a contextual probabilistic model on which their further correct results are based. The authors seem to ignore that a contextual probabilistic model may be derived directly, using only probabilistic concepts and correctly incorporating setting-dependent variables describing the measuring instruments. In these contextual probabilistic models the experimenters’ freedom of choice is not compromised, and the results of Bell tests, including an apparent violation of Einsteinian non-signaling, may be explained in a locally causal way. Talking about freedom of choice is misleading and is rooted in an incorrect understanding of Bayes’ theorem. We explain why MI should be called noncontextuality and why its violation in Bell tests confirms only the contextual character of quantum observables. Therefore, contextuality, and not experimenters’ freedom of choice, is the important resource in quantum information.

A novel approach for analyzing “classical” alternatives to quantum mechanics for explaining the statistical results of an EPRB-like experiment is proposed. This perspective is top-down instead of bottom-up. Rather than beginning with an inequality derivation, a hierarchy of model types is constructed, each distinguished by appropriately parameterized conditional probabilities. This hierarchy ranks the “classical” model types in terms of their ability to reproduce QM statistics or not. The analysis goes beyond the usual consideration of model types that “fall short” (i.e., satisfy all of the CHSH inequalities) to ones that are “excessive” (i.e., not only violate CHSH but even exceed a Tsirelson bound). This approach clearly shows that noncontextuality is the most general property of an operational model that blocks replication of at least some QM statistical predictions. Factorizability is naturally revealed to be a special case of noncontextuality. The same is true for the combination of remote context independence and outcome determinism (RCI+OD). It is noncontextuality that determines the dividing line between “classical” model instances that satisfy the CHSH inequalities and those that don’t. Outcome deterministic operational models are revealed to be the “building blocks” of all the rest, including quantum mechanical, noncontextual, and contextual ones. The set of noncontextual model instances is exactly the convex hull of all 16 RCI+OD model instances, and furthermore, the set of all model instances, including all QM ones, is equal to the convex hull of the 256 OD model instances. It is shown that, under a mild assumption, the construction of convex hulls of finite ensembles of OD model instances is (mathematically) equivalent to the traditional hidden variables approach.
Via the introduction of operational models that possess outcome and measurement “predictability”, a new perspective is gained on the impossibility of faster-than-light transfer of information in an EPRB experiment. Finally, many plots and figures, some of which appear to be new, provide visual affirmation of many of the results.
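The counting claims above can be illustrated with a minimal sketch (my own, not taken from the paper; the setting labels and CHSH sign convention are assumptions): enumerating all 16 outcome-deterministic (RCI+OD) strategies, in which each party’s outcome for each of its two settings is fixed at ±1, and evaluating the CHSH expression on each shows that no such strategy exceeds the classical bound of 2.

```python
from itertools import product

def chsh(A, B):
    """CHSH value S = E(a0,b0) + E(a0,b1) + E(a1,b0) - E(a1,b1)
    for deterministic outcomes A = (A0, A1), B = (B0, B1) in {-1, +1},
    where the correlation is E(ai, bj) = A[i] * B[j]."""
    return A[0]*B[0] + A[0]*B[1] + A[1]*B[0] - A[1]*B[1]

# All 16 outcome-deterministic (RCI+OD) strategies.
strategies = [((a0, a1), (b0, b1))
              for a0, a1, b0, b1 in product([-1, 1], repeat=4)]
values = [chsh(A, B) for A, B in strategies]

print(len(strategies))              # 16
print(max(abs(v) for v in values))  # 2 -- the classical CHSH bound
```

Since the CHSH expression is linear in the correlations, its maximum over the convex hull of these 16 instances is attained at a vertex, which is why mixtures of them (noncontextual models) also satisfy |S| ≤ 2.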

We address the question of whether a non-nomological (i.e., anomic) interpretation of the wavefunction is compatible with the quantum formalism. After clarifying the distinction between ontic, epistemic, nomic, and anomic models, we focus our attention on two famous no-go theorems, due to Pusey, Barrett, and Rudolph (PBR) on the one hand and to Hardy on the other, which forbid the existence of anomic-epistemic models. Moreover, we demonstrate that the so-called restricted ontic indifference introduced by Hardy induces new constraints. We show that, after modifications, the Hardy theorem actually rules out all anomic models of the wavefunction assuming only restricted ontic indifference and preparation independence.

Yoko Suzuki and Kevin M Mertes

We introduce a new interpretation of quantum mechanics by examining the Einstein, Podolsky and Rosen (EPR) paradox and Bell’s inequality experiments under the assumption that the vacuum fluctuation has a locally varying texture (a local variable) for energy levels below the Heisenberg time-energy uncertainty relation. In this article, selected results from the most reliable Bell’s inequality experiments are quantitatively analyzed to show that our interpretation of quantum mechanics creates a new loophole in Bell’s inequality, and that past experimental findings do not contradict our new interpretation. Under the vacuum-texture interpretation of quantum mechanics, in a Bell’s inequality experiment the states of the pair of particles created at the source (e.g. during parametric down-conversion) are influenced by an inhomogeneous vacuum texture sent at the speed of light from the measurement apparatus. We also show that the resulting pairs of particles are not entangled and that the theory of vacuum texture preserves local realism with complete causality. This article also suggests an experiment to definitively confirm the existence of vacuum texture.

The mind-body problem is reviewed in the context of a non-technical account of quantum theory. The importance of clearly defining `what is physical?’ is highlighted, since only then can we give meaning to the concept `non-physical’. Physicality is defined in terms of interaction, which is in turn defined to be a correlated exchange of information. This is asserted to be the basis of any meaningful concept of epistemology. Hence, it is argued that a non-physical entity cannot `know’ anything about the world. Information transfer is then discussed in terms of quantum entanglement, and an argument for our perception of time is presented. It is then contended that the notion of `mind’ may be meaningfully discussed in the context of a quantum-theoretic framework.

In a recent series of papers and lectures, John Conway and Simon Kochen presented The Free Will Theorem. “It asserts, roughly, that if indeed we humans have free will, then elementary particles already have their own small share of this valuable commodity.” Perhaps the primary motivation of their papers was to place stringent constraints on quantum mechanical hidden variable theories, which they indeed do. Nevertheless, the notion of free will is crucial to the proof and they even speculate that the free will afforded to elementary particles is the ultimate explanation of our own free will. I don’t challenge the mathematics/logic of their proof but rather their premises. Free will and determinism are, for me, not nearly adequately clarified for them to form the bases of a theoretical proof. In addition, they take for granted supplemental concepts in quantum mechanics that are in need of further explanation. It’s also not clear to me what utility is afforded by the free will theorem, i.e., what, if anything, follows from it. Despite the cheeky subtitle of my essay, I do think that the explicit introduction of free will into discussions of hidden variables and other interpretations of quantum mechanics might help expose foibles in many of those deliberations. For this reason, I consider the Conway-Kochen free will theorem to be a positive contribution to the philosophy of quantum mechanics.

This short article concentrates on the conceptual aspects of the violation of Bell inequalities, and acts as a map to the 265 cited references. The article outlines (a) relevant characteristics of quantum mechanics, such as statistical balance and entanglement, (b) the thinking that led to the derivation of the original Bell inequality, and (c) the range of claimed implications, including realism, locality and others which attract less attention. The main conclusion is that violation of Bell inequalities appears to have some implications for the nature of physical reality, but that none of these are definite. The violations constrain possible prequantum (underlying) theories, but do not rule out the possibility that such theories might reconcile at least one understanding of locality and realism to quantum mechanical predictions. Violation might reflect, at least partly, failure to acknowledge the contextuality of quantum mechanics, or that data from different probability spaces have been inappropriately combined. Many claims that there are definite implications reflect one or more of (i) imprecise non-mathematical language, (ii) assumptions inappropriate in quantum mechanics, (iii) inadequate treatment of measurement statistics and (iv) underlying philosophical assumptions.

It is possible to construct a classical, macroscopic system which has a mathematical structure exactly the same as that of a quantum mechanical system, and which can be put into a state that has exactly the same probability predictions as an entangled quantum mechanical state. This paper presents a simple example, including a way in which the system can be measured so as to violate Bell’s inequalities. This classical simulation of a quantum system helps us to see which aspects of quantum mechanical systems are truly nonclassical.
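For comparison, a standard textbook calculation (independent of this paper’s construction; the angle choices here are one conventional option, not taken from the paper) shows the quantum target that any such classical simulation must reproduce: the singlet-state correlation E(a, b) = -cos(a - b), evaluated at suitably chosen measurement angles, gives a CHSH value of 2√2, the Tsirelson bound.

```python
import math

def E(a, b):
    # Singlet-state correlation for measurement angles a and b.
    return -math.cos(a - b)

# Angles chosen to maximize the CHSH expression (one standard choice).
a0, a1 = 0.0, math.pi / 2
b0, b1 = math.pi / 4, -math.pi / 4

S = E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)
print(abs(S))  # 2*sqrt(2) ~ 2.828 > 2, violating the CHSH inequality
```

Any local hidden-variable model is limited to |S| ≤ 2 at these same angles, which is why reaching 2√2 is the benchmark for a faithful classical simulation of entanglement statistics.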

I comment briefly on derivations of the Born rule presented by Masanes et al. and by Hossenfelder.
