International Journal of Quantum Foundations
https://ijqf.org
An online forum for exploring the conceptual foundations of quantum mechanics, quantum field theory and quantum gravity. Sat, 19 Sep 2020 09:13:22 +0000
Weekly Papers on Quantum Foundations (38)
https://ijqf.org/archives/6060
Sat, 19 Sep 2020 02:46:47 +0000

Finding the world in the wave function: some strategies for solving the macro-object problem
8:00 AM

Latest Results for Synthese
Abstract
Realists wanting to capture the facts of quantum entanglement in a metaphysical interpretation find themselves faced with several options: to grant some species of fundamental nonseparability, adopt holism, or (more radically) to view localized spacetime systems as ultimately reducible to a higher-dimensional entity, the quantum state or wave function. Those adopting the latter approach and hoping to view the macroscopic world as grounded in the quantum wave function face the macro-object problem. The challenge is to articulate the metaphysical relation obtaining between three-dimensional macro-objects and the wave function so that the latter may be seen in some sense as constituting the former. This paper distinguishes several strategies for doing so and defends one based on a notion of partial instantiation.
Creativity is generally thought to be the production of things that are novel and valuable (whether physical artefacts, actions, or ideas). Humans are unique in the extent of their creativity, which plays a central role in innovation and problem solving, as well as in the arts. But what are the cognitive sources of novelty? More particularly, what are the cognitive sources of stochasticity in creative production? I will argue that they belong to two broad categories. One is associative, enabling the selection of goal-relevant ideas that have become activated by happenstance in an unrelated context. The other relies on selection processes that leverage stochastic fluctuations in neural activity. At the same time, I will address a central puzzle, which is to understand how the outputs of stochastic processes can nevertheless generally fall within task constraints. While the components appealed to in the accounts that I offer are well established, the ways in which I combine them are new.
A simple argument proposes a direct link between realism about quantum mechanics and one kind of metaphysical holism: if elementary quantum theory is at least approximately true, then there are entangled systems with intrinsic whole states for which the intrinsic properties and spatiotemporal arrangements of salient subsystem parts do not suffice. Initially, the proposal is compelling: we can find variations on such reasoning throughout influential discussions of entanglement. Upon further consideration, though, this simple argument proves a bit too simple. To get such metaphysically robust consequences out, we need to put more than minimal realism in. This paper offers a diagnosis: our simple argument seems so compelling thanks to an equivocation. The predictions of textbook quantum theory already resonate with familiar holistic slogans; for realists, then, any underlying reality, conforming to such predictions, also counts as holistic in some sense or other, if only by association. Such associated holism, though, does not establish the sort of specific, robust supervenience failure claimed by our simple argument. While it may be natural to slide to this stronger conclusion, facilitating the slide is not minimal realism per se but an additional explanatory assumption about how and why reality behaves in accordance with our theory: roughly, quantum theory accurately captures patterns in the features and behaviors of physical reality because some underlying metaphysical structure constrains reality to exhibit these patterns. Along with the diagnosis comes a recommendation: we can and should understand one traditional disagreement about the metaphysics of entanglement as another manifestation of a familiar and more general conflict between reductive and nonreductive conceptions of metaphysical theorizing. Such reframing makes clearer what resources reductionists have for resisting the simple argument’s challenge from quantum holism. 
It also has an important moral for their opponents. Traditional focus on whole-part supervenience failure distracts from a root disagreement about metaphysical structure and its role in our theorizing. Nonreductionists fond of our simple argument would be better off tackling this root directly.
This paper critically assesses whether quantum entanglement can be made compatible with Humean supervenience. After reviewing the prima facie tension between entanglement and Humeanism, I outline a recently proposed Humean response, and argue that it is subject to two problems: one concerning the determinacy of quantities, and one concerning its relationship to scientific practice.
Quantum mechanics seems to portray nature as nonseparable, in the sense that it allows spatiotemporally separated entities to have states that cannot be fully specified without reference to each other. This is often said to implicate some form of “holism.” We aim to clarify what this means, and why this seems plausible. Our core idea is that the best explanation for nonseparability is a “common ground” explanation (modeled after common cause explanations), which casts nonseparable entities in a holistic light, as scattered reflections of a more unified underlying reality.
An influential theory has it that metaphysical indeterminacy occurs just when reality can be made completely precise in multiple ways. That characterization is formulated by employing the modal apparatus of ersatz possible worlds. As quantum physics taught us, reality cannot be made completely precise. I meet the challenge by providing an alternative theory which preserves the use of ersatz worlds but rejects the precisificational view of metaphysical indeterminacy. The upshot of the proposed theory is that it is metaphysically indeterminate whether p just in case it is neither true nor false that p, and no terms in ‘p’ are semantically defective. In other words, metaphysical indeterminacy arises when the world cannot be adequately described by a complete set of sentences defined in a semantically nondefective language. Moreover, the present theory provides a reductive analysis of metaphysical indeterminacy, unlike its influential predecessor. Finally, I argue that any adequate logic of a language with an indeterminate subject matter is neither compositional nor bivalent.
One way that philosophers have attempted to defend free will against the threat of fatalism and against the threat from divine beliefs has been to endorse timelessness views (about propositions and God’s beliefs, respectively). In this paper, I argue that, in order to respond to general worries about fatalism and divine beliefs, timelessness views must appeal to the notion of dependence. Once they do this, however, their distinctive position as timelessness views becomes otiose, for the appeal to dependence, if it helps at all, would itself be sufficient to block worries about fatalism and divine beliefs. I conclude by discussing some implications for dialectical progress.
The existence of nonlocal correlations between outcomes of measurements in quantum entangled systems strongly suggests that we are dealing with some form of causation here. An assessment of this conjecture in the context of the collapse interpretation of quantum mechanics is the primary goal of this paper. Following the counterfactual approach to causation, I argue that the details of the underlying causal mechanism which could explain the nonlocal correlations in entangled states strongly depend on the adopted semantics for counterfactuals. Several relativistically invariant interpretations of spatiotemporal counterfactual conditionals are discussed, and the corresponding causal stories describing interactions between parts of an entangled system are evaluated. It is observed that the most controversial feature of the postulated causal connections is not so much their nonlocal character as a peculiar type of circularity that affects them.
We develop a unified framework for understanding the sign of fermion-mediated interactions by exploiting the symmetry classification of Green’s functions. In particular, we establish a theorem regarding the sign of fermion-mediated interactions in systems with chiral symmetry. The strength of the theorem is demonstrated within multiple examples, with an emphasis on electron-mediated interactions in materials.
It is difficult to formulate achievable sensitivity bounds for quantum multiparameter estimation. We consider a special case, one parameter from many: many parameters of a process are unknown, and the task is to estimate a specific linear combination of them without the ability to control any of the parameters. Although superficially similar to single-parameter estimation, the problem retains genuinely multiparameter aspects. Using geometric reasoning, we demonstrate necessary and sufficient conditions for saturating the fundamental and attainable quantum-process bound in this context.
In a microscopic quantum system one cannot perform a simultaneous measurement of particle and wave properties. This, however, may not be true for macroscopic quantum systems. As a demonstration, we propose to measure the local macroscopic current passed through two slits in a superconductor. According to the theory based on the linearized Ginzburg-Landau equation for the macroscopic pseudo wave function, the streamlines of the measured current should have the same form as particle trajectories in the Bohmian interpretation of quantum mechanics. By an explicit computation we find that the streamlines should show a characteristic wiggling, which is a consequence of quantum interference.
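The wiggling of interference streamlines can be illustrated with a toy calculation (ours, not the paper's Ginzburg-Landau computation): for any complex field ψ on a line, the streamline velocity is v = (ħ/m) Im(ψ′/ψ), and evaluating it for a hypothetical superposition of two Gaussian beams shows sign-flipping oscillations. All parameter values below are illustrative assumptions.

```python
import numpy as np

# Toy illustration of "wiggling" streamlines (a sketch, not the paper's
# Ginzburg-Landau computation). For a complex field psi on a line, the
# streamline velocity is v = (hbar/m) * Im(psi'/psi). We evaluate it for a
# hypothetical superposition of two Gaussian beams from slits at x = +-d.
hbar = m = 1.0                  # natural units, purely illustrative
d, sigma, k = 2.0, 1.0, 5.0     # assumed slit half-separation, width, wavenumber

x = np.linspace(-6.0, 6.0, 2001)
beam = lambda x0: np.exp(-(x - x0)**2 / (2 * sigma**2) + 1j * k * np.abs(x - x0))
psi = beam(d) + beam(-d)        # two-slit superposition

v = (hbar / m) * np.imag(np.gradient(psi, x) / psi)  # streamline velocity

# Quantum interference makes v oscillate in sign across the pattern:
# the "characteristic wiggling" mentioned in the abstract.
sign_changes = int(np.count_nonzero(np.diff(np.sign(v))))
print("streamline velocity sign changes:", sign_changes)
```

With a single beam the phase gradient is smooth and the wiggles disappear, which is the qualitative point of the proposal.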
It has been shown that certain quantum walks give rise to relativistic wave equations, such as the Dirac and Weyl equations, in their long-wavelength limits. This intriguing result raises the question of whether something similar can happen in the multiparticle case. We construct a one-dimensional quantum cellular automaton (QCA) model which matches the quantum walk in the single particle case, and which approaches the quantum field theory of free fermions in the long-wavelength limit. However, we show that this class of constructions does not generalize to higher spatial dimensions in any straightforward way, and that no construction with similar properties is possible in two or more spatial dimensions. This rules out the most common approaches based on QCAs. We suggest possible methods to overcome this barrier while retaining locality.
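The single-particle statement (quantum walk tends to a Dirac dispersion at long wavelengths) can be checked numerically in a minimal sketch. The coin parametrization below is an assumption of the sketch, not the paper's QCA construction.

```python
import numpy as np

# Minimal numerical check (a sketch, not the paper's QCA construction) of the
# single-particle statement: a 1D discrete-time quantum walk has a massive
# Dirac dispersion at long wavelengths. One step is a spin-dependent shift
# followed by a coin rotation by theta; in momentum space the step unitary is
# U(k) = diag(exp(-ik), exp(ik)) @ R(theta), and its eigenphases w(k) obey
# cos w = cos(theta) cos(k), so w ~ +-sqrt(theta**2 + k**2) for small theta, k.
theta = 0.05  # small coin angle, playing the role of the mass

def walk_eigenphase(k):
    shift = np.diag([np.exp(-1j * k), np.exp(1j * k)])
    coin = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
    return np.max(np.abs(np.angle(np.linalg.eigvals(shift @ coin))))

for k in (0.01, 0.05, 0.10):
    print(k, walk_eigenphase(k), np.sqrt(theta**2 + k**2))  # near-equal columns
```

The agreement degrades as k approaches the Brillouin-zone edge, which is exactly the regime where the lattice model and the continuum Dirac equation part ways.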
In 1934 Enrico Fermi accepted an invitation to deliver lectures in Argentina, Brazil and Uruguay. He arrived in Buenos Aires on July 30, lectured in Buenos Aires, Cordoba, La Plata and Montevideo, and then moved on August 18 to Sao Paulo via Santos and Rio de Janeiro; he traveled back from Rio to Naples on September 1st. His visit attracted wide attention, and halls were crowded despite the fact that he lectured in Italian. The University of Buenos Aires recorded his five lectures and transcribed them in Spanish. They contain the first public presentations of the theory of beta decay and of the works on artificial radioactivity started by the via Panisperna group, but are not included in Fermi’s Collected Works edited by the Accademia dei Lincei in Rome and by the University of Chicago, although listed in the Bibliography. In this paper we present the transcription of Fermi’s five lectures in Buenos Aires, a summary of the lecture in La Plata and an extended summary of the lecture in Cordoba, translating them into English for the first time.
I critically discuss two dogmas of the “dynamical approach” to spacetime in general relativity, as advanced by Harvey Brown [Physical Relativity (2005) Oxford: Oxford University Press] and collaborators. The first dogma is that positing a “spacetime geometry” has no implications for the behavior of matter. The second dogma is that postulating the “Strong Equivalence Principle” suffices to ensure that matter is “adapted” to spacetime geometry. I conclude by discussing “spacetime functionalism”. The discussion is presented in reaction to and sympathy with recent work by James Read [“Explanation, geometry, and conspiracy in relativity theory” (20??) Thinking about Spacetime. Boston: Birkhäuser].
We propose a generalization of the recently proposed holographic duality between spin networks and superstrings, and show that it can provide a possible solution to the cosmological constant problem.
The Einstein Equivalence Principle plays a pivotal role in our understanding of gravity and spacetime. In its weak form, the weak equivalence principle (WEP), it implies the universality of free fall. To date, a relativistic test of the WEP is missing from the experimental record. In this work, we propose a novel scheme for testing the WEP using frequency measurements. Our proposal consists of the comparison of high-precision clocks co-moving with a freely falling frame. In the presence of a WEP violation, described by the Eötvös parameter $\delta$, we demonstrate the feasibility of measuring the resulting changes in clock rates. In contrast to traditional tests, which measure the difference in the Eötvös parameter between two materials of different composition, our proposal allows for measuring the Eötvös parameter of a single “test” body. It therefore potentially opens a new window for tests of the WEP. By searching for a daily variation of the frequency difference between strontium optical clocks connected by optical fiber links, we obtain an upper limit on the Eötvös parameter for the Earth of $\delta_{\text{E}}=(0.3\pm0.9)\times10^{-4}$.
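A back-of-envelope reading of why clock comparisons can bound such a parameter (ours, not the authors' analysis; the simple redshift model and all numbers are illustrative assumptions):

```python
# Back-of-envelope sketch (not the authors' analysis; all numbers
# illustrative): a clock at gravitational potential U runs with fractional
# frequency shift U/c**2. If the WEP is violated with Eotvos parameter delta,
# a "test" clock picks up an extra shift of roughly delta * U / c**2, which is
# the kind of signal a clock-comparison experiment can bound.
c = 299_792_458.0            # speed of light, m/s
GM_earth = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
R_earth = 6.371e6            # mean Earth radius, m

U = GM_earth / R_earth       # magnitude of Earth's surface potential, m^2/s^2
delta = 1e-4                 # hypothetical Eotvos parameter, the order of the
                             # upper limit quoted in the abstract

anomalous_shift = delta * U / c**2
print(f"anomalous fractional frequency shift ~ {anomalous_shift:.1e}")
```

A shift at this level (parts in 10^14) is within reach of modern optical clocks, which is what makes the proposal plausible.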
When is decoherence “effectively irreversible”? Here we examine this central question of quantum foundations using the tools of quantum computational complexity. We prove that, if one had a quantum circuit to determine whether a system was in an equal superposition of two orthogonal states (for example, the $|\text{Alive}\rangle$ and $|\text{Dead}\rangle$ states of Schrödinger’s cat), then with only a slightly larger circuit, one could also $\mathit{swap}$ the two states (e.g., bring a dead cat back to life). In other words, observing interference between the $|\text{Alive}\rangle$ and $|\text{Dead}\rangle$ states is a “necromancy-hard” problem, technologically infeasible in any world where death is permanent. As for the converse statement (i.e., ability to swap implies ability to detect interference), we show that it holds modulo a single exception, involving unitaries that (for example) map $|\text{Alive}\rangle$ to $|\text{Dead}\rangle$ but $|\text{Dead}\rangle$ to $-|\text{Alive}\rangle$. We also show that these statements are robust—i.e., even a $\mathit{partial}$ ability to observe interference implies partial swapping ability, and vice versa. Finally, without relying on any unproved complexity conjectures, we show that all of these results are quantitatively tight. Our results have possible implications for the state dependence of observables in quantum gravity, the subject that originally motivated this study.
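The detect-interference/swap correspondence already shows up in the simplest single-qubit case, sketched below (a toy illustration, not the paper's general construction):

```python
import numpy as np

# Single-qubit toy version of the correspondence described above (an
# illustration, not the paper's general construction). The Hadamard gate
# "detects interference": it maps the cat states (|Alive> +- |Dead>)/sqrt(2)
# to orthogonal, perfectly distinguishable outcomes. A slightly larger
# circuit built from the same gate, H Z H, performs the |Alive> <-> |Dead> swap.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])  # the Alive/Dead swap ("necromancy")

alive = np.array([1.0, 0.0])
dead = np.array([0.0, 1.0])
plus = (alive + dead) / np.sqrt(2)      # cat state with + interference
minus = (alive - dead) / np.sqrt(2)     # cat state with - interference

assert np.allclose(H @ plus, alive)     # H distinguishes the two cat states...
assert np.allclose(H @ minus, dead)
assert np.allclose(H @ Z @ H, X)        # ...and H Z H is the swap circuit
print("detecting the cat's coherence and swapping Alive/Dead use the same gate")
```

The paper's point is that this resource equivalence survives, quantitatively, far beyond the single-qubit toy.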
PhilSci-Archive: No conditions. Results ordered Date Deposited.
Kurpaska, Sławomir and Tyszka, Apoloniusz (2020) The physical limits of computation inspire an open problem that concerns abstract computable sets X⊆N and cannot be formalized in the set theory ZFC as it refers to our current knowledge on X. [Preprint]
PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.
Author(s): Gianmaria Falasco and Massimiliano Esposito
We show that the entropy production rate bounds the rate at which physical processes can be performed in stochastic systems far from equilibrium. In particular, we prove the fundamental trade-off $\langle \dot{S}_e \rangle T \geq k_B$ between the entropy flow $\langle \dot{S}_e \rangle$ into the reservoirs and the mean time $T$ to complete any process …
[Phys. Rev. Lett. 125, 120604] Published Wed Sep 16, 2020
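A minimal numeric reading of the quoted bound (our sketch, not the paper's derivation): rearranged, it says the mean completion time obeys $T \geq k_B / \langle \dot{S}_e \rangle$.

```python
# Numeric reading (a sketch) of the trade-off <S_e_dot> * T >= k_B quoted
# above: for a given mean entropy flow rate into the reservoirs, the bound
# sets a minimal time to complete any process far from equilibrium.
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def min_process_time(mean_entropy_flow_rate):
    """Lower bound on the completion time T, given <dS_e/dt> in J/(K*s)."""
    return k_B / mean_entropy_flow_rate

# A process whose entropy flow averages 10 k_B per second cannot, by the
# bound, complete in less than 0.1 seconds.
print(min_process_time(10 * k_B))
```

The faster one dissipates, the shorter the allowed completion time: a speed limit paid for in entropy.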
A weak-to-strong quantum measurement transition has been observed in a single trapped-ion system, where the ion’s internal electronic state and its vibrational motion play the roles of the measured system and the measuring pointer.
Nature Physics, Published online: 07 September 2020; doi:10.1038/s4156702010084

The radiation emission rate from gravity-related wave function collapse is calculated and the results of a dedicated experiment at the Gran Sasso laboratory are reported, ruling out the natural parameter-free version of the Diósi–Penrose model.
This paper aims at reproducing quantum mechanical (QM) spin and spin entanglement results using a realist, stochastic, and local approach, without the standard QM mathematical formulation. The concrete model proposed includes the description of Stern-Gerlach apparatuses and of Bell test experiments. Single-particle trajectories are explicitly evaluated as functions of a few stochastic variables that the particles are assumed to carry. QM predictions are retrieved as probability distributions over similarly prepared ensembles of particles. Notably, it is shown that the proposed model, despite being both local and realist, is able to violate the Bell-CHSH inequalities by exploiting the coincidence loophole, thus intrinsically renouncing one of Bell’s assumptions.

Leggett-Garg tests for macrorealism: interference experiments and the simple harmonic oscillator. (arXiv:2009.03856v2 [quant-ph] UPDATED)
Leggett-Garg (LG) tests for macrorealism were originally designed to explore quantum coherence on the macroscopic scale. Interference experiments and systems modelled by harmonic oscillators provide useful examples of situations in which macroscopicity has been approached experimentally; they may be turned into LG tests with a dichotomic variable Q by simple partitionings of a continuous variable such as position. Applying this approach to the double-slit experiment, in which measurements at the slits and at the screen are considered, we find that LG violations are always accompanied by destructive interference. The converse is not true in general: there are nontrivial regimes in which there is destructive interference but the two-time LG inequalities are satisfied, which implies that it is in fact often possible to assign (indirectly determined) probabilities for the interferometer paths. Similar features have been observed in recent work involving an LG analysis of a Mach-Zehnder interferometer, and we compare with those results. We also compare with the related problem in which a more direct determination of the paths is carried out using a variable-strength measurement at the slits, and the resulting deterioration of the interference pattern is examined. We extend the analysis to the triple-slit experiment, where we find examples of some surprising relationships between LG inequalities and NSIT conditions that do not exist for dichotomic variables, including a violation of the Lüders bound. Finally, we analyse a two-time LG inequality for the simple harmonic oscillator and find an analytically tractable example showing a two-time LG violation with a Gaussian initial state, echoing recent results of Bose et al. (Phys. Rev. Lett. 120, 210402 (2018)).

The Beginning of the Nuclear Age. (arXiv:2009.05001v1 [physics.hist-ph])
The article below is based on lectures delivered remotely to new students during orientation. It presents the quantum theory tree from its inception a century ago to today. The main focus is on the nuclear physics / HEP branch.

The philosophical underpinning of the absorber theory of radiation
In this paper we examine the possibility of a Josephson AC effect between two superconductors induced by the Earth’s gravitational field, making use of the gravito-Maxwell formalism. The theoretical framework exploits the symmetry between the weak-field expansion of the gravitational field and the standard Maxwell formulation, combined with the physics of the Josephson junction. We also suggest a suitable experimental setup, analysing the possible difficulties in the related measurements.

Quantum entanglement and the non-orientability of spacetime. (arXiv:2009.04990v1 [hep-th])
We argue, in the context of the AdS/CFT correspondence, that the degree of entanglement on the CFT side determines the orientation of space and time on the dual global spacetime. That is, the global spacetime dual to entangled copies of the field theory is non-orientable, while the product state of the CFTs results in an orientable spacetime. As a result, disentangling the degrees of freedom between two copies of the CFT implies, on the gravity side, a transition from a non-orientable spacetime to a spacetime having a definite orientation of space and time, thus an orientable spacetime. We conclude by showing that the topology change induced by decreasing the entanglement between two sets of degrees of freedom corresponds to a topological blow-down operation.

The End of a Black Hole’s Evaporation — Part I. (arXiv:2009.05016v1 [gr-qc])
With the direct detection of gravitational waves by the advanced LIGO detector, a new “window” onto quantum gravity phenomenology has been opened. At present, these detectors achieve the sensitivity to detect length variations $\delta L \approx 10^{-17}\text{–}10^{-21}$ meters. Recently, a more stringent upper bound on the dimensionless parameter $\beta_0$, which carries the effect of the generalized uncertainty principle (GUP), has been given, corresponding to the intermediate length scale $l_{im}= \sqrt{\beta_0}\, l_{pl} \sim 10^{-23}$ m. Hence the flavour of the generalized uncertainty principle could be probed by observing the response of the vibrations of phonon modes in such resonant detectors in the near future. In this paper, therefore, we calculate the resonant frequencies and transition rates induced by the incoming gravitational waves on these detectors in the generalized uncertainty principle framework. It is observed that the generalized uncertainty principle leaves its signature in both the time-independent and time-dependent parts of the gravitational wave–harmonic oscillator Hamiltonian. We also make an upper-bound estimate of the GUP parameter.

What Have Google’s Random Quantum Circuit Simulation Experiments Demonstrated about Quantum Supremacy?
2:26 AM

Martens, Niels C.M. and Lehmkuhl, Dennis (2020) Dark Matter = Modified Gravity? Scrutinising the spacetime–matter distinction through the modified gravity/dark matter lens. Studies in History and Philosophy of Modern Physics. ISSN 1355-2198

Classical Particle Indistinguishability, Precisely
Wednesday, 9 September 2020, 8:40 AM

PhilSci-Archive: No conditions. Results ordered Date Deposited.
How do we know about other minds on the basis of perception? The two most common answers to this question are that we literally perceive others’ mental states, or that we infer their mental states on the basis of perceiving something else. In this paper, I argue for a different answer. On my view, we don’t perceive mental states, and yet perceptual experiences often immediately justify mental state attributions. In a slogan: other minds are neither seen nor inferred. I argue that this view offers the best explanation of our deeply equivocal intuitions about perceptionbased mental state attributions, and also holds substantial interest for the epistemology of perception more generally.
Although “Heisenberg’s uncertainty principle” is represented by a rigorously proven relation about intrinsic indeterminacy in quantum states, Heisenberg’s error-disturbance relation (EDR) has commonly been regarded as another aspect of the principle. Recent developments in quantum measurement theory, however, have made Heisenberg’s EDR testable, so that its violations can be observed. Here, we study the EDR for Stern-Gerlach measurements. We conclude that their EDR is close to the theoretical optimum and, surprisingly, that even the original Stern-Gerlach experiment of 1922 violates Heisenberg’s EDR.

A Personal History of the Hastings-Michalakis Proof of Hall Conductance Quantization. (arXiv:2009.01645v1 [physics.hist-ph])
This article compares treatments of the Stern–Gerlach experiment across different physical theories, building up to a novel analysis of electron spin measurement in the context of classical Dirac field theory. Modeling the electron as a classical rigid body or point particle, we can explain why the entire electron is always found at just one location on the detector (uniqueness) but we cannot explain why there are only two locations where the electron is ever found (discreteness). Using nonrelativistic or relativistic quantum mechanics, we can explain both uniqueness and discreteness. Moving to more fundamental physics, both features can be explained within a quantum theory of the Dirac field. In a classical theory of the Dirac field, the rotating charge of the electron can split into two pieces that each hit the detector at a different location. In this classical context, we can explain a feature of electron spin that is often described as distinctively quantum (discreteness) but we cannot explain another feature that could be explained within any of the other theories (uniqueness).

Sequential dynamics of complex networks in mind: Consciousness and creativity
Atom interferometers have been developed over the last three decades as powerful new tools to investigate gravity. Here I describe past and ongoing experiments, with an outlook on what I think are the main prospects in this field and the potential to search for new physics.

An Exact False Vacuum Decay Rate. (arXiv:2009.01535v1 [hep-th])
We discuss an exact false vacuum decay rate at one loop for a real and a complex scalar field in a quartic-quartic potential with two tree-level minima. The bounce solution is used to compute the functional determinant from both fluctuations. We obtain the finite product of eigenvalues and remove the translational zero modes. The orbital modes are regularized with the zeta function, and we end up with a complete decay rate after renormalization. We derive simple expansions in the thin- and thick-wall limits and investigate their validity.

Predictive power of grand unification from quantum gravity. (arXiv:1909.07318v2 [hep-th] UPDATED)
If a grand-unified extension of the asymptotically safe Reuter fixed point for quantum gravity exists, it determines free parameters of the grand-unified scalar potential. All quartic couplings take their fixed-point values in the trans-Planckian regime. They are irrelevant parameters that are, in principle, computable for a given particle content of the grand-unified model. In turn, the direction of spontaneous breaking of the grand-unified gauge symmetry becomes predictable. For the flow of the couplings below the Planck mass, gauge and Yukawa interactions compete for the determination of the minimum of the effective potential.

Testing ER=EPR. (arXiv:2002.08178v2 [hep-th] UPDATED)
We discuss a few tests of the ER=EPR proposal. We consider certain conceptual issues as well as explicit physical examples that could be experimentally realized. In particular, we discuss the role of the Bell bounds, the large N limit, as well as the consistency of certain theoretical assumptions underlying the ER=EPR proposal. As explicit tests of the ER=EPR proposal we consider limits coming from the entropy-energy relation and certain limits coming from measurements of the speed of light as well as measurements of effective weights of entangled states. We also discuss various caveats of such experimental tests of the ER=EPR proposal.

The cosmological constant and the use of cutoffs. (arXiv:2009.00728v1 [hep-th] CROSS LISTED)
Of the contributions to the cosmological constant, zero-point energy and self-energy contributions scale as $\Lambda^4$, where $\Lambda$ is an ultraviolet cutoff used to regulate the calculations. I show that such contributions vanish when calculated in perturbation theory. This demonstration uses a little-known modification to perturbation theory found by Honerkamp and Meetz and by Gerstein, Jackiw, Lee and Weinberg, which comes into play when using cutoffs and interactions with multiple derivatives, as found in chiral theories and gravity. In a path integral treatment, the new interaction arises from the path integral measure. This reduces the sensitivity of the cosmological constant to the high energy cutoff significantly, although it does not resolve the cosmological constant problem. The feature removes one of the common motivations for supersymmetry. It also calls into question some of the results of the Asymptotic Safety program. Covariance and quadratic cutoff dependence are also briefly discussed.

Distinguishing topological and causal explanation
4:00 AM

Author(s): A. Vinante, M. Carlesso, A. Bassi, A. Chiasera, S. Varas, P. Falferi, B. Margesin, R. Mezzena, and H. Ulbricht
Despite the unquestionable empirical success of quantum theory, witnessed by the recent uprising of quantum technologies, the debate on how to reconcile the theory with the macroscopic classical world is still open. Spontaneous collapse models are one of the few testable solutions so far proposed. I…
Author(s): Karel Proesmans, Jannik Ehrich, and John Bechhoefer
Entropy produced by erasure of information in finite time can be minimized regardless of the final state by the use of optimal protocols that control the potential landscape.
Meaning relativism and subjective idealism
Tuesday, 1 September 2020, 8:00 AM

Abstract
The paper discusses an objection, put forward by, among others, John McDowell, to Kripke’s Wittgenstein’s non-factualist and relativist view of semantic discourse. The objection goes roughly as follows: while it is usually possible to be a relativist about a given domain of discourse without being a relativist about anything else, relativism about semantic discourse entails global relativism, which in turn entails subjective idealism, which we can reasonably assume to be false. The paper’s first section sketches Kripke’s Wittgenstein’s ideas about semantic discourse and gives a fully explicit formulation of the objection. The second section describes and briefly discusses the formal apparatus needed to evaluate the objection, which is basically equivalent to John MacFarlane’s recent development of David Kaplan’s classic semantic framework. Finally, the third section explains in detail why the objection fails. I show that even though relativism about semantic discourse does entail a form of global relativism, the relativism in question does not entail anything like Berkeleyan or Fichtean idealism. This particular kind of relativism holds that which character (in Kaplan’s sense) is associated with a given utterance depends on what MacFarlane calls “the context of assessment”.

A consciousness-based quantum objective collapse model
Tuesday, 1 September 2020, 8:00 AM

Latest Results for Synthese
Abstract
Ever since the early days of quantum mechanics it has been suggested that consciousness could be linked to the collapse of the wave function. However, no detailed account of such an interplay is usually provided. In this paper we present an objective collapse model (a variation of the Continuous Spontaneous Localization model) where the collapse operator depends on integrated information, which has been argued to measure consciousness. By doing so, we construct an empirically adequate scheme in which superpositions of conscious states are dynamically suppressed. Unlike other proposals in which “consciousness causes the collapse of the wave function,” our model is fully consistent with a materialistic view of the world and does not require the postulation of entities suspected of lying outside of the quantum realm.
A challenge for Super-Humeanism: the problem of immanent comparisons
Tuesday, 1 September 2020, 8:00 AM

Latest Results for Synthese
Abstract
According to the doctrine of Super-Humeanism (Esfeld in Synthese. http://dx.doi.org/10.1007/s11229-017-1426-8, 2017), the world’s mosaic consists only of permanent matter points and changing spatial relations, while all the other entities and features figuring in scientific theories are nomological parameters, whose role is merely to build the best law system. In this paper, I develop an argument against Super-Humeanism by pointing out that it is vulnerable to, and does not have the resources to solve, the well-known problem of immanent comparisons. Firstly, I show that it cannot endorse a fundamentalist solution à la Lewis, since its two pillars—a minimalist ontology and a best system account of lawhood—would generate, together, a tedious problem of internal coherence. Secondly, I consider anti-fundamentalist strategies, proposed within Humeanism, and find them inapplicable to the Super-Humean doctrine. The concern is that, since it is impossible to choose the best law system within Super-Humeanism, this doctrine may be charged with incoherence.
On the Classification between ψ-Ontic and ψ-Epistemic Ontological Models
Monday, 31 August 2020, 1:13 PM

PhilsciArchive: No conditions. Results ordered Date Deposited.
The main purpose of this article is to try to understand the connection between the physical universe and the mathematical principles that underlie the cosmological account of the Timaeus. Aristotle’s common criticism of Plato’s cosmology is that he confuses mathematical and physical constructions. Indeed, the Timaeus is the first cosmology founded on mathematical physics. We give a new translation of Timaeus 31b–32b, an important passage for understanding the connection between mathematics and physics in Timaeus’ cosmological construction. This article is the first of a series about the khôra. We will restrict our focus here to the much-debated question of the primary elements in the khôra, the components of the whole physical world reduced, in an extraordinarily elegant construction, to two right triangles.
Quantum mechanics has lacked a widely recognized interpretation since its birth. Many interpretations are under consideration because they are difficult to disprove experimentally. In this paper, we show that the results of a recent experiment go against one of them: the pilot-wave interpretation. This is because the key assumption of this interpretation, particle locality, contradicts the assumption of wave-function nonlocality on which the experiment is founded. This is a rare example of a quantum model and an experiment that are not indifferent to the interpretation of quantum mechanics.
Contrary to what Lambare [arXiv:2008.00369] assumes, in non-Newtonian calculus (a calculus based on non-Diophantine arithmetic) an integral is typically given by a nonlinear map. This is the technical reason why all the standard proofs of Bell-type inequalities fail if non-Newtonian hidden variables are taken into account. From the non-Newtonian perspective, Bell’s inequality is a property of a limited and unphysical class of hidden-variable models. An explicit counterexample to Bell’s theorem can be easily constructed.
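As an editorial aside (not part of the quoted abstract): the "limited class" in question is the standard setting where expectation values are ordinary linear averages over local deterministic strategies, and there the CHSH bound can be checked exhaustively. A minimal Python sketch of that standard setting, assuming the usual two-setting, ±1-outcome scenario:

```python
import itertools

# Deterministic local strategies: Alice assigns +/-1 outcomes to her two
# settings (a, a'), Bob to his two settings (b, b').  Any ordinary (linear)
# average over such strategies obeys the CHSH bound |S| <= 2.
def chsh(A_a, A_ap, B_b, B_bp):
    return A_a * B_b + A_a * B_bp + A_ap * B_b - A_ap * B_bp

values = [chsh(*s) for s in itertools.product([-1, 1], repeat=4)]
assert all(abs(v) <= 2 for v in values)  # holds for every deterministic strategy
print(max(values), min(values))          # 2 -2
```

Every deterministic strategy yields S = ±2, so any linear mixture stays within [-2, 2]; the abstract's claim is that a non-Newtonian (nonlinear) integral need not respect precisely this averaging step.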
The delayed-choice quantum eraser has long been a subject of controversy, and has been regarded as anything from incomprehensible to retrocausal in time. Here the delayed-choice quantum eraser is theoretically analyzed using standard quantum mechanics. Employing a Mach-Zehnder interferometer, instead of a conventional two-slit interference setup, brings surprising clarity. Some common mistakes in interpreting the experiment are pointed out. It is demonstrated that in the delayed mode there is no which-way information present after the particle is registered on the screen or the final detectors, contrary to popular belief. However, it is shown that another kind of path information is present even after the particle is registered in the final detectors. The registered particle can be used to predict the results of certain yet-to-be-made measurements on the which-way detector. This novel correlation can be tested in a careful experiment. It is consequently argued that there is no big mystery in the experiment, and no retrocausal effect whatsoever.
We consider a formal discretisation of Euclidean quantum gravity defined by a statistical model of random $3$-regular graphs and making use of the Ollivier curvature, a coarse analogue of the Ricci curvature. Numerical analysis shows that the Hausdorff and spectral dimensions of the model approach $1$ in the joint classical-thermodynamic limit and we argue that the scaling limit of the model is the circle of radius $r$, $S^1_r$. Given mild kinematic constraints, these claims can be proven with full mathematical rigour: speaking precisely, it may be shown that for $3$-regular graphs of girth at least $4$, any sequence of action-minimising configurations converges in the sense of Gromov–Hausdorff to $S^1_r$. We also present strong evidence for the existence of a second-order phase transition through an analysis of finite-size effects. This—essentially solvable—toy model of emergent one-dimensional geometry is meant as a controllable paradigm for the non-perturbative definition of random flat surfaces.
The way ahead for fusion
Friday, 28 August 2020, 8:00 AM

Nature Physics – Issue – nature.com science feeds
Nature Physics, Published online: 28 August 2020; doi: 10.1038/s41567-020-01043-9
As the construction of the ITER tokamak enters its next phase — the machine assembly — now is a good time for a recap of the history and current status of nuclear fusion research.
PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.
Author(s): Peter J. Brown and Roger Colbeck
Alice and Bob each have half of a pair of entangled qubits. Bob measures his half and then passes his qubit to a second Bob who measures again and so on. The goal is to maximize the number of Bobs that can have an expected violation of the Clauser-Horne-Shimony-Holt (CHSH) Bell inequality with the s…
[Phys. Rev. Lett. 125, 090401] Published Mon Aug 24, 2020
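For context on the CHSH violation discussed above: the single-pair quantum maximum is Tsirelson's bound 2√2 ≈ 2.83. A minimal numpy sketch (an editorial illustration, not the sequential-Bobs protocol of the paper) computing the CHSH value for a maximally entangled pair at the optimal measurement angles:

```python
import numpy as np

# Spin observable in the x-z plane at angle t: cos(t) Z + sin(t) X
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
A = lambda t: np.cos(t) * Z + np.sin(t) * X

phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # |Phi+> = (|00>+|11>)/sqrt(2)
E = lambda a, b: phi @ np.kron(A(a), A(b)) @ phi    # correlator E(a,b) = cos(a-b)

a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4  # optimal CHSH angles
S = E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)
print(S)   # ~2.828..., i.e. 2*sqrt(2), exceeding the classical bound of 2
```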
Sandu Popescu, Ana Belén Sainz, Anthony J. Short, and Andreas Winter

PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.
Author(s): Sandu Popescu, Ana Belén Sainz, Anthony J. Short, and Andreas Winter
Even in the presence of conservation laws, one can perform arbitrary transformations on a system if given access to a suitable reference frame, since conserved quantities may be exchanged between the system and the frame. Here we explore whether these quantities can be separated into different parts…
[Phys. Rev. Lett. 125, 090601] Published Mon Aug 24, 2020
PhilsciArchive: No conditions. Results ordered Date Deposited.
Jaksland, Rasmus and Linnemann, Niels (2020) Holography without holography: How to turn inter-representational into intra-theoretical relations in AdS/CFT. Studies in History and Philosophy of Modern Physics, 71. pp. 101-117. ISSN 1355-2198
The Dynamical Approach to Spin-2 Gravity
Sunday, 23 August 2020, 3:48 PM

PhilsciArchive: No conditions. Results ordered Date Deposited.
Salimkhani, Kian (2020) The Dynamical Approach to Spin-2 Gravity. Studies in History and Philosophy of Modern Physics. ISSN 1355-2198
Jury Theorems for Peer Review
Saturday, 22 August 2020, 3:18 PM

PhilsciArchive: No conditions. Results ordered Date Deposited.
De Baerdemaeker, Siska and Boyd, Nora Mills (2020) Jump Ship, Shift Gears, or Just Keep on Chugging: Assessing the Responses to Tensions between Theory and Evidence in Contemporary Cosmology. [Preprint]
Weekly Papers on Quantum Foundations (34)
https://ijqf.org/archives/6046
Sat, 22 Aug 2020 01:38:59 +0000
https://www.ijqf.org/?p=6046
A quantum system’s state is identified with a density matrix. Though their probabilistic interpretation is rooted in ensemble theory, density matrices embody a known shortcoming. They do not completely express an ensemble’s physical realization. Conveniently, when working only with the statistical outcomes of projective and positive operator-valued measurements this is not a hindrance. To track ensemble realizations and so remove the shortcoming, we explore geometric quantum states and explain their physical significance. We emphasize two main consequences: one in quantum state manipulation and one in quantum thermodynamics.
Building on parallels between geometric quantum mechanics and classical mechanics, we explore an alternative basis for quantum thermodynamics that exploits the differential geometry of the underlying state space. We develop both microcanonical and canonical ensembles, introducing continuous mixed states as distributions on the manifold of quantum states. We call out the experimental consequences for a gas of qudits. We define quantum heat and work in an intrinsic way, including single-trajectory work, and reformulate thermodynamic entropy in a way that accords with classical, quantum, and information-theoretic entropies. We give both the First and Second Laws of Thermodynamics and Jarzynski’s Fluctuation Theorem. The result is a physics more transparent than conventionally available, in which the mathematical structure and physical intuitions underlying classical and quantum dynamics are seen to be closely aligned.
Wigner-Friend scenarios — in which an external agent describes quantum mechanically a laboratory in which a Friend is making a measurement — give rise to possible inconsistencies due to the ambiguous character of quantum measurements. In this work, we investigate Wigner-Friend scenarios in which the external agents can probe in a non-invasive manner the dynamics inside the laboratories. We examine probes that can be very weakly coupled to the systems measured by the Friends, or to the pointers or environments inside the laboratories. These couplings, known as Weak Measurements, are asymptotically small and do not change the outcomes obtained by the Friends nor their probabilities. Within our scheme, we show that the weakly coupled probes indicate to the external agents how to obtain consistent predictions, irrespective of the possible inconsistencies of quantum measurement theory. These non-invasive couplings could be implemented with present-day technologies.
Jan de Boer, Victor Godet, Jani Kastikainen, Esko KeskiVakkuri

quantph updates on arXiv.org
One of the key tasks in physics is to perform measurements in order to determine the state of a system. Often, measurements are aimed at determining the values of physical parameters, but one can also ask simpler questions, such as “is the system in state A or state B?”. In quantum mechanics, the latter type of measurements can be studied and optimized using the framework of quantum hypothesis testing. In many cases one can explicitly find the optimal measurement in the limit where one has simultaneous access to a large number $n$ of identical copies of the system, and estimate the expected error as $n$ becomes large. Interestingly, error estimates turn out to involve various quantum information theoretic quantities such as relative entropy, thereby giving these quantities operational meaning.
In this paper we consider the application of quantum hypothesis testing to quantum manybody systems and quantum field theory. We review some of the necessary background material, and study in some detail the situation where the two states one wants to distinguish are parametrically close. The relevant error estimates involve quantities such as the variance of relative entropy, for which we prove a new inequality. We explore the optimal measurement strategy for spin chains and twodimensional conformal field theory, focusing on the task of distinguishing reduced density matrices of subsystems. The optimal strategy turns out to be somewhat cumbersome to implement in practice, and we discuss a possible alternative strategy and the corresponding errors.
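As a concrete anchor for the single-copy question "is the system in state A or state B?": for equal priors, the optimal measurement achieves the Helstrom error probability P_e = (1 - ||ρ - σ||_1 / 2) / 2. A small numpy sketch for two parametrically close pure qubit states (an editorial illustration; the paper's many-copy error exponents involve relative-entropy quantities):

```python
import numpy as np

# Helstrom bound: minimal single-shot error probability for discriminating
# two known states rho and sigma with equal priors.
def helstrom_error(rho, sigma):
    diff = rho - sigma
    trace_norm = np.abs(np.linalg.eigvalsh(diff)).sum()  # sum of |eigenvalues|
    return 0.5 * (1.0 - 0.5 * trace_norm)

# Two parametrically close pure qubit states: |0> slightly rotated by eps.
def pure(theta):
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

eps = 0.05
p_err = helstrom_error(pure(0.0), pure(eps))
print(p_err)   # just below 1/2: nearby states are hard to tell apart in one shot
```

For these pure states ||ρ - σ||_1 = 2 sin(eps), so the error tends to 1/2 as eps → 0, which is why access to many copies, and the associated error exponents, matter.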
The Copenhagen interpretation has been the subject of much criticism, notably by de Broglie and Einstein, because it contradicts the principles of causality and realism. The aim of this essay is to study wave mechanics as an alternative to traditional quantum mechanics, in the continuity of the ideas of Louis de Broglie: the pilot-wave theory of de Broglie (where each particle is associated with a wave which guides it), de Broglie-Bohm theory, stochastic electrodynamics (where the stochastic character of particles is caused by the energy field of the fluctuating vacuum), and the analogies between quantum mechanics and hydrodynamics.
The epitome of acausal or antichronological behaviour would be to see a clock running backwards in time. In this essay we point out that this is indeed possible, but there is no problem with causality. What you see isn’t what is really happening. Locally, causality is always respected. However our observation should be cause for pause to astronomers and cosmologists, who strictly observe events occurring at very large distances or very long ago and certainly not locally. It can be that what you see isn’t what you necessarily get.
The self-dual spacetime was derived from the minisuperspace approach, based on the polymerization quantization procedure in loop quantum gravity (LQG). Its deviation from the Schwarzschild spacetime is characterized by the polymeric function $P$, purely due to the geometric quantum effects from LQG. In this paper, we consider the observational constraints imposed on $P$ by using the solar system experiments and observations. For this purpose, we calculate in detail the effects of $P$ on astronomical observations conducted in the Solar system, including the deflection angle of light by the Sun, gravitational time delay, perihelion advance, and geodetic precession. The observational constraints are derived by confronting the theoretical predictions with the most recent observations. Among these constraints, we find that the tightest one comes from the measurement of the gravitational time delay by the Cassini mission, which yields $0<P<5.5\times 10^{-6}$. In addition, we also discuss the potential constraint that can be obtained in the near future by the joint European-Japanese BepiColombo project and show that it could significantly improve the current constraints.
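For scale (an editorial sketch, not from the paper): the GR baseline that the Cassini measurement tests is the round-trip Shapiro delay, roughly Δt = (4GM/c³) ln(4 r_E r_S / b²) for a signal grazing the Sun, and the quoted bound on P constrains fractional deviations from it. The orbital radii and impact parameter below are approximate values assumed for illustration:

```python
import math

# Round-trip Shapiro time delay in GR for a signal grazing the Sun
# between Earth and a probe near Saturn (Cassini-like geometry).
GM_sun = 1.32712440018e20     # solar gravitational parameter, m^3/s^2
c = 2.99792458e8              # speed of light, m/s
r_E, r_S = 1.496e11, 1.35e12  # Earth and Saturn orbital radii, m (approximate)
b = 6.957e8                   # impact parameter ~ one solar radius, m

dt = (4 * GM_sun / c**3) * math.log(4 * r_E * r_S / b**2)
print(dt)   # of order 1e-4 s, i.e. a few hundred microseconds
```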
In this article we consider the problem of the extent to which the motion of gauge-charged matter that generates the gravitational field can be arbitrary, as well as which equations are imposed on the gauge field by the compatibility conditions of the gravitational field equations. The problem is analyzed from the point of view of the symmetry of the theory with respect to generalized gauge deformed groups, without specifying Lagrangians.
In particular, it is shown that the motion of uncharged particles along geodesics of Riemannian space is inherent in an extremely wide range of theories of gravity and is a consequence of the gauge translational invariance of these theories, provided the gravitational field equations are fulfilled. In the case of gauge-charged particles, the Lorentz force, generalized to gauge-charged matter, appears in the equations of motion as a consequence of the gauge symmetry of the theory, provided the equations of the gravitational and gauge fields are fulfilled. In addition, we found relationships between the equations for some fields that follow from the assumption that the equations for other fields are fulfilled; for example, relationships between the equations of the gravitational field and the gauge field of internal symmetry that follow from the assumption that the matter field equations are fulfilled. In particular, we obtained an identity that generalizes, to the case of an arbitrary gauge field (and in the presence of gauge-charged matter), the identity found by Hilbert for the electromagnetic field.
The article closes with an Appendix that briefly presents the main definitions and facts from the theory of generalized gauge deformed groups, the basic tool of this work.
The presence of axion-like dark matter candidates is expected to induce an oscillating magnetic field, enhanced by a ferromagnet. Limits on the electromagnetic coupling strength of axion-like particles are reported over a mass range spanning three decades.
The discussion of the quantum mechanical Wigner’s friend thought experiment has regained intensity. Recent theoretical results and experimental tests restrict the possibility of maintaining an observer-independent notion of measurement outcomes.
For a scenario of two separated but entangled observers, inequalities are derived from three fundamental assumptions. An experiment shows that these inequalities can be violated if quantum evolution is controllable on the scale of an observer.
Author(s): T. Aoyama, N. Asmussen, M. Benayoun, J. Bijnens, T. Blum, M. Bruno, I. Caprini, C.M. Carloni Calame, M. Cè, G. Colangelo, F. Curciarello, H. Czyż, I. Danilkin, M. Davier, C.T.H. Davies, M. Della Morte, S.I. Eidelman, A.X. ElKhadra, A. Gérardin, D. Giusti
In this paper we consider a new geometric approach to Madelung’s quantum hydrodynamics (QHD) based on the theory of gauge connections. Unlike previous approaches, our treatment comprises a constant curvature thereby endowing QHD with intrinsic non-zero holonomy. In the hydrodynamic context, this leads to a fluid velocity which no longer is constrained to be irrotational and allows instead for vortex filament solutions. After exploiting the Rasetti-Regge method to couple the Schrödinger equation to vortex filament dynamics, the latter is then considered as a source of geometric phase in the context of Born-Oppenheimer molecular dynamics. Similarly, we consider the Pauli equation for the motion of spin particles in electromagnetic fields and we exploit its underlying hydrodynamic picture to include vortex dynamics.
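For readers unfamiliar with Madelung's quantum hydrodynamics, the starting point (standard textbook material, not specific to this paper) is the polar decomposition of the wave function, which splits the Schrödinger equation into fluid equations:

```latex
% Madelung transform: write the wave function in polar form
\psi = \sqrt{\rho}\; e^{iS/\hbar}
% The Schrodinger equation then splits into a continuity equation
\partial_t \rho + \nabla \cdot \Bigl( \rho \, \frac{\nabla S}{m} \Bigr) = 0
% and a quantum Hamilton-Jacobi equation with the quantum potential term
\partial_t S + \frac{|\nabla S|^2}{2m} + V
  - \frac{\hbar^2}{2m} \frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}} = 0
```

The fluid velocity v = ∇S/m is irrotational wherever S is single-valued; the gauge-connection construction in the abstract is precisely what relaxes this constraint to admit vortex filaments.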
Quantum information science is an exciting, wide, rapidly progressing, cross-disciplinary field, and that very nature makes it both attractive and hard to enter. In this primer, we first provide answers to the three essential questions that any newcomer needs to know: How is quantum information represented? How is quantum information processed? How is classical information extracted from quantum states? We then introduce the most basic quantum information theoretic notions concerning entropy, sources, and channels, as well as secure communications and error correction. We conclude with examples that illustrate the power of quantum correlations. No prior knowledge of quantum mechanics is assumed.
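The primer's three questions can be answered in miniature with a single qubit; a toy numpy sketch (an editorial illustration, not taken from the primer):

```python
import numpy as np

# Represent: a normalized complex vector.  Process: a unitary gate.
# Extract: Born-rule probabilities |amplitude|^2.
ket0 = np.array([1.0, 0.0])                    # qubit prepared in |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate (unitary)
psi = H @ ket0                                 # |+> = (|0> + |1>)/sqrt(2)
probs = np.abs(psi) ** 2                       # measurement statistics
print(probs)   # [0.5 0.5]: a fair quantum coin
```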
We study the holographic map in AdS/CFT, as modeled by a quantum error correcting code with exact complementary recovery. We show that the map is determined by local conditional expectations acting on the operator algebras of the boundary/physical Hilbert space. Several existing results in the literature follow easily from this perspective. The Black Hole area law, and more generally the Ryu-Takayanagi area operator, arises from a central sum of entropies on the relative commutant. These entropies are determined in a state independent way by the conditional expectation. The conditional expectation can also be found via a minimization procedure, similar to the minimization involved in the RT formula. For a local net of algebras associated to connected boundary regions, we show the complementary recovery condition is equivalent to the existence of a standard net of inclusions — an abstraction of the mathematical structure governing QFT superselection sectors given by Longo and Rehren. For a code consisting of algebras associated to two disjoint regions of the boundary theory we impose an extra condition, dubbed dual-additivity, that gives rise to phase transitions between different entanglement wedges. Dual-additive codes naturally give rise to a new split code subspace, and an entropy bound controls which subspace and associated algebra is reconstructable. We also discuss known shortcomings of exact complementary recovery as a model of holography. For example, these codes are not able to accommodate holographic violations of additivity for overlapping regions. We comment on how approximate codes can fix these issues.
Gabriel R. Bengochea, Gabriel León, Philip Pearle, Daniel Sudarsky

quantph updates on arXiv.org
In this work we consider a wide variety of alternatives opened when applying the continuous spontaneous localization (CSL) dynamical collapse theory to the inflationary era. This exploration includes: two different approaches to deal with quantum field theory and gravitation, the identification of the collapse-generating operator and the general nature and values of the parameters of the CSL theory. All the choices connected with these issues have the potential to dramatically alter the conclusions one can draw. We also argue that the incompatibilities found in a recent paper, between the CSL parameter values and the CMB observational data, are associated with specific choices made for the extrapolation to the cosmological context of the CSL theory (as it is known to work in non-relativistic laboratory situations) which do not represent the most natural ones.
We discuss a new formalism for constructing a non-relativistic (NR) theory in a curved background. Named galilean gauge theory, it is based on gauging the global galilean symmetry. It provides a systematic algorithm for obtaining the covariant curved-spacetime generalisation of any NR theory defined in flat spacetime. The resulting background is just the Newton-Cartan manifold. The example of the NR free particle is explicitly demonstrated.
We establish radiative stability of all generalized Proca theories. While standard power-counting arguments would conclude otherwise, we find non-trivial cancellations of leading-order corrections by explicit computation of divergent one-loop diagrams up to four-point. These results are cross-checked against an effective-action-based generalized Schwinger-DeWitt method. Further, these cancellations are understood as coming from the specific structure of the theory through a decoupling-limit analysis which at the same time allows for an extension of the results to all orders.
PhilsciArchive: No conditions. Results ordered Date Deposited.
Fernández Mouján, Raimundo (2020) Greek philosophy for quantum physics. The return to the Greeks in the works of Heisenberg, Pauli and Schrödinger. [Preprint]
Remember how Kim (Philos Perspect 3:77–108, 1989, in: Heil and Mele (eds) Mental causation, Clarendon Press, Oxford, 1993b) used to argue against non-reductive physicalism to the effect that it cannot accommodate the causal efficacy of the mental? The argument was that if physicalists accept the causal closure of the physical, they are faced with an exclusion problem. In the original version of the argument, the dependence holding between the mental and the physical was cashed out in terms of supervenience. Due to the work of Fine (Philos Perspect 8:1–16, 1994) and others, we have since come to realize that modal notions are not well-suited to perform the work of properly characterizing dependence. As a consequence of this, an increasingly larger community of contemporary metaphysicians prefer to spell out mental-physical dependence in terms of a non-causal and non-reductive notion called grounding, which is intended to target a particular sort of metaphysical relation that takes us from ontologically less fundamental features of the world to that which is more fundamental. In this paper I join forces with those who think that this shift in focus is on the right track. More specifically, I will argue that the grounding physicalist can solve the exclusion problem in a way that is preferable to the supervenience-based non-reductive physicalist solution, as well as in a way that is compatible with the externalist picture of the mental.
This essay is concerned with a number of related proposals that claim there is a link between spacetime topology and quantum entanglement. I indicate the extent to which these proposals can be understood as stating a duality, and then consider two general approaches to articulating such a duality: a “state-based” approach, under which one attempts to identify relevant topological states as dual to quantum entangled states; and an “observable-based” approach, under which one attempts to identify relevant topological observables as dual to quantum entanglement observables. Both approaches are faced with issues, essentially due to the ambiguous nature of quantum entanglement, that remain to be addressed.
Many non-physicalists, including Chalmers, hold that the zombie argument succeeds in rejecting the physicalist view of consciousness. Some non-physicalists, including, again, Chalmers, hold that quantum collapse interactionism (QCI), i.e., the idea that non-physical consciousness causes collapse of the wave function in phenomena such as quantum measurement, is a viable interactionist solution to the problem of the relationship between the physical world and non-physical consciousness. In this paper, I argue that if QCI is true, the zombie argument fails. In particular, I show that if QCI is true, a zombie world physically identical to our world is impossible because there is at least one law of nature, a fundamental law of physics in particular, that exists only in the zombie world but not in our world. This shows that philosophers like Chalmers are committing an error in endorsing the zombie argument and QCI at the same time.
PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.
Author(s): David Felce and Vlatko Vedral
We propose a thermodynamic refrigeration cycle which uses indefinite causal orders to achieve nonclassical cooling. The cycle cools a cold reservoir while consuming purity in a control qubit. We first show that the application to an input state of two identical thermalizing channels of temperature T…
[Phys. Rev. Lett. 125, 070603] Published Tue Aug 11, 2020
A protocol for the reliable, efficient and precise characterization of quantum noise is reported and implemented in an architecture consisting of 14 superconducting qubits. Correlated noise within arbitrary sets of qubits can be easily detected.
While there has been much discussion about what makes some mathematical proofs more explanatory than others, and what are mathematical coincidences, in this article I explore the distinct phenomenon of mathematical facts that call for explanation. The existence of mathematical facts that call for explanation stands in tension with virtually all existing accounts of “calling for explanation”, which imply that necessary facts cannot call for explanation. In this paper I explore what theoretical revisions are needed in order to accommodate this phenomenon. One of the important upshots is that, contrary to the current consensus, low prior probability is not a necessary condition for calling for explanation. In the final section I explain how the results of this inquiry help us make progress in assessing Hartry Field’s style of reliability argument against mathematical Platonism and against robust realism in other domains of necessary facts, such as ethics.
A memorial to Prof. Hidekuni Hidekoshi, Kyoto University, Japan – an accelerator pioneer in Japan, teacher, mentor, friend, unassuming but knowing his accomplishments and worth, frugal but generous, enjoying life. Japanese contributions are given in Japanese and English; English contributions in English.
This paper deals with the Newton–Wigner position observable for Poincaré-invariant classical systems. We prove an existence and uniqueness theorem for elementary systems that parallels the well-known Newton–Wigner theorem in the quantum context. We also discuss and justify the geometric interpretation of the Newton–Wigner position as ‘centre of spin’, already proposed by Fleming in 1965, again in the quantum context.
Shenglong Xu, Leonard Susskind, Yuan Su, Brian Swingle

quant-ph updates on arXiv.org
We study a sparse version of the Sachdev-Ye-Kitaev (SYK) model defined on random hypergraphs constructed either by a random pruning procedure or by randomly sampling regular hypergraphs. The resulting model has a new parameter, $k$, defined as the ratio of the number of terms in the Hamiltonian to the number of degrees of freedom, with the sparse limit corresponding to the thermodynamic limit at fixed $k$. We argue that this sparse SYK model recovers the interesting global physics of ordinary SYK even when $k$ is of order unity. In particular, at low temperature the model exhibits a gravitational sector which is maximally chaotic. Our argument proceeds by constructing a path integral for the sparse model which reproduces the conventional SYK path integral plus gapped fluctuations. The sparsity of the model permits larger-scale numerical calculations than previously possible, the results of which are consistent with the path integral analysis. Additionally, we show that the sparsity of the model considerably reduces the cost of quantum simulation algorithms. This makes the sparse SYK model the most efficient currently known route to simulate a holographic model of quantum gravity. We also define and study a sparse supersymmetric SYK model, with similar conclusions to the non-supersymmetric case. Looking forward, we argue that the class of models considered here constitutes an interesting and relatively unexplored sparse frontier in quantum many-body physics.
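The random-pruning construction described above is simple to state concretely. The following is a minimal illustrative sketch (the function name, default parameters, and Gaussian coupling normalization are our own assumptions, not the paper's code): start from all 4-body terms on $N$ Majorana fermions and keep each with a probability chosen so that roughly $kN$ terms survive.

```python
import itertools
import random

def sparse_syk_terms(n_majorana, k, seed=0):
    """Randomly prune the full list of 4-body SYK terms so that the
    expected number of kept terms is ~ k * n_majorana, where k is the
    ratio of Hamiltonian terms to degrees of freedom."""
    rng = random.Random(seed)
    full = list(itertools.combinations(range(n_majorana), 4))
    p_keep = min(1.0, k * n_majorana / len(full))
    # each kept term carries an (illustrative) Gaussian coupling
    return [(term, rng.gauss(0.0, 1.0)) for term in full if rng.random() < p_keep]

terms = sparse_syk_terms(n_majorana=20, k=4)
print(len(terms))  # of order k * N = 80, versus 4845 terms in the dense model
```

The dense model corresponds to keeping every term; the sparse limit keeps a term count linear in the number of Majorana fermions, which is what reduces the simulation cost.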
Everett’s relative-state construction in quantum theory has never been satisfactorily expressed in the Heisenberg picture. What one might have expected to be a straightforward process was impeded by conceptual and technical problems that we solve here. The result is a construction which, unlike Everett’s own in the Schrödinger picture, makes manifest the locality of Everettian multiplicity, its inherently approximative nature, and its origin in certain kinds of entanglement and locally inaccessible information. Our construction also allows us to give a more precise definition of an Everett ‘universe’ (which is fully quantum, not quasiclassical), and we compare the Everettian decomposition of a quantum state with the foliation of a spacetime.
The use of fractional momentum operators and fractional kinetic energy to model linear damping in dissipative systems, such as resistive circuits and spring-mass ensembles, is extended to a quantum-mechanical formalism. Three important associated one-dimensional problems are solved: the free particle, the infinite potential well, and the harmonic potential. The resulting wave equations reproduce the same type of second-order ODE observed in classical dissipative systems and produce quantized energy levels. In the infinite potential well, a zero-point energy emerges, which can be fitted to the rest energy of the particle described by special relativity via the relationship $E_r=mc^2$. In the harmonic potential, new fractional creation and destruction operators are introduced to solve the problem in the energy basis. The energy eigenvalues found differ from those reported in earlier approaches to the quantum damped-oscillator problem. In this case, a direct relationship is obtained between the relativistic rest energy of the particle and the expected value of the fractional kinetic energy in the ground state. We conclude that there exists a relationship between fractional kinetic energy and special-relativistic energies that remains unclear and needs further exploration, but also that the current way of transforming fractional momentum operators to the position basis yields non-observable imaginary momentum quantities, so a correction to this transformation needs to be explored.
We study discrete Lorentzian spectral geometry by investigating to what extent causal sets can be identified through a set of geometric invariants such as spectra. We build on previous work where it was shown that the spectra of certain operators derived from the causal matrix possess considerable but not complete power to distinguish causal sets. We find two especially successful methods for classifying causal sets and computationally test them for all causal sets of up to $9$ elements. One of the spectral geometric methods that we study involves holding a given causal set fixed and collecting a growing set of its geometric invariants, such as spectra (including the spectra of the commutator of certain operators). The second method involves obtaining a limited set of geometric invariants for a given causal set while also collecting these geometric invariants for small 'perturbations' of the causal set, a novel method that may also be useful in other areas of spectral geometry. We show that with a suitably chosen set of geometric invariants, this new method fully resolves the causal sets we considered. Concretely, we consider for this purpose perturbations of the original causal set that are formed by adding one element and a link. We discuss potential applications to the path integral in quantum gravity.
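To make the idea of a spectral invariant concrete, the sketch below computes the eigenvalues of $C C^{T}$ built from the causal matrix of two small causal sets. The particular operator chosen here is ours, purely for illustration; the paper studies a specific family of operators and their commutators.

```python
import numpy as np

def causal_matrix(relations, n):
    """C[i, j] = 1 iff element i causally precedes element j."""
    C = np.zeros((n, n))
    for i, j in relations:
        C[i, j] = 1.0
    return C

def spectral_fingerprint(C):
    """Sorted spectrum of the symmetric operator C C^T, usable as a
    (partial) geometric invariant of the causal set."""
    return np.sort(np.linalg.eigvalsh(C @ C.T))

chain = causal_matrix([(0, 1), (0, 2), (1, 2)], 3)  # 3-element total order
antichain = causal_matrix([], 3)                    # 3 mutually unrelated elements
print(spectral_fingerprint(chain))
print(spectral_fingerprint(antichain))  # all zeros: distinct from the chain
```

Distinct spectra certify non-isomorphism, but as the abstract notes, a single invariant like this is not complete: some non-isomorphic causal sets share it, which is what motivates collecting growing sets of invariants or spectra of perturbed causal sets.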
The response of a gravitating object to an external tidal field is encoded in its Love numbers, which identically vanish for classical black holes (BHs). Here we show, using standard time-independent quantum perturbation theory, that for a quantum BH, generically, the Love numbers are non-vanishing and negative, and that their magnitude depends on the lowest-lying levels of the quantum spectrum of the BH. We calculate the quadrupolar electric quantum Love number of non-rotating BHs and show that it depends most strongly on the first excited level of the quantum BH. We then compare our results to the same Love number of exotic ultracompact objects and to that of classical compact stars and highlight their different parametric dependence. Finally, we discuss the detectability of the quadrupolar quantum Love number in future precision gravitational-wave observations and show that, under favourable circumstances, its magnitude is large enough to imprint an observable signature on the gravitational waves emitted during the inspiral phase of two moderately spinning BHs.
PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.
Author(s): Mirjam Weilenmann and Roger Colbeck
Self-testing usually refers to the task of taking a given set of observed correlations that are assumed to arise via a process that is accurately described by quantum theory, and trying to infer the quantum state and measurements. In other words, it is concerned with the question of whether we can te…
[Phys. Rev. Lett. 125, 060406] Published Thu Aug 06, 2020
Philsci-Archive: No conditions. Results ordered Date Deposited.
de Ronde, Christian (2020) Quantum Theory Needs No ‘Interpretation’ But ‘Theoretical FormalConceptual Unity’ (Or: Escaping Adán Cabello’s “Map of Madness” With the Help of David Deutsch’s Explanations). [Preprint]
Every clock is a physical system and thereby ultimately quantum. A naturally arising question is how to describe time evolution relative to quantum clocks and, specifically, how the dynamics relative to different quantum clocks are related. This is a pressing issue in view of the multiple-choice problem of time in quantum gravity, which posits that there is no distinguished choice of internal clock in generic general relativistic systems and that different choices lead to inequivalent quantum theories. Exploiting a recent approach to switching quantum reference systems (arXiv:1809.00556, arXiv:1809.05093), we exhibit a systematic method for switching between different clock choices in the quantum theory. We illustrate it by means of the parametrized particle, which, like gravity, features a Hamiltonian constraint. We explicitly switch between the quantum evolution relative to the non-relativistic time variable and that relative to the particle’s position, which requires carefully regularizing the zero modes in the so-called time-of-arrival observable. While this toy model is simple, our approach is general and directly amenable to quantum cosmology. It proceeds by systematically linking the reduced quantum theories relative to different clock choices via the clock-choice-neutral Dirac-quantized theory, in analogy to coordinate changes on a manifold. This method suggests a new perspective on the multiple-choice problem, indicating that it is rather a multiple-choice feature of a complete relational quantum theory, taken as the conjunction of the Dirac-quantized and quantum-deparametrized theories. Precisely this conjunction permits one to consistently switch between different temporal reference systems, which is a prerequisite for a quantum notion of general covariance. Finally, we show that quantum uncertainties lead to discontinuities in the relational dynamics when switching clocks.
Recently, new holographic models of black hole evaporation have given fresh insights into the information paradox [arXiv:1905.08255, arXiv:1905.08762, arXiv:1908.10996]. In these models, the black hole evaporates into an auxiliary bath space after a quantum quench, wherein the holographic theory and the bath are joined. One particularly exciting development is the appearance of “ER=EPR”-like wormholes in the (doubly) holographic model of [arXiv:1908.10996]. At late times, the entanglement wedge of the bath includes the interior of the black hole. In this paper, we employ both numerical and analytic methods to study how information about the black hole interior is encoded in the Hawking radiation. In particular, we systematically excise intervals of the bath from the system and study the corresponding Page transition. Repeating this process ad infinitum, we end up with a fractal structure on which the black hole interior is encoded, implementing the uberholography protocol of [arXiv:1612.00017].
In a September 1976 PRL, Eguchi and Freund considered two topological invariants: the Pontryagin number $P \sim \int d^4x \sqrt{g}\,R^* R$ and the Euler number $\chi \sim \int d^4x \sqrt{g}\,R^* R^*$, and posed the question: to what anomalies do they contribute? They found that $P$ appears in the integrated divergence of the axial fermion number current, thus providing a novel topological interpretation of the anomaly found by Kimura in 1969 and Delbourgo and Salam in 1972. However, they found no analogous role for $\chi$. This provoked my interest and, drawing on my April 1976 paper with Deser and Isham on gravitational Weyl anomalies, I was able to show that for Conformal Field Theories the trace of the stress tensor depends on just two constants: \[ g^{\mu\nu}\langle T_{\mu\nu}\rangle=\frac{1}{(4\pi)^2}(cF - aG),\] where $F$ is the square of the Weyl tensor and $\int d^4x\sqrt{g}\, G/(4\pi)^2$ is the Euler number. For free CFTs with $N_s$ massless fields of spin $s$, \[ 720c=6N_0 + 18N_{1/2} + 72 N_1, \qquad 720a=2N_0 + 11N_{1/2} + 124N_1. \]
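The two counting formulas above are easy to evaluate for any free field content; a quick arithmetic check in exact rationals (the function name is ours, for illustration):

```python
from fractions import Fraction

def central_charges(n0, n_half, n1):
    """Central charges of a free CFT from the quoted counting formulas:
    720c = 6 N0 + 18 N_{1/2} + 72 N1  and  720a = 2 N0 + 11 N_{1/2} + 124 N1."""
    c = Fraction(6 * n0 + 18 * n_half + 72 * n1, 720)
    a = Fraction(2 * n0 + 11 * n_half + 124 * n1, 720)
    return c, a

# a single free Maxwell field (N1 = 1):
c, a = central_charges(0, 0, 1)
print(c, a)  # 1/10 31/180
```

Note that for a free Maxwell field $c \neq a$, so the two anomaly coefficients are genuinely independent pieces of data about the CFT.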
This is the first of two papers which attempt to comprehensively analyse superdeterministic hidden-variables models of Bell correlations. We first give an overview of superdeterminism and discuss various criticisms of it raised in the literature. We argue that the most common criticism, the violation of 'free will', is incorrect. We take up Bell’s intuitive criticism that these models are 'conspiratorial'. To develop this further, we introduce non-equilibrium extensions of superdeterministic models. We show that the measurement statistics of these extended models depend on the physical system used to determine the measurement settings. This suggests a fine-tuning in order to eliminate this dependence from experimental observation. We also study the signalling properties of these extended models. We show that although they generally violate the formal no-signalling constraints, this violation cannot be equated to an actual signal. We therefore suggest that the so-called no-signalling constraints be more appropriately named the marginal-independence constraints. We discuss the mechanism by which marginal independence is violated in superdeterministic models. Lastly, we consider a hypothetical scenario where two experimenters use the apparent signalling of a superdeterministic model to communicate with each other. This scenario suggests another conspiratorial feature peculiar to superdeterminism. These suggestions are quantitatively developed in the second paper.
There is a long tradition of thinking of thermodynamics, not as a theory of fundamental physics (or even a candidate theory of fundamental physics), but as a theory of how manipulations of a physical system may be used to obtain desired effects, such as mechanical work. On this view, the basic concepts of thermodynamics, heat and work, and with them, the concept of entropy, are relative to a class of envisaged manipulations. This view has been dismissed by many philosophers of physics, in my opinion too hastily. This paper is a sketch and defense of a science of manipulations and their effects on physical systems. This is, I claim, the best way to make sense of thermodynamics as it is found in textbooks and as it is practiced. I call this science thermo-dynamics (with hyphen), or $\Theta \Delta^{cs}$, for short, to highlight that it may be different from the science of thermodynamics, as the reader conceives it. Even if one is not convinced that it is the best way to make sense of thermodynamics as it is practiced, it should be noncontroversial that $\Theta \Delta^{cs}$ is a legitimate science. An upshot of the discussion is a clarification of the roles of the Gibbs and von Neumann entropies. Given the definition of statistical thermodynamic entropy, it can be proven that, under the assumption of availability of thermodynamically reversible processes, these functions are the unique (up to an additive constant) functions that represent thermodynamic entropy. Light is also shed on the use of coarse-grained entropies.
Landauer’s principle is, roughly, the principle that there is an entropic cost associated with implementation of logically irreversible operations. Though widely accepted in the literature on the thermodynamics of computation, it has been the subject of considerable dispute in the philosophical literature. Both the cogency of proofs of the principle and its relevance, should it be true, have been questioned. In particular, it has been argued that microscale fluctuations entail dissipation that always greatly exceeds the Landauer bound. In this article, Landauer’s principle is treated within statistical mechanics, and a proof is given that neither relies on neglect of fluctuations nor assumes the availability of thermodynamically reversible processes. In addition, it is argued that microscale fluctuations are no obstacle to approximating thermodynamic reversibility as closely as one would like.
We prove that superdeterministic models of quantum mechanics are conspiratorial in a mathematically well-defined sense, by further development of the ideas presented in a previous article $\mathcal{A}$. We consider a Bell scenario where, in each run and at each wing, the experimenter chooses one of $N$ devices to determine the local measurement setting. We prove, without assuming any features of quantum statistics, that superdeterministic models of this scenario must have a finely tuned distribution of hidden variables. Specifically, fine-tuning is required so that the measurement statistics depend on the measurement settings but not on the details of how the settings are chosen. We quantify this as the overhead fine-tuning $F$ of the model, and show that $F > 0$ (corresponding to 'fine-tuned') for any $N >1$. The notion of fine-tuning assumes that arbitrary ('non-equilibrium') hidden-variables distributions are possible in principle. We also show how to quantify superdeterministic conspiracy without using non-equilibrium distributions. This second approach is based on the fact that superdeterministic correlations can mimic actual signalling. We argue that an analogous situation occurs in equilibrium for a superdeterministic model of our scenario. We quantify the conspiracy by showing that an appropriately defined formal entropy for superdeterministic models of our scenario spontaneously decreases with time. In both approaches, superdeterministic models become arbitrarily conspiratorial as $N \to \infty$. We thus quantitatively confirm Bell’s intuition that superdeterministic models are conspiratorial.
PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.
Author(s): Chiara Marletto and Vlatko Vedral
In the Aharonov-Bohm (AB) effect, a superposed charge acquires a detectable phase by enclosing an infinite solenoid, in a region where the solenoid’s electric and magnetic fields are zero. Its generation seems therefore explainable only by the local action of gauge-dependent potentials, not of gauge…
[Phys. Rev. Lett. 125, 040401] Published Wed Jul 22, 2020
Einstein claimed that the fundamental dynamical insight of special relativity was the equivalence of mass and energy. I disagree. Not only are mass and energy not equivalent (whatever exactly that means), but talk of such equivalence obscures the real dynamical insight of special relativity, which concerns the nature of 4-forces and interactions more generally. In this paper I present and defend a new ontology of special relativistic particle dynamics that makes this insight perspicuous, and I explain how alleged cases of mass–energy conversion can be accommodated within that ontology.
An adaptive heterodyne technique with a Josephson parametric amplifier detector allows a high-precision single-shot canonical phase measurement on a one-photon wave packet, complementing near-ideal measurements of photon number or field amplitude.
Weekly Papers on Quantum Foundations (29)
https://ijqf.org/archives/6035
Sat, 18 Jul 2020 02:06:25 +0000
Bohm developed Bohmian mechanics (BM), in which the Schrödinger equation is transformed into two differential equations: a continuity equation and an equation of motion similar to the Newtonian equation of motion. This transformation can be executed both for single-particle systems and for many-particle systems. Later, Kuzmenkov and Maksimov used basic quantum mechanics to derive many-particle quantum hydrodynamics (MPQHD), including one differential equation for the mass balance and two differential equations for the momentum balance, and we extended their analysis in previous work (K. Renziehausen, I. Barth, Prog. Theor. Exp. Phys. 2018:013A05, 2018) to the case where the particle ensemble consists of different particle sorts. The purpose of this paper is to show how the differential equations of MPQHD can be derived for such a particle ensemble with the differential equations of BM as a starting point. Moreover, our discussion clarifies that the differential equations of MPQHD are more suitable for an analysis of many-particle systems than the differential equations of BM, because the differential equations of MPQHD depend only on a single position vector, while the differential equations of BM depend on the complete set of all particle coordinates.
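For a single particle, the transformation referred to above is the standard polar (Madelung/Bohm) decomposition: writing $\psi = R\,e^{iS/\hbar}$ with probability density $\rho = R^2$ splits the Schrödinger equation into two real equations,

```latex
% continuity equation for \rho = R^2
\frac{\partial \rho}{\partial t} + \nabla \cdot \left( \rho \, \frac{\nabla S}{m} \right) = 0 ,
% quantum Hamilton--Jacobi equation with quantum potential Q
\frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V + Q = 0 ,
\qquad Q = -\frac{\hbar^2}{2m}\,\frac{\nabla^2 R}{R} .
```

Taking the gradient of the second equation yields the Newton-like equation of motion for trajectories obeying $m\dot{\mathbf{x}} = \nabla S$, with the quantum potential $Q$ supplying the non-classical force.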
This paper argues that the path integral formulation of quantum mechanics suggests a form of holism for which the whole (total ensemble of paths) has properties that are not strongly reducible to the properties of the parts (the single trajectories). Feynman’s sum over histories calculates the probability amplitude of a particle moving within a boundary by summing over all the possible trajectories that the particle can undertake. These trajectories and their individual probability amplitudes are thus necessary in calculating the total amplitude. However, not all possible trajectories are differentiable, thus suggesting that they are not physical possibilities, but only mathematical entities. It follows that if the possible differentiable trajectories are taken to be part of the physical system, they are not sufficient to calculate the total probability amplitude. The conclusion is that the total ensemble is weakly nonsupervenient upon the physically possible trajectories.
We propose a Quantum Field Theory description of beams on a Mach–Zehnder interferometer and apply the method to describe Interaction Free Measurements (IFMs), concluding that there is a change of momentum of the fields in IFMs. Analysing the factors involved in the probability of emission of lowenergy photons, we argue that they do not yield meaningful contributions to the probabilities of the IFMs.
An analysis is presented of the possible existence of a second anomalous dipole moment of Dirac’s particle, next to the one associated with the angular momentum. It includes a discussion of why, in spite of his own derivation, Dirac doubted its relevance. It is shown why it has since been overlooked and why it has vanished from leading textbooks. A critical survey is given of the reasons for its rejection, including the failure of attempts to measure it and the perceived violations of time-reversal symmetry and charge–parity symmetry. It is emphasized that the anomalous electric dipole moment of the point-like electron (AEDM) is fundamentally different from the quantum-field-type electric dipole moment of an electron (eEDM) as defined in the standard model of particle physics. The analysis has resulted in the identification of a third type of Dirac particle, next to the electron type and the Majorana particle. It is shown that, unlike in the case of the electron type, its second anomalous dipole moment is real-valued and is therefore subject to polarization in a scalar potential field. Examples are given of a possible impact in the nuclear domain and in the gravitational domain.
Measurements are shown to be processes designed to return figures: they are effective. This effectiveness allows them to be formalized as Turing machines, which can be described using computability theory. Inspired by the halting problem, we derive limitations on measurement procedures: procedures that verify whether a quantity has been measured cannot work in every case.
The formalism of general probabilistic theories provides a universal paradigm that is suitable for describing various physical systems, including classical and quantum ones as particular cases. Contrary to the usual no-restriction hypothesis, the set of accessible meters within a given theory can be limited for different reasons, and this raises the question of which restrictions on meters are operationally relevant. We argue that all operational restrictions must be closed under simulation, where the simulation scheme involves mixing and classical post-processing of meters. We distinguish three classes of such operational restrictions: restrictions on meters originating from restrictions on effects; restrictions on meters that do not restrict the set of effects in any way; and all other restrictions. We fully characterize the first class of restrictions and discuss its connection to convex effect subalgebras. We show that the restrictions belonging to the second class can impose severe physical limitations despite the fact that all effects are accessible, which takes place, e.g., in the unambiguous discrimination of pure quantum states via effectively dichotomic meters. We further demonstrate that there are physically meaningful restrictions that fall into the third class. The presented study of operational restrictions provides a better understanding of how accessible measurements modify general probabilistic theories, and quantum theory in particular.
In a recent manuscript, Gelman & Yao (2020) claim that “the usual rules of conditional probability fail in the quantum realm” and purport to support that statement with the example of a quantum doubleslit experiment. The present note recalls some relevant literature in quantum theory and shows that (i) Gelman & Yao’s statement is false; in fact, their quantum example confirms the rules of probability theory; (ii) the particular inequality found in the quantum example can be shown to appear also in very nonquantum examples, such as drawing from an urn; thus there is nothing peculiar to quantum theory in this matter. A couple of wrong or imprecise statements about quantum theory in the cited manuscript are also corrected.
We demonstrate the power of the black hole mass gap as a novel probe of fundamental physics. New light particles that couple to the Standard Model can act as an additional source of energy loss in the cores of population-III stars, dramatically altering their evolution. We investigate the effects of two paradigmatic weakly coupled, low-mass particles, axions and hidden photons, and find that the pulsational pair instability, which causes a substantial amount of mass loss, is suppressed. As a result, it is possible to form black holes of $72\,{\rm M}_\odot$ or heavier, deep inside the black hole mass gap predicted by the Standard Model. The upper edge of the mass gap is raised to $>130\,{\rm M}_\odot$, implying that heavier black holes, anticipated to be observed after LIGO’s sensitivity is upgraded, would also be impacted. In contrast, thermally produced heavy particles would remain in the core, leading to the tantalizing possibility that they drive a new instability akin to the electron-positron pair instability. We investigate this effect analytically and find that stars that avoid the electron-positron pair instability could experience this new instability. We discuss our results in light of current and upcoming gravitational-wave interferometer detections of binary black hole mergers.
Time at the Planck scale ($\sim 10^{-44}~\mathrm{s}$) is an unexplored physical regime. It is widely believed that probing the Planck time will long remain an impossible task. Yet, we propose an experiment to test the discreteness of time at the Planck scale and show that it is not far removed from current technological capabilities.
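For reference, the scale quoted above is the Planck time, $t_P = \sqrt{\hbar G / c^5}$; a one-line numerical check (CODATA constants hard-coded):

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J s
G = 6.67430e-11         # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m s^-1

t_planck = math.sqrt(hbar * G / c**5)
print(f"{t_planck:.3e} s")  # ~5.391e-44 s
```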
This article is an introduction to two currently very active research programs, the Conformal Bootstrap and Scattering Amplitudes. Rather than attempting full surveys, the emphasis is on common ideas and methods shared by these two seemingly very different programs. In both fields, mathematical and physical constraints are placed directly on the physical observables in order to explore the landscape of possible consistent quantum field theories. We give explicit examples from both programs: the reader can expect to encounter boiling water, ferromagnets, pion scattering, and emergent symmetries on this journey into the landscape of local relativistic quantum field theories. The first part is written for a general physics audience. The second part includes further details, including a new on-shell bottom-up reconstruction of the $\mathbb{CP}^1$ model with the Fubini-Study metric arising from resummation of the $n$-point interaction terms derived from amplitudes.
One of the most striking features of the epistemological situation of Quantum Mechanics is the number of interpretations and the many schools of thought, with no consensus on the way to understand the theory. In this article, I introduce a distinction between orthodox interpretations and heterodox interpretations of Quantum Mechanics: the orthodox interpretations preserve all the quantum principles, while the heterodox interpretations replace at least one of them. Then, I argue that we have strong empirical and epistemological reasons to prefer orthodox interpretations to heterodox interpretations. The first argument is that all the experiments on the foundations of Quantum Mechanics give a high degree of corroboration to the quantum principles and, consequently, to the orthodox interpretations. The second argument is that scientific progress needs a consensus: this consensus is impossible with the heterodox interpretations, while it is possible with the orthodox interpretations. Giving preference to the orthodox interpretations is a reasonable position which could preserve both a consensus on quantum principles and a plurality of views on Quantum Mechanics.
PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.
Author(s): Jad C. Halimeh and Philipp Hauke
Currently, there are intense experimental efforts to realize lattice gauge theories in quantum simulators. Except for specific models, however, practical quantum simulators can never be fine-tuned to perfect local gauge invariance. There is thus a strong need for a rigorous understanding of gauge-in…
[Phys. Rev. Lett. 125, 030503] Published Wed Jul 15, 2020
PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.
Author(s): Bingtian Ye, Francisco Machado, Christopher David White, Roger S. K. Mong, and Norman Y. Yao
A tremendous amount of recent attention has focused on characterizing the dynamical properties of periodically driven many-body systems. Here, we use a novel numerical tool termed “density matrix truncation” (DMT) to investigate the late-time dynamics of large-scale Floquet systems. We find that DMT…
[Phys. Rev. Lett. 125, 030601] Published Wed Jul 15, 2020
Some philosophers argue that non-presentist A-theories, i.e. the views that the tenses (past, present, and future) are objective features and that not only present things exist, problematically imply that we cannot know that this moment is present. The problem is usually presented as arising from the combination of the A-theoretic ideology of a privileged presentness and a non-presentist ontology. The goal of this essay is to show that the epistemic problem can be rephrased as a pessimistic induction. By doing so, I will show that the epistemic problem, in fact, stems from the A-theoretic ideology alone. Hence, once it is properly presented, the epistemic problem presents a serious threat to all A-theories.
Philsci-Archive: No conditions. Results ordered Date Deposited.
Sheridan, Erin (2020) A Man Misunderstood: Von Neumann did not claim that his entropy corresponds to the phenomenological thermodynamic entropy. [Preprint]
Weekly Papers on Quantum Foundations (28)
https://ijqf.org/archives/6032
Sat, 11 Jul 2020 02:36:33 +0000
Photonic de Broglie waves (PBWs) via two-mode entangled photon-pair interactions on a beam splitter show a pure quantum feature which cannot be obtained by classical means [1-4]. Although PBWs have been intensively studied for quantum metrology [5-13] and quantum sensing [14-25] over the last several decades, their implementation has been limited due to difficulties of high-order NOON state generation [4]. Recently a coherence version of PBWs, the so-called coherence de Broglie waves (CBWs), has been proposed in a pure classical regime of an asymmetrically coupled Mach-Zehnder interferometer (MZI) [26]. Unlike PBWs, the quantumness of CBWs originates from the cascaded quantum superposition of the coupled MZI. Here, the first observation of CBWs is presented in a pure classical regime, and its potential applications in coherence quantum metrology are discussed as a way to overcome conventional PBWs limited by higher-order entangled photons. To understand the quantum-superposition-based non-classical features in CBWs, various violation tests are also performed, where asymmetrical phase coupling is the key parameter for CBWs.
Logical inference leads to one of the major interpretations of probability theory, called the logical interpretation, in which probability is seen as a measure of the plausibility of a logical statement under incomplete information. In this paper, assuming that our usual inference procedure makes sense for every set of logical propositions represented in terms of commuting projectors on a given Hilbert space, we extend the logical interpretation to quantum mechanics and derive the Born rule. Our result implies that, from an epistemological viewpoint, we can regard quantum mechanics as a natural extension of classical probability theory.
We study stabilizer quantum error-correcting codes (QECC) generated under hybrid dynamics of local Clifford unitaries and local Pauli measurements in one dimension. Building upon 1) a general formula relating the error-susceptibility of a subregion to its entanglement properties, and 2) a previously established mapping between entanglement entropies and domain wall free energies of an underlying spin model, we propose a statistical mechanical description of the QECC in terms of “entanglement domain walls”. Free energies of such domain walls generically feature a leading volume law term coming from its “surface energy”, and a sub-volume law correction coming from thermodynamic entropies of its transverse fluctuations. These are most easily accounted for by capillary-wave theory of liquid-gas interfaces, which we use as an illustrative tool. We show that the information-theoretic decoupling criterion corresponds to a geometric decoupling of domain walls, which further leads to the identification of the “contiguous code distance” of the QECC as the crossover length scale at which the energy and entropy of the domain wall are comparable. The contiguous code distance thus diverges with the system size as the subleading entropic term of the free energy, protecting a finite code rate against local undetectable errors. We support these correspondences with numerical evidence, where we find capillary-wave theory describes many qualitative features of the QECC; we also discuss when and why it fails to do so.
Quantum mechanics in the Wigner–von Neumann interpretation is presented. This is characterized by 1) a quantum dualism between matter and consciousness unified within an informational neutral monism, 2) a quantum perspectivism which is extended to a complementarity between the Copenhagen interpretation and the many-worlds formalism, 3) a psychophysical causal closure akin to Leibniz's parallelism and 4) a quantum solipsism, i.e. a reality in which classical states are only potentially existing until a conscious observation is made.
In Einstein’s general relativity, gravity is mediated by a massless metric field. The extension of general relativity to consistently include a mass for the graviton has profound implications for gravitation and cosmology. Salient features of various massive gravity theories can be captured by Galileon models, the simplest of which is the cubic Galileon. The presence of the Galileon field leads to additional gravitational radiation in binary pulsars where the Vainshtein mechanism is less suppressed than its fifth-force counterpart, which deserves a detailed confrontation with observations. We prudently choose fourteen well-timed binary pulsars, and from their intrinsic orbital decay rates we put a new bound on the graviton mass, $m_g \lesssim 2 \times 10^{-28}\,{\rm eV}/c^2$ at the 95% confidence level, assuming a flat prior on $\ln m_g$. It is equivalent to a bound on the graviton Compton wavelength $\lambda_g \gtrsim 7 \times 10^{21}\,{\rm m}$. Furthermore, we extensively simulate times of arrival for pulsars in orbit around stellar-mass black holes and the supermassive black hole at the Galactic center, and investigate their prospects in probing the cubic Galileon theory in the near future.
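As a quick consistency check on the two quoted bounds (my sketch, not part of the paper's analysis), the Compton wavelength implied by the mass bound follows from $\lambda_g = h/(m_g c)$:

```python
# Sanity check: Compton wavelength implied by the graviton mass bound
# m_g <~ 2e-28 eV/c^2, using lambda = h / (m c) = (h c) / (m c^2).
h_c_eV_m = 1.2398e-6     # h*c in eV*m
m_g_eV = 2e-28           # graviton mass bound in eV/c^2
lam = h_c_eV_m / m_g_eV  # Compton wavelength in meters
print(f"lambda_g ~ {lam:.1e} m")  # ~6e21 m, consistent with the quoted ~7e21 m
```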
We explore the interplay of matter with quantum gravity with a preferred frame to highlight that the matter sector cannot be protected from the symmetry-breaking effects in the gravitational sector. Focusing on Abelian gauge fields, we show that quantum gravitational radiative corrections induce Lorentz-invariance-violating couplings for the Abelian gauge field. In particular, we discuss how such a mechanism could result in the possibility to translate observational constraints on Lorentz violation in the matter sector into strong constraints on the Lorentz-violating gravitational couplings.
We develop a formalism to compute the gravitational multipole moments and ratios of moments of non-extremal and of supersymmetric black holes in four dimensions, as well as of horizonless microstate geometries of the latter. For supersymmetric and for Kerr black holes many of these multipole moments vanish, and their dimensionless ratios are ill-defined. We present two methods to compute these dimensionless ratios, which for certain supersymmetric black holes agree spectacularly. We also compute these dimensionless ratios for the Kerr solution. Our methods allow us to calculate an infinite number of hitherto unknown parameters of Kerr black holes, giving us a new window into their physics.
PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.
Author(s): Iliya Esin, Alessandro Romito, and Yuval Gefen
A new measurement protocol is defined for a Mach-Zehnder geometry, yielding a finite signal when quantum interference is present but vanishing for classical waves or particles.
[Phys. Rev. Lett. 125, 020405] Published Fri Jul 10, 2020
An analysis is presented of the possible existence of a second anomalous dipole moment of Dirac’s particle, next to the one associated with the angular momentum. It includes a discussion of why, in spite of his own derivation, Dirac doubted its relevance. It is shown why it has since been overlooked and why it has vanished from leading textbooks. A critical survey is given of the reasons for its rejection, including the failure of attempts to measure it and the perceived violations of time-reversal symmetry and charge–parity symmetry. It is emphasized that the anomalous electric dipole moment of the point-like electron (AEDM) is fundamentally different from the quantum-field-type electric dipole moment of an electron (eEDM) as defined in the standard model of particle physics. The analysis has resulted in the identification of a third type of Dirac particle, next to the electron type and the Majorana particle. It is shown that, unlike in the case of the electron type, its second anomalous dipole moment is real-valued and is therefore subject to polarization in a scalar potential field. Examples are given of its possible impact in the nuclear domain and in the gravitational domain.
We show that in the presence of the torsion tensor \(S^k_{ij}\), the quantum commutation relation for the four-momentum, traced over spinor indices, is given by \([p_i,p_j]=2i\hbar S^k_{ij}p_k\). In the Einstein–Cartan theory of gravity, in which torsion is coupled to the spin of fermions, this relation in a coordinate frame reduces to a commutation relation of noncommutative momentum space, \([p_i,p_j]=i\epsilon _{ijk}Up^3 p_k\), where U is a constant on the order of the squared inverse of the Planck mass. We propose that this relation replaces the integration over momentum space in Feynman diagrams with a summation over the discrete momentum eigenvalues. We derive a prescription for this summation that agrees with convergent integrals: \(\int \frac{d^4p}{(p^2+\varDelta )^s}\rightarrow 4\pi U^{s-2}\sum _{l=1}^\infty \int _0^{\pi /2} d\phi \frac{\sin ^4\phi \,n^{s-3}}{[\sin \phi +U\varDelta n]^s}\), where \(n=\sqrt{l(l+1)}\) and \(\varDelta \) does not depend on p. We show that this prescription regularizes ultraviolet-divergent integrals in loop diagrams. We extend this prescription to tensor integrals. We derive a finite, gauge-invariant vacuum polarization tensor and a finite running coupling. Including loops from all charged fermions, we find a finite value for the bare electric charge of an electron: \(\approx 1.22\,e\). This torsional regularization may therefore provide a realistic, physical mechanism for eliminating infinities in quantum field theory and making renormalization finite.
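As a numerical illustration (my construction, not from the paper; the parameter values are arbitrary), one can check that the summation prescription reproduces a convergent continuum integral for small U. For \(s=3\), \(\varDelta=1\), the continuum value of \(\int d^4p/(p^2+\varDelta)^s\) is \(\pi^2\varDelta^{2-s}/((s-1)(s-2)) = \pi^2/2\):

```python
import numpy as np

# Torsional summation prescription vs. the convergent continuum integral,
# evaluated for s = 3, Delta = 1 and a small torsion constant U.
s, Delta, U = 3, 1.0, 1e-3

l = np.arange(1, 20_001)
n = np.sqrt(l * (l + 1.0))[:, None]      # discrete momentum labels, column vector

# Midpoint rule for the phi integral over (0, pi/2)
M = 200
phi = (np.arange(M) + 0.5) * (np.pi / 2) / M
dphi = (np.pi / 2) / M

terms = np.sin(phi) ** 4 * n ** (s - 3) / (np.sin(phi) + U * Delta * n) ** s
result = 4 * np.pi * U ** (s - 2) * terms.sum() * dphi

exact = np.pi ** 2 * Delta ** (2 - s) / ((s - 1) * (s - 2))  # continuum value
print(result, exact)  # agree to within ~1% for small U
```

The truncation of the sum at \(l = 2\times10^4\) (i.e. \(U\varDelta n \approx 20\)) leaves a sub-percent tail, since the summand falls off as \(n^{-3}\).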
Philsci-Archive: No conditions. Results ordered Date Deposited.
de Waal, Elske and ten Hagen, Sjang L. (2020) The Concept of Fact in German Physics around 1900: A Comparison between Mach and Einstein. Physics in Perspective, 22 (2). pp. 55-80. ISSN 1422-6944
Mesoscale modeling is often considered merely as a practical strategy used when information on lower-scale details is lacking, or when there is a need to make models cognitively or computationally tractable. Without dismissing the importance of practical constraints for modeling choices, we argue that mesoscale models should not just be considered as abbreviations or placeholders for more “complete” models. Because many systems exhibit different behaviors at various spatial and temporal scales, bottom-up approaches are almost always doomed to fail. Mesoscale models capture aspects of multiscale systems that cannot be parameterized by simple averaging of lower-scale details. To understand the behavior of multiscale systems, it is essential to identify mesoscale parameters that “code for” lower-scale details in a way that relates phenomena intermediate between microscopic and macroscopic features. We illustrate this point using examples of modeling of multiscale systems in materials science (steel) and biology (bone), where identification of material parameters such as stiffness or strain is a central step. The examples illustrate important aspects of a so-called “middle-out” modeling strategy. Rather than attempting to model the system bottom-up, one starts at intermediate (mesoscopic) scales where systems exhibit behaviors distinct from those at the atomic and continuum scales. One then seeks to upscale and downscale to gain a more complete understanding of the multiscale system. The cases highlight how parameterization of lower-scale details not only enables tractable modeling but is also central to understanding functional and organizational features of multiscale systems.
The uncertainty on measurements, given by the Heisenberg principle, is a quantum concept usually not taken into account in General Relativity. From a cosmological point of view, several authors have wondered how such a principle can be reconciled with the Big Bang singularity, but generally not whether it may affect the reliability of cosmological measurements. In this letter, we express the Compton mass as a function of the cosmological redshift. The cosmological application of the indetermination principle unveils the difference between the values of the Hubble-Lemaître constant, \(H_0\), as measured from Cepheid estimates and from Cosmic Microwave Background radiation constraints. In conclusion, the \(H_0\) tension could be related to the effect of indetermination that arises in comparing a kinematic with a dynamic measurement.
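As a rough numerical aside (my construction; the paper's actual derivation expresses the Compton mass as a function of redshift, which is not reproduced here), one can compute the mass whose Compton wavelength equals the Hubble radius, \(m = \hbar H_0/c^2\), for the two discrepant \(H_0\) values behind the tension:

```python
# Mass scale whose (reduced) Compton wavelength equals the Hubble radius,
# m*c^2 = hbar * H0, for the two discrepant H0 measurements.
hbar_eV_s = 6.582e-16     # hbar in eV*s
Mpc_km = 3.086e19         # kilometers per megaparsec

for label, H0_km_s_Mpc in [("Cepheids (SH0ES)", 73.0), ("CMB (Planck)", 67.4)]:
    H0 = H0_km_s_Mpc / Mpc_km     # Hubble constant in 1/s
    m_c2 = hbar_eV_s * H0         # rest energy in eV
    print(f"{label}: m*c^2 ~ {m_c2:.2e} eV")  # both ~1.5e-33 eV
```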
Measurements are shown to be processes designed to return figures: they are effective procedures. This effectivity allows them to be formalized as Turing machines, which can be described employing computation theory. Inspired by the halting problem, we draw some limitations for measurement procedures: procedures that verify whether a quantity has been measured cannot work in every case.
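To illustrate the halting-style limitation (my sketch, not the paper's construction; the names `would_return_value` and `diagonal` are hypothetical), the classic diagonal argument shows why no total verifier of measurement procedures can exist: if one did, a procedure could consult the verifier on its own description and do the opposite.

```python
def would_return_value(procedure, arg):
    """Hypothetical total decider: True iff procedure(arg) returns a figure.
    No such total decider can exist, which is the point of the argument."""
    raise NotImplementedError("no total decider exists")

def diagonal(procedure):
    # If the decider claimed this call returns a figure, loop forever instead;
    # if it claimed non-termination, return a figure. Either answer is wrong
    # on diagonal(diagonal), so would_return_value cannot be total and correct.
    if would_return_value(procedure, procedure):
        while True:
            pass
    return 42
```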
Philsci-Archive: No conditions. Results ordered Date Deposited.
Stergiou, Chrysovalantis (2020) Empirical Underdetermination for Physical Theories in C* Algebraic Setting: Comments to an Arageorgis’s Argument. [Preprint]
The breakdown of superconductivity is described as a reduction in the amplitude of the order parameter or a breakdown in phase coherence of Cooper pairs. This Review Article highlights recent results that show both mechanisms may be at play simultaneously.
Philsci-Archive: No conditions. Results ordered Date Deposited.
Murgueitio Ramírez, Sebastián and Teh, Nicholas (2020) Abandoning Galileo’s Ship: The quest for nonrelational empirical significance. The British Journal for the Philosophy of Science.
A New Theorem in Biquaternion Field and Its Applications in Quantum Mechanics
https://ijqf.org/archives/6029
Wed, 08 Jul 2020 11:30:42 +0000
http://viXra.org/abs/2007.0048