Home › Forums › 2015 International Workshop on Quantum Foundations › Panel Discussion › What are the most pressing problems? and how to solve them?
Tagged: locality, psi-epistemic, psi-hybrid, psi-ontic, superposition principle
- This topic has 29 replies, 8 voices, and was last updated 5 years ago by Quantum Speculations.
March 29, 2015 at 8:48 am #2177 — editor (Keymaster)
The forthcoming Quantum Foundations Workshop 2015 aims to address the most pressing problems in the foundations of quantum theory today. What, then, are these most pressing problems, and how can we solve them? Can we solve any of them by the time we celebrate the 100th anniversary of quantum mechanics?
No doubt, different people will give different answers to these questions. But the answers of leading experts will be especially helpful to other researchers, above all to young researchers. I would therefore like to invite every member of IJQF to answer these questions and, in particular, to answer the analogous questions in his or her own research field, such as those concerning his or her favorite approach to quantum theory. I plan to publish these answers in the July issue of IJQF.
March 29, 2015 at 9:01 am #2181 — Carlo Rovelli (Participant)
There is still much confusion about quantum theory. I believe that the origin of the confusion is the mistake of taking the quantum state (the wave function) as a description of how things *are*, instead of what it really is: a way of coding quantum events and a device for predicting future ones. The mistake of interpreting the state realistically leads to the apparent mysteries of the “measurement problem” and the “collapse of the wave function”, and to misunderstanding what quantum theory is actually telling us about the world; namely, that in the real world there are fewer real events than in classical mechanics, and that there is an essential discreteness in nature: there is only one distinguishable state per Planck volume in phase space, not an infinite number as in classical theory.
Carlo Rovelli
March 29, 2015 at 9:10 am #2182
Is the macroscopic ‘classical’ world also governed by the laws of quantum mechanics? If so, how is the classical situation to be explained starting from quantum mechanics? If not, what modifications of quantum mechanics are expected on the road from microscopic to macroscopic physics?
Does decoherence help us understand and resolve the difficulties of quantum foundations, and if so how?
Does this resolve the fundamental problems of quantum mechanics, or does it simply create more problems?
+hidden variables
As in Bell inequalities or Bohmian mechanics. Do they help us understand quantum mechanics, or are they a dead end?
How does one understand information in quantum mechanical terms? Does an information theoretical approach help us understand quantum mechanics?
Are there such things as instantaneous nonlocal influences? If so, how does this fit with special relativity? If not, how shall we understand quantum correlations as in EPR-Bohm?
Is measurement essential to any formulation of quantum mechanics, or can measurements be described using fundamental quantum principles that make no reference to measurements? If the latter, what are those principles, and how can they be used to describe measurements?
What shall we do with superpositions of different measurement outcomes (Schrodinger cat states)?
What do measurement outcomes (pointer positions) tell us about the (microscopic) situation that existed before the measurement took place?
+probabilities in quantum mechanics
What is the right way to introduce and discuss probabilities in the framework of quantum mechanics? (This is connected with lots of other things:
decoherence, information, measurement, time development.)
+relativity and quantum mechanics
1. Can quantum mechanics and special relativity be combined into a single coherent theory?
2. Can quantum mechanics and general relativity be combined, possibly by modifying one or the other or both?
The proposals of Ghirardi, Pearle, etc. for modifying the Schrodinger equation by adding a stochastic term. Does this help us understand quantum mechanics or is it simply another dead end? What are its chances of being confirmed by experiments?
Is the time development of quantum systems governed by Schrodinger’s
(deterministic) equation, or is it, in whole or in part, stochastic (probabilistic)?
Does the quantum wavefunction represent reality (ontic), or only our knowledge of, or information about, reality (epistemic), or perhaps something else?
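As a concrete numerical anchor for the EPR-Bohm question above, here is the standard textbook computation (independent of any interpretation): the singlet-state correlation E(a, b) = −cos(a − b) and the CHSH combination, which reaches 2√2 and so exceeds the local-hidden-variable bound of 2. The measurement angles below are the usual maximizing choice.

```python
import math

# Quantum correlation for the spin-singlet state measured along
# in-plane directions a and b: E(a, b) = -cos(a - b).
def E(a, b):
    return -math.cos(a - b)

# Standard CHSH settings that maximize the quantum value.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
# S = 2*sqrt(2) ≈ 2.828, above the local-hidden-variable bound of 2.
```

Any local hidden-variable model must keep S ≤ 2, which is the mathematical content behind several of the positions debated below.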
My position on these:
I believe that classical physics is a very useful approximation to a more fundamental quantum physics which applies at every length scale. The connection has been worked out in part by Gell-Mann and Hartle, and by Omnès, but much more could be done.
I don’t think decoherence by itself resolves the quantum measurement problem. On the other hand, I consider it valuable for understanding how classical physics arises out of quantum mechanics, and hope that further research will provide a more precise understanding of decoherence.
In my opinion consistent histories provides a much more satisfactory solution than Everett (and many worlds) to the quantum mysteries.
I consider this a dead end. Experiments have not been kind to Bell inequalities, and consistent histories have put an end to the (supposed) nonlocality that pervades Bohmian mechanics.
I have published various papers in the technical quantum information literature, and at least one paper in the quantum foundations literature, on what I believe to be the proper quantum mechanical way (using consistent
histories) to extend Shannon’s information theory to the quantum domain.
The earlier enthusiasm in the foundations community for unraveling quantum paradoxes using information theory seems to have run into trouble and I am not surprised: they had no answer to Bell’s question: “Information ABOUT WHAT?”
I think the quantum world is local, and have published a long paper, which has not yet been challenged (perhaps because no one has read it!), demonstrating this. I think Bell was wrong.
The fundamental formulation of quantum mechanics given by consistent histories makes no mention of measurement, and applying it to the standard von Neumann measurement model disposes (in my opinion) of Schrodinger’s cat, and tells one that the experimentalists are right when they believe their apparatus measures what it was designed to do. So I regard this problem as solved, even though no doubt more could be said.
+probabilities in quantum mechanics
The consistent histories approach was initiated in order to deal with probabilities in the quantum context, and I think it has solved the problem–while I allow that there may be alternative or better approaches.
+relativity and quantum mechanics
1. Special relativity. If Gell-Mann and Hartle had thought there was any conflict between special relativity and decoherent histories (their version of consistent histories) they would probably never have published their work.
I have confirmed the compatibility of the two using my own approach. There is no difficulty once one has a consistent way of introducing probabilities in quantum theory.
2. General relativity. This is outside my domain of competence, so I am hoping that Jim Hartle will come up with a solution …
The problems which motivated the development of GRW and related proposals have, in my opinion, been solved by consistent histories, so I am no longer concerned about them. Of course, any genuine and confirmed deviation from standard quantum mechanics would be worth a trip to Stockholm, but I doubt that this one is going to materialize.
My answer is that it is in general stochastic, though in particular circumstances one can think of it as deterministic.
As indicated at a previous workshop, the most common use of the quantum wavefunction is to calculate probabilities, so it is a ‘pre-probability’ in my terminology, though in some circumstances it can indicate a quantum state (which is best regarded as the corresponding ray in Hilbert space).
April 6, 2015 at 8:58 am #2212
There is indeed considerable confusion about the most pressing problems of quantum theory and, depending on them, their possible solutions. It is even more unfortunate that this confusion seems to be accompanied by a certain amount of prejudice (for or against some kinds of proposals). Since it seems that statements starting with ”I believe that …” are not very helpful in this debate, let me suggest first to categorize different existing proposals. Everybody may then decide (and tell us) to which category his own proposal belongs (in many cases that may be obvious), or, alternatively, explain why he would dismiss some of these categories for being ”not at all promising” or even inconsistent. I think that every individual proposal that does not explicitly postulate the superposition principle (that is, the origin of the wave function in configuration space and beyond) as part of its kinematics should at least indicate how it would justify the well established general applicability (or possible limits) of this most important principle of quantum mechanics. (Or is this already a prejudice?). Similarly, it should indicate its position regarding non-locality if this is not obvious.
The following is meant as a rough suggestion to begin with, and it can, of course, be subject to change or an introduction of subcategories (but I would prefer not to go into too much detail at this point):
1. Ontic wave function postulated as the exclusive kinematics (psi-complete)
a) collapse of the wave function (modified dynamics)
b) Everett (universal unitarity)
2. Classical kinematics (ontic particles and fields) under novel (non-local) dynamics
a) pilot wave (psi-ontic but not psi-complete)
b) other novel (such as stochastic) dynamics
3. New kinematics (”hidden variables”)
a) including the wave function (psi-ontic but not psi-complete again, but not just classical variables as in 2a)
b) excluding the wave function (psi-epistemic: psi to be explained by ensembles or else)
b1) nonlocal dynamics
b2) nonlocal kinematics other than wave function
b3) both nonlocal
4. Denial of any underlying microscopic reality (pragmatic approach only, that is, ”shut up and calculate”?)
5. Anything else (such as no concepts of kinematics and dynamics any more)
It may be too simple yet (or incomplete), but that question may become part of the debate. For the historians among us it may also be fun (and probably not always trivial) to classify great quantum physicists of the past (or their versions at various times) according to this or a similar scheme. However, it seems that we cannot decide between all these possibilities without any novel empirical evidence. Are they, therefore, presently no more than a matter of taste?
April 7, 2015 at 5:09 pm #2217
I agree with Zeh, as I posted elsewhere http://www.ijqf.org/archives/2144, that “It is even more unfortunate that this confusion seems to be accompanied by a certain amount of prejudice (for or against some kinds of proposals).” I also agree “that we cannot decide between all these possibilities without any novel empirical evidence.” So, my particular “prejudice” is that interpretations lead to new physics, e.g., new approaches to quantum gravity, which can then be tested. Otherwise, it is really just “a matter of [metaphysical] taste.”
As for where to classify Relational Blockworld per Zeh’s taxonomy, I would start with 3b, as a realist psi-epistemic account http://www.ijqf.org/archives/2087. I’m not sure where its adynamical global constraint is located within his further subcategories. Perhaps Dieter would make that assessment for us?
April 15, 2015 at 1:12 am #2239
With all due respect to Carlo (whose work on emergence I find very interesting): as I have noted in reply to Mark’s post under the upcoming workshop, it is certainly not necessarily a “mistake” to take the quantum state as ontic just because people have not succeeded in solving the measurement problem with the usual approaches to QM. I do hope we can refrain from casting aspersions on other points of view by calling them ‘prejudices’ or ‘confusion’. I certainly don’t agree with some approaches, but I try to adduce specific logical arguments when criticizing them, and methodological considerations in proposing my own approach and suggesting why I think it is a better way than others. Clearly there is a huge diversity of views on how to interpret QM, but I think we can do better than using pejoratives and (at least borderline) ad hominem descriptions to characterize those with which we do not agree.
Also, I have to note regarding Dieter’s proposed taxonomy that one does not need to modify the basic quantum dynamics to account for collapse. So one needs a 1(c): collapse in a direct-action picture in which absorption precipitates collapse (no ad hoc modification to the basic Schrodinger evolution).
April 16, 2015 at 10:06 am #2241
Thanks, Ruth, your second paragraph is precisely what I tried to suggest and what hopefully will help us not to misunderstand each other. Can you recommend a paper (or part thereof) where the transactional model is generally defined (if possible without relying on pedagogical examples a la Wheeler-Feynman, that is, without using classical particles and fields – as you seem to indicate)? I am not yet sure that the model can be dynamically consistent – but that should not be discussed at this point.
I have not sufficiently understood Mark Stuckey’s proposal in order to criticize it or to suggest an improvement of his own classification (so I hope this may help us to better understand him). For all solutions belonging to 3b, I would, in particular, like to understand explicitly how they explain the general success of the superposition principle (including wave mechanics).
Another model for which I would like to know precisely where its proponents would put it in my taxonomy is “consistent histories”.
If you like, you may subdivide 1a (collapse theories) as
a1) Pearle, GRW
a2) gravitationally induced (Penrose, Diosi)
a3) mind induced (Wigner)
You may have noticed that I did not explicitly list decoherence, since in my opinion it belongs to 1b if regarded as a solution of the measurement problem (although it also has to be taken into account as a partial but unavoidable and important step in other psi-ontic proposals).
April 17, 2015 at 3:31 am #2242
Thanks Dieter for your question. My development of TI (as PTI) is based on the Davies theory, a direct-action version of QED:
Davies, P. C. W. (1970). “A quantum theory of Wheeler-Feynman Electrodynamics,” Proc. Cam. Phil. Soc. 68, 751.
Davies, P. C. W. (1971). “Extension of Wheeler-Feynman Quantum Theory to the Relativistic Domain I. Scattering Processes,” J. Phys. A: Gen. Phys. 4, 836.
Davies, P. C. W. (1972). “Extension of Wheeler-Feynman Quantum Theory to the Relativistic Domain II. Emission Processes,” J. Phys. A: Gen. Phys. 5, 1025-1036.
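The key identity in these Davies papers is a decomposition of the Feynman propagator. In one common convention (signs and factors of i differ between references, so this is a sketch of the structure rather than a statement of any one paper's normalization) it reads:

```latex
% \bar{D} is the time-symmetric Green function; D^{(1)} is a homogeneous
% (on-shell) solution of the free field equation. Convention-dependent
% signs/factors of i are not guaranteed here.
\[
  D_F(x) \;=\; \bar{D}(x) \;+\; \tfrac{i}{2}\,D^{(1)}(x),
  \qquad
  \bar{D}(x) \;=\; \tfrac{1}{2}\bigl(D_{\mathrm{ret}}(x) + D_{\mathrm{adv}}(x)\bigr).
\]
```

Only the time-symmetric part is a true Green function; the homogeneous piece carries the on-shell content, matching the identification of virtual (internal-line) and real (external-line) photons in the direct-action picture.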
Davies notes that the Feynman propagator can be decomposed into a time-symmetric part and a singular part. I identify the time-symmetric part as describing internal lines (virtual photons) and the singular part as external lines (real photons). Real photons (Fock states or offer waves in TI) prompt an absorber response (‘confirmation’) while the virtual photons do not. (It is the response from absorbers that transforms the time-symmetric propagator into the Feynman propagator.) This is discussed in http://arxiv.org/abs/1312.4007
April 17, 2015 at 1:54 pm #2245
Thanks, Ruth – although I did not recognize a universal ontic wave function that explains definite measurement outcomes.
April 17, 2015 at 6:37 pm #2246
In TI the quantum state is ontic, but quantum states describe offer waves of specific micro-degrees of freedom, i.e., excited states of the underlying field. Since collapse occurs when an offer wave is absorbed–at the micro-level–there is no universal quantum state. A quantum state is not the correct description of the spacetime universe, since the universe is not a quantum object. This explains the emergence of the macro-level (in my interpretation, spacetime itself) as objects (clusters of actualized spacetime events) supervenient on actualized transactions (collapses). For a review of this point about spacetime emergence, and relevant references, see http://transactionalinterpretation.org/2015/03/10/a-unified-account-of-relativistic-and-non-relativistic-quantum-theory/
This is also discussed in my CUP book: http://www.cambridge.org/us/knowledge/discountpromotion/?site_locale=en_US&code=L2TIQM
And in conceptual terms in my new ICP book: http://www.worldscientific.com/worldscibooks/10.1142/p993
I should add that I certainly agree that decoherence occurs (as a deductive consequence of QM) in objects to which quantum states legitimately apply (such as bound states of fields, e.g. atoms). But I disagree that it is needed for a solution to the measurement problem in PTI, in which collapse generates spacetime events (classical phenomena).
Thanks again for your interest.
April 19, 2015 at 12:59 pm #2250
We’d be interested in how you classify RBW per your taxonomy, Dieter. Along those lines, Silberstein and I will try to explain the principle of superposition per RBW, as an example of your 3b (hidden variable, psi-epistemic) classification. [Sorry, we’re not sure how to do this for a general 3b case.]
In our view, the fundamental ontological entity is a 4-dimensional spacetimesource element that corresponds to a particular experimental configuration from beginning (emission event) to end (detection event). The game of physics is then to find the probability amplitude for the 4D distribution of spacetimesource elements. Naturally, the path integral is our choice for computing this probability amplitude and, obviously, there is no superposition of possible outcomes in this God’s eye (4D) view because the outcome is known. But we could collect all possible outcomes with their amplitudes and write them collectively as a superposition state (called the wave function). This would be the natural way to think about an experiment as 3D time-evolved beings, since we don’t know which outcome will obtain in any given trial of the experiment. Further, the amplitudes might be time dependent, given that we might not know exactly when the outcomes will occur. Of course, one uses the Schrödinger equation (SE) for obtaining this wave function, and it can be derived as a ‘time foliation’ of the path integral.
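As a toy illustration of the bookkeeping just described (this is not RBW's actual amplitude rule, which involves a graphical adynamical global constraint; the lattice, free-particle action, and parameters below are arbitrary choices for the sketch), one can compute "the amplitude between a fixed emission event and a fixed detection event" by summing exp(iS) over all lattice paths with both endpoints clamped:

```python
import itertools
import cmath

def amplitude(x_src, x_snk, n_steps, sites, dt=1.0, m=1.0):
    """Sum exp(i*S[path]) over all lattice paths from a fixed source
    to a fixed sink (both boundary conditions imposed up front)."""
    total = 0.0 + 0.0j
    # Interior points range over the lattice; endpoints are clamped.
    for interior in itertools.product(sites, repeat=n_steps - 1):
        path = (x_src,) + interior + (x_snk,)
        # Discretized free-particle action: sum of (m/2)(dx/dt)^2 * dt.
        S = sum(0.5 * m * (path[k + 1] - path[k]) ** 2 / dt
                for k in range(n_steps))
        total += cmath.exp(1j * S)
    return total

sites = [-1, 0, 1]
amp = amplitude(0, 1, 3, sites)
```

Because the action depends only on squared coordinate differences, reversing every path is a bijection that preserves S, so the amplitude from source to sink equals the amplitude with the endpoints exchanged: the "God's eye" sum is indifferent to a dynamical time direction.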
Does this adequately explain the principle of superposition as it is understood in our particular 3b case? Or have we missed your point?
April 20, 2015 at 9:42 am #2251
Mark, I don’t understand your model sufficiently (and it would not be my job) to define its meaning and intentions, but you have already indicated what you may have in mind. (That’s enough for a tentative classification.) So I am not yet ready for a discussion in detail, but I wonder how you describe or explain your “events” (and their probabilities) within the fundamental dynamics of your model. Are they (explicitly or tacitly) assumed to be determined by your hidden variables? Perhaps I will find out myself when I read more carefully.
April 20, 2015 at 9:53 am #2252
One more re-Mark: If you presume the path integral, you certainly presume the wave function (or the SPP). But what is its interpretation if you also presume hidden variables?
April 21, 2015 at 11:22 am #2254
The hidden variables in RBW are the (graphical) spacetimesource element and the adynamical global constraint. We take a God’s eye (4D) view, so our hidden variables don’t “bring about events.” Per Geroch,
“There is no dynamics within space-time itself: nothing ever moves therein; nothing happens; nothing changes. In particular, one does not think of particles as moving through space-time, or as following along their world-lines. Rather, particles are just in space-time, once and for all, and the world-line represents, all at once, the complete life history of the particle.”
So, in the God’s eye view, we’re just trying to explain the 4D patterns, e.g., the relative frequency of occurrence for spin up and down outcomes in the many trials of the 3-particle GHZ experiment. Thus, each spacetimesource element relating source (emission event) to sink (specific detector events) in the 4D experimental configuration has a probability amplitude providing its frequency of occurrence in the overall 4D pattern of outcomes for that particular experiment. The adynamical global constraint provides a rule for computing that probability amplitude in the context of the path integral formalism (think lattice gauge theory, since the spacetimesource element is graphical). I wouldn’t call this probability amplitude a “wave function,” since it’s computed via the path integral using future boundary conditions (specific outcomes), not the SE. But, that’s semantics. There is a conceptual overview in the Introduction of the paper (9 pp).
April 21, 2015 at 8:08 pm #2257
Mark, I hope your proposal and its relation to the empirical world and to our generally quite successful concepts and theories (including quantum mechanics) will be further discussed and explained in the main workshop.
Regarding its classification, my “hidden variables” were certainly meant to be understood in a causal picture (which you seem to be calling a prejudice), while my point 5 mentions the case of “no concepts of kinematics or dynamics any more”. This may fit better, although we may also explicitly invent a category of “acausal (and/or retrocausal) interpretations”. But I don’t want to dominate this debate about classification – so let me shut up for a while!
June 29, 2015 at 12:50 am #2438
The most pressing problem of quantum theory is to find a consistent modification of quantum mechanics (QM) satisfying the following conditions:
(i) in the modified QM the derivation of the Bell inequalities is not possible;
(ii) in the modified QM locality is an axiom;
(iii) the experimental consequences of the modified QM are the same as those of the standard QM;
(iv) in the modified QM it is possible to give a local explanation of the EPR correlations.
The possible solution to this problem is given by the modified QM (modQM) introduced in (https://ijqf.org/wp-content/uploads/2015/06/201503.pdf or http://vixra.org/pdf/1503.0109v1.pdf).
The main point is the rejection of the so-called von Neumann axiom: each ensemble which is in a pure state is homogeneous (equivalently: all elements of this ensemble are in the same individual state). In modQM we postulate the opposite, anti-von Neumann axiom: any two different individual states must be orthogonal.
July 13, 2015 at 9:45 pm #2695
Quantum theory is about ensembles and about the individual outcomes of measurements on systems. But I think that in quantum mechanics there are fewer real events than in classical mechanics, while there are more real events than in classical statistical mechanics. The reason is the fact that quantum mechanics is a reversible probabilistic theory while classical statistical mechanics is an irreversible probabilistic theory. This is the main difference! Thus quantum mechanics has features that cannot be found in classical statistical mechanics. Details of my opinions can be found in https://ijqf.org/wp-content/uploads/2015/06/201503.pdf.
Yours, Jiri Soucek
July 13, 2015 at 9:55 pm #2698
You have written “how shall we understand quantum correlations as in EPR-Bohm?”. This is the basic question. If any local quantum mechanics is to be possible, it must first of all give a local explanation of the EPR correlations! Of course, the first goal is to exclude the derivation of the Bell inequality. But a local explanation of the EPR correlations is necessary! I tried to do this in http://vixra.org/pdf/1502.0088v1.pdf but this needs to be checked. The goal of creating a local quantum mechanics is the main problem of quantum physics!
Yours, Jiri Soucek
July 13, 2015 at 10:20 pm #2704
You have written “Is measurement essential to any formulation of quantum mechanics, or can measurements be described using fundamental quantum principles that make no reference to measurements?”. I comment that if measurement is to be part of quantum theory, then you will need something like the concept of observation. Observation means that some “observable” systems can be observed and their individual states can be “observed”. This is clear for classical systems. Without observation the measurement problem cannot be solved. The possibility of observing the individual state of some systems is necessary for any solution of the measurement problem. I have tried to solve this in https://ijqf.org/wp-content/uploads/2015/06/201503.pdf . I think that the measurement problem can be solved only in a theory like the modified quantum mechanics. In the standard quantum mechanics there has been no progress in the solution of this problem.
Yours, Jiri Soucek
July 13, 2015 at 10:37 pm #2705
You have written “Can quantum mechanics and special relativity be combined into a single coherent theory?” But this is necessary! Reality cannot be inconsistent! The microworld is described by quantum mechanics and by special relativity. Without the consistency of these two theories quantum theory could not exist. But it does exist, and this means that in real applications QM and special relativity are mutually consistent. To achieve this consistency is the task for theorists. But this also means that quantum nonlocality (or Bell nonlocality) is impossible! The problem is how to find a way for quantum theory to be local. I have tried to do this in the paper attached to this note.
Yours, Jiri Soucek
July 13, 2015 at 11:04 pm #2707
You have written “The consistent histories approach was initiated in order to deal with probabilities in the quantum context, and I think it has solved the problem–while I allow that there may be alternative or better approaches.”
I think that a consistent description of probabilities in quantum mechanics needs more. A probability approach to quantum mechanics must first explain why it is possible that real physical outputs depend on the “which way” information! This means that real physics depends on things like information! This is shocking! I have been trying to do this in arXiv:1008.0295. In this approach I have tried to build an extended probability theory appropriate for describing quantum mechanics. I hope I have arrived at the point of understanding the core of the problem. But this theory is not complete. The details can be found in http://arxiv.org/pdf/1008.0295.pdf.
July 14, 2015 at 3:48 am #2718
It is not necessary for quantum mechanics to be local to provide consistency with relativity.
If QM is taken as describing a pre-spacetime domain from which spacetime emerges, there is no conflict.
I provide such an account in PTI (e.g. http://www.cambridge.org/9780521764155)
July 15, 2015 at 12:31 am #2748
In reply to your #2698 and #2705:
The most accessible work I have published on EPR-Bohm is in “EPR, Bell, and Quantum Locality”, Am. J. Phys. 79 (2011) 954; arXiv:1007.4281. An additional and more technical treatment of nonlocality is in “Quantum Locality,” Found. Phys. 41 (2011) 705; arXiv:0908.2914. In these papers, as in Chs. 23 and 24 of my book CONSISTENT QUANTUM THEORY, I expose Bell’s error, which has led many to the incorrect conclusion that quantum theory is nonlocal in the sense of mysterious superluminal influences. I showed many years ago how quantum mechanics interpreted using consistent histories is consistent with special relativity; see “Consistent resolution of some relativistic quantum paradoxes”, Phys. Rev. A 66 (2002) 062101; arXiv:quant-ph/0207015.
In reply to your #2704
If one accepts that science should be based on observations and experiments and measurements, then of course any fundamental physical theory should allow for them, but they need not be part of the foundations, part of the axioms. Many people including me find standard quantum mechanics with ‘measurement’ used as a primitive notion, an axiom, unsatisfactory: measurements actually carried out in the laboratory should be describable using quantum concepts which make no reference to them. This is done in Chs. 17 and 18 of my book. Observations of the sort made by astronomers can be treated in a similar way. I do not think it helpful to replace ‘measurement’ by ‘observation’ as a fundamental concept: one is simply renaming the difficulty.
In reply to your #2707 (extended probability theory)
The consistent histories approach employs Kolmogorov probabilities and, so far as we know at present, resolves all quantum paradoxes. Before proposing alternatives I think you should take a look at what has already been successfully done using ordinary probability theory.
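As a minimal numerical sketch of what "Kolmogorov probabilities over a quantum sample space" amounts to (an illustration only, not the consistent-histories formalism itself; the observable H and state psi below are arbitrary choices), one can check that a projective decomposition of the Hilbert-space identity yields, via the Born rule, an ordinary probability distribution:

```python
import numpy as np

# Build a projective decomposition of the identity from the eigenvectors
# of an arbitrary Hermitian "observable" (illustrative choice only).
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
H = A + A.T                                    # Hermitian matrix
_, vecs = np.linalg.eigh(H)                    # orthonormal eigenvectors
projectors = [np.outer(v, v) for v in vecs.T]  # rank-1, mutually orthogonal

# The projectors sum to the identity: this plays the role of a sample space.
assert np.allclose(sum(projectors), np.eye(4))

# Born-rule weights for a normalized state give nonnegative numbers that
# sum to 1 -- an ordinary (Kolmogorov) probability distribution.
psi = np.ones(4) / 2.0
probs = [float(psi @ P @ psi) for P in projectors]
```

The point of the sketch is that once a single decomposition (framework) is fixed, nothing beyond standard probability theory is needed.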
Bob Griffiths
July 15, 2015 at 9:35 pm #2781
Thank you for your comments.
But I cannot agree with your position with respect to probability theory. The Kolmogorov probability theory cannot be the model for QM since it does not offer any possibility for reversible time evolution. The evolution in the standard probability theory is strictly unidirectional. This is exactly the advantage of the extended probability theory: it offers a natural way to describe time-reversible evolution. I am sure that representing QM as a probabilistic theory requires the use of the extended probability theory.
On the concept of observation: it is a fact that the axioms for the concept of observation in the modified QM are quite different from the axioms for the concept of measurement in the standard QM. In the modified QM the measurement process is a standard internal QM process of a special type. But the external observation process in the modified QM is something completely different from the external measurement process in the standard QM. In each theory the external process of observation must be defined (at least implicitly) – without this no physical information can be obtained. The external process of observation (perhaps hidden) must be assumed. In fact, the concept of observation makes it possible to cut the von Neumann chain. The second basic difference consists in the fact that measurement is related to the ensemble while observation is related to the individual system.
Yours, Jiri Soucek
July 16, 2015 at 6:59 pm #2804
Dear professor Zeh,
I would like to comment on your reply #2212.
You have written “I think that every individual proposal that does not explicitly postulate the superposition principle … should at least indicate how it would justify the well established general applicability … of this most important principle of quantum mechanics.” My comment: in fact, there are two superposition principles. The individual superposition principle applies to individual states (i.e. states of individual systems, i.e. ontic states), and the collective superposition principle applies to collective states (i.e. states of ensembles, i.e. epistemic states). In the psi-ontic situation the collective superposition principle implies the individual superposition principle, but this may be false in a situation which is not psi-ontic. (In such a situation your statement might be partially a prejudice.) The collective superposition principle is generally considered to be true. My proposal (http://vixra.org/pdf/1503.0109v1.pdf), the modified QM, is not psi-epistemic but psi-hybrid, i.e. ontic-epistemic, which means that some wave functions describe individual (ontic) states (typically the individual states form an orthogonal basis of the Hilbert space) while other wave functions describe collective (epistemic) states. In this situation the collective superposition principle holds, while the individual superposition principle does not hold (in fact, the anti-superposition principle holds). This is a consistent position, since the experimental proofs of the superposition principle always concern ensembles. QM is a probabilistic theory and predicts only probabilities, which can be tested only on ensembles. Moreover, I think that considerations concerning individual states cannot be experimentally tested, since the standard QM and the modified QM have the same experimental consequences.
My position with respect to locality: I think that locality should be an axiom of QM (especially with respect to special relativity). I hope that I have proved that my proposal, the modified QM, is local.
My position in Your taxonomy: 3b`4 – this means 3b, psi-hybrid but not psi-epistemic and completely local. I hope I have been able to show the locality of the modified QM.
Yours, Jiri Soucek
July 16, 2015 at 10:05 pm #2814
Regarding your #2781. I do not defend the use of probability theory in quantum textbooks, which leaves much to be desired. But the consistent histories approach used a time symmetric formulation of probability in the very first paper I published on the subject in 1984, and that is still true, at least in the way I formulate it. Let me suggest you take a look at some of the items I have listed in the Consistent Histories topic, and if you want to dig deeper I can suggest relevant chapters in my book. Before you maintain that it cannot be done, let me suggest you take a look at a place where it has (I claim) been done. My approach may contain mistakes, in which case I would benefit from your pointing them out, but if not it is a counterexample to your assertion.
I do not deny that ‘observation’ may play a special role in your modified QM, but why is this an advantage if no similar concept is required in either classical physics or quantum physics interpreted using consistent histories?
Bob Griffiths
July 19, 2015 at 6:33 pm #2903
I think you use a different concept of probability theory than I do. You say that in the CH approach a time-symmetric formulation of probability is used. OK, but in my concept of probability theory this is impossible, since in my concept the evolution is strictly unidirectional. It is clear that we use different concepts of probability theory. I think this is the point.
But nevertheless, there are similarities. In the extended theory of probability the basic concept is that of a context. A context is a maximal set of mutually compatible events. In each context the extended probability reduces to the standard probability. Each experiment must be associated with a certain context: only events from this context can be observed in the given experiment. Thus the context can be understood as an analog of your framework. But I need much more time to study the CH approach.
Jiri Soucek
July 20, 2015 at 1:57 pm #2912
My use of probability theory is standard (Kolmogorov), but the sample space is quantum mechanical (projective decomposition of the Hilbert space identity), and the single framework rule is strict. It may be that your ‘context’ is similar to my ‘framework’. But my approach is time symmetric, so there must still be a difference someplace.
Bob Griffiths
May 1, 2018 at 8:12 am #4732 — editor (Keymaster)
I think one of these pressing problems is to determine what physical state is eligible to represent a measurement result. And a deep analysis of the psychophysical connection is still needed to solve this problem.
May 12, 2018 at 12:23 am #4807 — Quantum Speculations (Participant)
I still think the psychophysical connection is one of these problems. See my recent paper: