Home › Forums › 2015 International Workshop on Quantum Foundations › Panel Discussion › Are there any pressing problems?
July 18, 2015 at 1:31 pm #2867 · H. Dieter Zeh (Participant)
I am a bit surprised that most participants of this workshop argue as though they had never heard of decoherence. After all, decoherence describes all phenomena that were traditionally attributed to a collapse of the wave function – a process that was genuinely meant to solve the measurement problem. It does so (in its way) in terms of the microscopically well established unitary dynamics (that is, without any novel and hypothetical assumptions), and it has in several cases even been quantitatively confirmed. Is that an irrelevant achievement?
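As a reminder of what is meant here, the decoherence mechanism can be sketched in standard textbook notation (an illustrative schema, not tied to any particular model discussed in this thread):

```latex
% Unitary interaction correlates system states with environment states:
\bigl(c_1\,|1\rangle + c_2\,|2\rangle\bigr)\,|E_0\rangle
  \;\xrightarrow{\;U\;}\;
  c_1\,|1\rangle|E_1\rangle + c_2\,|2\rangle|E_2\rangle \;=:\; |\Psi\rangle .
% The reduced density matrix of the system alone:
\rho_S = \operatorname{Tr}_E\,|\Psi\rangle\langle\Psi|
  = |c_1|^2\,|1\rangle\langle 1| + |c_2|^2\,|2\rangle\langle 2|
  + \bigl(c_1 c_2^{*}\,\langle E_2|E_1\rangle\,|1\rangle\langle 2| + \text{h.c.}\bigr).
```

As the environment states become orthogonal, $\langle E_2|E_1\rangle \to 0$, the local interference terms vanish: the observed statistics are those of an apparent collapse, obtained from unitary dynamics alone.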
Physical theories have to be empirical, that is, they have to be able to serve practical purposes. Therefore, if a theory works “for all practical purposes” and in a consistent manner, that is almost all one may hope for. All one may try in addition is to better understand WHY the theory works, but even the absence of (or disagreement about) such a further explanation would not be an argument for dismissing a pragmatically successful theory. (Even Copenhagen was pragmatically successful, but it was not conceptually consistent.)
So I don’t feel any pressure to invent or study novel hypotheses without any novel empirical evidence.
Dieter Zeh

July 18, 2015 at 6:56 pm #2877
Dear Dieter Zeh,
I think there is one old hypothesis that has not yet been analyzed: the nonrealism option from the dichotomy nonlocality vs. nonrealism. The nonlocality option has been analyzed in many studies over the last 50 years, but there are only a few papers that analyze the nonrealism option in concrete, explicit models. I think that such an imbalance is very unfortunate. In fact, the nonlocality option has not shown much success – old problems remain unsolved. I think this neglect is a mistake, since the nonrealism option could offer new perspectives and new possibilities for solving those old problems.
I have tried to develop explicit proposals in this direction (cited in the attached notes), and perhaps something interesting can be found there.
I would like to comment on your opinion, expressed in reply #2212, that a proposal containing possible changes to some building block of QM, such as the superposition principle, should be well justified. I try to do this. In the attached notes I hope I have proved that (i) the predictions of the modified QM are the same as the predictions of standard QM, and (ii) the nonrealism option implies the necessity of abandoning the individual superposition principle (“the superposition of individual states represents an individual state”) – here the term individual state means the state of an individual system (i.e., the ontic state). This means that without the possibility of changing something in the superposition principle, the nonrealism option is not realizable. I hope that the abandonment of the superposition principle in my papers on the modified QM (i.e., the use of the anti-von Neumann axiom) can be justified in this way. Your opinion on this matter would be very helpful for me.
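For readers following the argument, the principle being abandoned can be stated schematically (my paraphrase of the formulation quoted above, not a quotation from the attached notes):

```latex
% Individual superposition principle (the axiom at issue, as quoted above):
|\psi_1\rangle,\ |\psi_2\rangle \ \text{individual (ontic) states}
  \;\Longrightarrow\;
  c_1\,|\psi_1\rangle + c_2\,|\psi_2\rangle \ \text{is an individual state.}
```

The nonrealism option, as described here, denies this implication for at least some superpositions, so that only a subset of wave functions represents individual systems.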
Yours, Jiri Soucek

July 19, 2015 at 8:54 am #2896 · H. Dieter Zeh (Participant)
I think my answer to your modified QM is contained in my last sentence above.
The empirically well established wave function that I am referring to is defined in configuration space, and hence nonlocal. If it is used consistently (as in decoherence theory), you may call it “real” (if you like). Indeed, individual states of isolated quantum systems (such as the He-atom – but also Bell states) are completely described by such nonlocal wave functions, while statistical properties occur only in connection with a (true or apparent) collapse. Von Neumann had replaced the kinematical dualism (wave/particle) by a dynamical one.
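A standard illustration of such a nonlocal individual state, in ordinary textbook notation:

```latex
% Singlet Bell state of two spatially separated spins A and B:
|\Psi^{-}\rangle
  = \tfrac{1}{\sqrt{2}}\bigl(|{\uparrow}\rangle_A\,|{\downarrow}\rangle_B
  - |{\downarrow}\rangle_A\,|{\uparrow}\rangle_B\bigr).
```

This wave function lives on the configuration space of the pair; neither subsystem possesses a pure state of its own, yet the pair as a whole is completely described.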
In Bohmian mechanics, much discussed here, the wave function is also used consistently (and was hence regarded as real by John Bell). As is well known to Bohmians, it therefore also leads to decoherence. Without the autonomous branches of the wave function that thus arise, they could not consistently speak of “empty components”. The trajectory is then merely used to define “our quantum world” (a tiny component of the global wave function) as the “occupied” branch. The precise definition of this branch is a matter of convenience. However, since the trajectory is unobservable, this selection by means of a trajectory remains a model-dependent hypothesis; the “subjective selection” of a specific branch from many, many others can be done without using any trajectories. In this – but only in this – sense I agree with the Copenhageners. As the other components would nonetheless “exist” under these assumptions, you are back at Everett!
Best, Dieter

July 19, 2015 at 5:41 pm #2902
In my study I only tried to analyze the nonrealism option using a concrete model. The nonrealism possibility has existed for 50 years, but it has not been explored. Thus the question is old; only the proposed solution is new.
Realism is equivalent to the von Neumann axiom (the wave function represents the individual state), hence in each nonrealism model the von Neumann axiom must be false, and then the individual superposition principle must also be false. I did not take the psi-epistemic position but the psi-hybrid position (some wave functions represent individual systems). In my analysis I found the surprising fact that many old problems can be solved relatively simply in the modified QM (as expected, nobody believes that this is possible).
Your expectation that this requires new empirical evidence cannot be satisfied in this case, since the modified QM and standard QM give the same predictions. The fact that two very different theories can give the same predictions is strange, but it seems to be true (i.e., these theories are empirically indistinguishable). This implies that neither the von Neumann axiom nor the anti-von Neumann axiom has any empirical consequences – but the explanatory power of the two theories is different.
There is the question of whether the psi-epistemic variant can be realized (no-go theorems). If the modified QM is consistent (I hope it is), then it is an example of a psi-epistemic model.
Your opinion that the wave function is nonlocal (I agree) is the kernel of the classical argument against von Neumann: the nonlocal wave function cannot represent an individual cat, which is local (it can perhaps represent an ensemble of cats). The classical argument is, of course, Einstein's old example.
I think that the nonrealism option should be studied seriously, since the opposite option, nonlocality, has not been fruitful.
Yours, Jiri

July 19, 2015 at 10:03 pm #2906 · Ken Wharton (Member)
As I see it, the biggest pressing problem is how to make sense of QM and GR in the same consistent framework. (I’m not sure if this counts as a “practical purpose” in most people’s accounts, but it does in mine.) Certainly, many people don’t think we need to wait for an ability to collect empirical evidence in situations where both QM and GR are relevant before we try to develop such a theoretical framework.
Now, most people don’t see this as a challenge for Quantum Foundations; they view it as a challenge for people figuring out how to quantize gravity, extending the QM/QFT formalism into GR’s domain. But the most plausible steps along this path seem to give up on the most cherished lessons of GR (choosing some global foliation, denying the reality of spacetime, etc.), and have generally failed to make it work. This raises the question of whether the key to cracking this problem is a Quantum Foundations issue after all. (Smolin mentioned this as a possibility near the end of ‘The Trouble with Physics’.)
Hardly anyone will even acknowledge there is another path towards solving this big QM/GR problem, but there clearly is. This other path starts with foundational approaches that explain quantum phenomena via some ontology that lives in ordinary spacetime. Given this starting point, GR-based approaches could in principle be applied to even entangled particles; it would not be “quantizing gravity”, it would be “spacetimeing quanta”. Bob Griffiths thinks this can essentially be done with consistent histories; Travis Norsen is trying to do this by extending lessons from Bohmian mechanics; I think it can be done with retrocausal hidden variable models. (For my account, see https://www.youtube.com/watch?v=qVUdrCooGA8 )
Quantum Gravity has gotten all the attention on this front, but it could be that the solution to this pressing problem will lie in the very foundational issues that have been discussed in this forum.

July 19, 2015 at 11:10 pm #2908
I understand your QM/GR question. But there is also a simpler problem: how to make sense of QM and SR in the same consistent framework. Up to now, QM and Special Relativity cannot be considered in the same framework, since QM is not local (at least this is the general opinion). At the moment there are not many proposals for a local QM. One of them is the Consistent Histories approach; another is the modified QM which I have proposed in the section on nonlocality and relativity. In any case, there is a preliminary problem of QM/SR to be faced before looking at the QM/GR problem.