The article by my young countrymen Lindgren and Liukko is the latest one and has received considerable attention on the web. What is proposed is clear from the title of the popular article: "A new interpretation of quantum mechanics suggests that reality does not depend on the person measuring it". I skimmed through the article. First some general arguments for why I am not enthusiastic.
- Uncertainty Principle emerges naturally once the classical description of a particle as a point of space-time is abstracted: one no longer considers point-like particles but wave functions, that is, one makes a transition from 3-space to a function space defined in it. Second quantization means a further abstraction step. In logic, second quantization has as its analog the formation of statements about statements about statements. Sticking to classical physics means a refusal to accept these abstractions: I find it hard to believe that we, with our limited cognitive skills, are the end point of cognitive evolution.
- The basic objection against a classical interpretation is the experimental support for the non-classical correlations predicted by quantum theory, which are not producible in an ordinary statistical theory such as thermodynamics.
- Statistics in 4-D Minkowski space is considered. The ordinary classical picture assuming deterministic dynamics would of course use a 3-D statistical distribution. There would be no dynamics but complete randomness. In particular, one would have no conservation laws: no energy, no momentum, etc. One however speaks of momentum and energy.
- The tricky argument, making several ad hoc assumptions, leads to an equality: the ordinary Uncertainty Principle is an inequality.
- One can also wonder how the Planck constant emerges from this picture. If one has a temperature-like parameter (note that the distribution is 4-D) - call it T - then a combination of mass m and T should define a parameter with the dimensions of Planck constant. P = sqrt(Tm) has the dimensions of momentum p. What constant X could have the dimensions of length x so that one would have XP = hbar? One must bring in hbar by hand - say by introducing X = hbar/sqrt(Tm). The analog of hbar should therefore appear in the statistical distribution as a parameter. Why should hbar appear in this distribution?
- The introduction of a probability distribution as something fundamental instead of the Schroedinger equation looks very weird to me but is understandable. The decision to stay at the lowest level of abstraction inevitably leads to ad hoc constructs.
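The non-classical correlations mentioned above can be made concrete with the CHSH test: for a spin singlet pair, quantum mechanics predicts the correlation E(a,b) = -cos(a-b) between spin measurements along directions a and b, giving a CHSH value of magnitude 2*sqrt(2), whereas any classical (local hidden variable) statistical theory is bounded by 2. A minimal numerical check, using the standard optimal angle choices (my illustration, not taken from the article):

```python
import math

def E(a, b):
    # Quantum correlation for spin measurements on a singlet pair
    # along directions a and b (angles in radians): E = -cos(a - b).
    return -math.cos(a - b)

# Standard CHSH angle choices that maximize the quantum value.
a, ap = 0.0, math.pi / 2
b, bp = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S))  # 2*sqrt(2) ~ 2.828, exceeding the classical bound 2
```

Thermodynamics or any other classical statistical description can at most reach |S| = 2, which is the content of the objection.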
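The dimensional argument about hbar above can be written out explicitly. In units with c = 1 and k_B = 1 (my choice of units, not the authors'), temperature and mass both have dimensions of mass, so:

```latex
% Dimensional check in units with c = 1, k_B = 1:
% [T] = mass, [m] = mass.
\begin{align}
  [P] &= \left[\sqrt{Tm}\right] = \sqrt{\text{mass}\cdot\text{mass}}
       = \text{mass} = \text{momentum},\\
  [X] &= \left[\frac{\hbar}{\sqrt{Tm}}\right]
       = \frac{\text{momentum}\cdot\text{length}}{\text{momentum}}
       = \text{length},\\
  XP  &= \frac{\hbar}{\sqrt{Tm}}\,\sqrt{Tm} = \hbar .
\end{align}
```

The point is that no combination of T and m alone produces a quantity with the dimensions of action: hbar must be inserted by hand as a parameter of the distribution.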
One modifies the standard unitary dynamics of the density matrix so that it leads to an eigenstate of the measured observables, characterized by a set of mutually commuting observables. The proposed modification of the dynamics for the density matrix (not for quantum states) depends on what observables are measured. What is in question is a mimicry of state function reduction.
Ironically, one does not get rid of non-determinism! One does not, however, introduce a (possibly random) dynamics of hidden variables. Randomness is assigned to the initial state, described by hidden variables and characterized by the measured observables: the non-determinism of state function reduction is replaced with that associated with the choice of the measured observables. The reduction occurs when the experiment begins! One cannot avoid the non-determinism by any trick!
By taking an average over the random variables one obtains the Born rule for the proposed modification of the dynamics, which makes sense for a finite-D quantum system. It is not clear to me whether the proposed dynamics of the density matrix can be lifted to a dynamics for the states of the system formed by the observer and the observed system. The modification also means a loss of energy conservation: this is true for all observables, since the expectation values of the observables change in the process.
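As a caricature of how averaging over a hidden variable can reproduce Born probabilities, consider a toy 2-level system in which the hidden variable simply pre-selects the measurement outcome with Born weights fixed by the initial state. This is my hypothetical illustration of the general idea, not the authors' actual construction:

```python
import random

# Toy 2-level system: initial state |psi> = c0|0> + c1|1>, measured in the
# {|0>, |1>} basis. Hypothetical illustration: the "hidden variable" is the
# pre-selected outcome, drawn with Born weights |c0|^2 and |c1|^2.
c0, c1 = 0.6, 0.8j                  # normalized: |c0|^2 + |c1|^2 = 1
p0, p1 = abs(c0) ** 2, abs(c1) ** 2

random.seed(0)
N = 200_000
counts = [0, 0]
for _ in range(N):
    counts[0 if random.random() < p0 else 1] += 1

freq = [c / N for c in counts]
print(freq)  # relative frequencies close to the Born probabilities [0.36, 0.64]
```

The non-determinism has not disappeared: it is merely pushed into the random draw of the hidden variable at the start of the experiment, exactly the point made above.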
For a summary of earlier postings see Latest progress in TGD.