Friday, October 18, 2019

Dark matter nightmare is not due to wrong equations but wrong philosophy

Sabine Hossenfelder wrote an article "Dark matter nightmare: What if we are just using the wrong equations?" (see this) suggesting that we might be using the equations of general relativity wrongly. The point is that the field equations of general relativity are nonlinear and averaging is not consistent with this non-linearity: the averaged metric does not obey Einstein's equations.

This is true, but in my view it cannot solve the problem. Far from the source region the relative error made in replacing the distance in the range [r, r+Δr] by the mean value of r and r+Δr is of order Δr/r, which is very small far from the sources. One can write a perturbation series in powers of Δr/r, and the higher contributions go rapidly to zero. One can also argue that Einstein's equations emerge only at the long length scale limit of the theory, so that the argument would not work in any case. One must replace Einstein's equations with something deeper.
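As a back-of-the-envelope check of this scaling (my own illustration, using the Newtonian 1/r falloff as a stand-in for the far field rather than any actual averaging of the metric), averaging 1/r over the range [r, r+Δr] gives

\[
\left\langle \frac{1}{r'}\right\rangle
 = \frac{1}{\Delta r}\int_{r}^{r+\Delta r}\frac{dr'}{r'}
 = \frac{1}{\Delta r}\,\ln\!\left(1+\frac{\Delta r}{r}\right)
 = \frac{1}{r}\left[1-\frac{1}{2}\frac{\Delta r}{r}
   +\frac{1}{3}\left(\frac{\Delta r}{r}\right)^{2}-\dots\right],
\]

so the corrections indeed form a power series in Δr/r and become negligible far from the source region.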

I am convinced that the problem is not at the level of numerics but much deeper and much more general. It is about the basic philosophy behind present-day theoretical physics, dominated by reductionism and physicalism. Over the years I have learned that also other fields of physics, as well as chemistry, biology, neuroscience, and consciousness research, are full of deep problems.

What I call naive length scale reductionism is the mother of all the problems that we are unable to solve.

  1. Superstring models were hoped to be the last jewel in the crown of reductionism. With one huge step we would have jumped directly to the Planck scale. It did not happen. Superstring models led to the landscape problem. Then it turned out that the landscape, hoped to explain the small value of the cosmological constant and its sign, very probably does not exist. One must accept that superstring theory predicts a huge cosmological constant with the wrong sign. In other words: superstrings are dead, since a single wrong prediction is enough to kill a theory.

  2. The wrong path was, however, chosen much earlier. GUTs, providing an extremely naive generalization of the standard model, were the fatal step preceding superstrings. The fact that GUTs could not make any testable prediction except proton decay, which was not observed, should have set alarm bells ringing. But already the standard model view about quark color is very probably wrong: for instance, the axion difficulty and the failure to describe hadrons strongly suggest this. Why the standard model gauge group is what it is, is not understood at all.

  3. In the spirit of reductionism, nuclear physics has been treated as if it were a mature branch of physics, although it is just a collection of models, each explaining some aspect of nuclear physics. About 10 years ago a very serious problem emerged: the elemental abundances deduced from the physics of the solar interior differ from those determined from the solar spectrum and from meteorites. "Cold fusion" and nuclear transmutations have been known for about a century and have been labelled as pseudoscience, but the situation has changed and even new technologies are under development. The transition from nucleons to nuclei is not understood: we do not even know what nuclei are!

  4. Atomic physics works rather nicely, but also here there are some anomalies. The transitions from atomic to molecular physics, to chemistry, to organic chemistry, and to biochemistry are not really understood. Biochemistry fails to explain the coherence of biosystems and cannot say anything about the extreme effectiveness of bio-catalysis or the magic ability of reacting biomolecules to find each other in the molecular soup. Reductionism therefore seems to be an illusion.

  5. The tragic implication of length scale reductionism is that scientists began to believe that they can solve the deep problems of their own field without any knowledge about the problems of other fields of physics. Particle physicists know only about particle physics, nuclear physicists only about nuclear physics, there is a camp of general relativists, there is a camp of superstringers,... The new information technology makes it possible to produce huge amounts of highly interesting data, but specialists close their eyes to most of it. Here is the real reason for the stagnation of theoretical physics that has now lasted for more than four decades. And worst of all, although the existing models are known to be wrong, the work with them continues: the funding must go on.

In this kind of situation we should challenge the old dogmas. We must challenge even quantum theory and General Relativity, the great achievements of the previous century.
  1. Quantum measurement theory is plagued by a paradox that forces one, in the Copenhagen interpretation, to give up ontology altogether and to regard quantum theory as a mere collection of calculational recipes. In the TGD framework one is forced to construct quantum measurement theory on the basis of zero energy ontology. One implication is that classical theory is an exact part of quantum theory, and one can understand why the quantum world manages to look classical. A deep implication is that ordinary state function reduction changes the arrow of time; for this there is now direct evidence from the experiments of Minev et al. The outcome not only solves the basic paradox of quantum measurement theory but also extends physics to a theory of conscious experience and makes radical and testable predictions. This of course means leaving the narrow confines of physicalism.

  2. General relativity is plagued by conceptual problems due to the loss of Poincare symmetries, so that the notions of energy and momentum are not well-defined even in principle. This has led to an endless variety of ad hoc proposals. Since the existence of conserved energy implies the existence of an S-matrix, this could explain the failure to quantize GRT. String models were a very promising attempt, and the identification of string world sheets as 2-surfaces allowed the identification of 4-momenta as Noether charges.

    What about replacing abstract 4-geometry with 4-surfaces in a suitably chosen higher-dimensional imbedding space with Poincare symmetries acting as isometries? Once this question is asked, it is easy to end up with TGD. Standard model symmetries emerge from this framework as a prediction of both the number theoretical vision and the twistor lift of TGD. Quark color differs from QCD color, and in the simplest scenario this difference makes it possible to regard leptons as local composites of 3 quarks, analogous to super partners of quarks. SUSY in the TGD sense is an essential element of the second quantization of fermions.

    The study of classical TGD leads to the identification of stringlike entities - cosmic strings - as basic 4-D surfaces. During cosmic expansion they thicken to magnetic flux tubes, which serve as building bricks of galaxies. The twistor lift leads to a length scale dependent cosmological constant and predicts that cosmic expansion takes place in jerks involving a reduction of the cosmological constant, implying an acceleration period followed by slowing down.

    One outcome is an explanation of galactic dark matter that automatically predicts the flat velocity spectrum of stars, as sketched in the Newtonian estimate after this list. No exotic particles are needed and there is no halo, whose existence has been challenged by data. There are also galaxies which apparently lack dark matter: also this can be understood. Galactic dark matter could actually be dark energy consisting of the magnetic and volume energy of monopole flux tubes formed from cosmic strings. Quantum classical correspondence suggests that dark energy and dark matter can be identified.

    Phases of ordinary particles with a non-standard value of the effective Planck constant heff = n·h0 behave in many respects like dark matter. The existence of these phases follows from the number theoretic vision, which in turn follows from the question: what could be the correlates of cognition? Also this question goes outside the confines of physicalism. The outcome is what I call adelic physics, and the integer n corresponds to the dimension of the extension of rationals defining the adele. This leads, according to TGD, to a dramatic breakthrough in quantum biology. The implications are seen in all scales, since quantum coherence is possible in all scales, and I have been busily deducing them.
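As a rough Newtonian sketch of why a string-like mass distribution gives a flat velocity spectrum (the estimate promised above; this is my illustration of the scaling under the idealization of an infinite straight string, not the full TGD calculation), let T denote the effective mass per unit length of the string or flux tube. Gauss's law gives a transverse gravitational acceleration falling off like 1/ρ, so circular orbits have a ρ-independent velocity:

\[
  g(\rho)=\frac{2GT}{\rho}, \qquad
  \frac{v^{2}}{\rho}=g(\rho)\;\Longrightarrow\; v=\sqrt{2GT}\,,
\]

that is, the rotation velocity of distant stars is constant without any dark matter halo; the precise numerical factor depends on the detailed profile of the flux tube.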

If some colleague bothered to learn about TGD and asked what the secret of its success is, I would give a simple answer: challenge the old dogmas! This has led to a new view about space-time, giving at the QFT limit Einsteinian space-time and Einstein's equations at long length scales. The equations at the fundamental level are very different from Einstein's equations and extremely simple, and number theory becomes part of physics. TGD forces one to give up physicalism, and reductionism is replaced by fractality. To me this shows that tinkering with the old equations leads nowhere: one must start from a much deeper level.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.
