Thursday, April 22, 2021

The rational and intuitive modes of problem solving from the TGD point of view

Reza Rastmanesh sent me a link to an article titled "The Impact of the Mode of Thought in Complex Decisions: Intuitive Decisions are Better". The following musings are inspired by this article.

As one learns from the article, problem solving and decision making seem to rely on two basic approaches, which correspond to the right-left brain dichotomy.

  1. Rational thinking is in the ideal case error free, provided the basic assumptions are correct and the data are reliable. It is however mechanical and cannot lead to "eurekas". Computers can nowadays do it more reliably than humans. In mathematics, Goedel's theorem tells us that mere rational deduction is not enough even for basic arithmetics: in a given axiomatic system there is an infinite number of non-provable truths.
  2. The intuitive approach is less reliable but can be much faster. It is holistic, based on affect rather than cold rationality, and can lead to new insights which can be deduced only afterwards, possibly only by adding some new basic assumption. In this case one can speak of a discovery.

What looks paradoxical is that besides the induction of an affective mood favoring intuitive problem solving, distraction is one way to induce intuitive thought. In the TGD framework, the interpretation would be that distraction forces one to give up the attempt to solve the problem at the level conscious to me - I am simply too stupid - and delegates the problem to a higher level of the hierarchy (layers of the magnetic body) representing a higher level of abstraction (see this) and a more holistic view. This would make it possible to solve the problem.

A real life example of the connection with distraction is in order. In a problem solving mood, I find that simple tasks of everyday life become difficult. I decide to do something, start doing it but decide to do something else at the same time, do it, and then realize that I do not remember what I had decided to do primarily, or even that I had decided to do something. I have seriously asked myself whether these are the first signs of dementia. The fact however is that this has always been the case - more or less.

My friends however tell me that there is no reason to worry: I am just what is called an "absent-minded professor". Perhaps I am indeed just absent-minded - or almost permanently distracted - but certainly never a professor if this depends on colleagues.

I have experienced many times in real life that the intuitive approach is more reliable than rational thinking when one must make decisions. I still find it difficult to confess that I have been cheated many times, but I must do it now. I have felt from the beginning that this was happening, but my rational mind has forced me to believe that this is not the case. I have not wanted to insult the swindlers by somehow suggesting that I am not quite sure about their real motives.

Sleeping over the night would be a basic example of this delegation of the problem to a higher intelligence. From personal experience, sleeping over the night is for me almost the only manner of getting new ideas and solving problems which do not reduce to mere mechanical calculations. Often the problem and its solution pop up simultaneously during the morning hours, and going to the computer makes it possible to write out the details. The attempt to solve a problem by hard thinking later during the day does not lead anywhere.

An example of this relates to my own work. When some new idea has emerged, I have sometimes given it up after some rational thought. Later it has however turned out that the idea made sense after all, but for different reasons than I had thought.

A concrete example relates to dark matter identified as h_eff = n×h_0 ≥ h phases of ordinary matter at the magnetic body in the recent TGD based model. The problem was the following.

Blackman and many others observed in the seventies that ELF radiation in the EEG range has strange effects on the behavior of vertebrates, visible also physiologically. These effects looked quantal. This however does not make sense in standard quantum theory since the photon energies are incredibly small, far below the thermal energies. For this reason the mainstream refused to take the effects seriously and they were forgotten.

  1. My first proposal was based on the notion of many-sheeted space-time. Perhaps the photons and the ions responding to them were at space-time sheets at which the temperature is extremely low, so that the thermal objection does not bite.
  2. Then I arrived at a different idea. Perhaps the value of Planck constant varies and one has a very large value h_eff = n×h_0 of the effective Planck constant. n would correspond to the number of identical space-time sheets of the space-time surface regarded as a covering space. This led to a nice theory, and later I could deduce it from a number theoretic vision unifying real and various p-adic physics into adelic physics, describing correlates of both sensory experience and cognition.

As I thought about this during last night, a question popped up. Could the original approach be correct after all? Could the h_eff approach be wrong? This would destroy 15 years of work: horrible! Or could the two approaches be consistent? This turned out to be the case!

  1. The temperature at the flux tubes and flux quanta of the magnetic body (MB) is in general below the Hagedorn temperature T_H dictated by the flux tube thickness: the reason is that the number of geometric degrees of freedom is infinite. A flux tube behaves in good approximation like a string, and the notion of T_H emerged in string models. For instance, in living matter T_H corresponds to the physiological temperature, around 37 degrees Celsius for humans.
  2. T_H is associated with dark matter with h_eff = n×h_0, where n is the number of space-time sheets of the covering. T_H characterizes the n-sheeted structure. What is the temperature at a single sheet of the covering?
  3. Thermal energy is proportional to the temperature. For an n-sheeted structure, the additivity of thermal energy over the n identical sheets gives T_H = n×T_H(sheet), implying

    T_H(sheet) = T_H/n.

    For the huge values of h_eff, and thus of n, T_H(sheet) is indeed extremely small! The original explanation is consistent with the number theory based explanation, as the numerical sketch below illustrates. Trust your intuition! But be cautious!
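To make the scaling concrete, here is a minimal numerical sketch; the value of n is purely illustrative, not a TGD prediction:

```python
# Minimal sketch of the scaling T_H(sheet) = T_H/n for an n-sheeted covering.
# The value of n below is purely illustrative, not a prediction.
k_B = 1.380649e-23   # Boltzmann constant, J/K

T_H = 310.0          # Hagedorn ~ physiological temperature, K
n = 1e13             # illustrative value of n = h_eff/h_0

T_sheet = T_H / n    # temperature at a single sheet of the covering
print(f"temperature per sheet:    {T_sheet:.1e} K")
print(f"thermal energy per sheet: {k_B * T_sheet:.1e} J")
```

With n of this order the single-sheet temperature drops to the picokelvin range, which is the sense in which the old low-temperature explanation survives.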

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Tuesday, April 13, 2021

Does muon's anomalous anomalous magnetic moment imply new physics?

Lepton universality predicts that the magnetic moments of leptons should be the same apart from corrections due to their different masses. Besides the magnetic moment predicted by the Dirac equation, leptons also have an anomalous magnetic moment, which is predicted to come from various radiative corrections.

The standard model predictions for the anomalous magnetic moments are a_e = (g_e-2)/2 = 0.00115965218091 for the electron and a_μ = (g_μ-2)/2 = 0.00116591804 for the muon.

The anomalous magnetic moments of the electron and the muon thus differ by about 0.5 per cent. This breaking of universality is however due to the different masses of the electron and the muon rather than to different interactions.
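The figure can be checked directly from the two values quoted above:

```python
# Relative difference of the quoted standard model values for a_e and a_mu.
a_e  = 0.00115965218091   # electron
a_mu = 0.00116591804      # muon

print(f"{100 * (a_mu - a_e) / a_e:.2f} per cent")  # ~0.54 per cent
```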

1. The finding of the Fermilab experiment

The breaking of universality could also come from interactions, and the Fermilab experiment (see this) together with earlier experiments suggests this. The experiment shows that in the case of the muon the magnetic moment differs from the predicted one: the deviation from the standard model prediction is about 2.5×10^-4 per cent. This indicates that there might be interactions violating lepton universality. Besides the problem with the muon's magnetic moment, which differs from that of the electron, there is also a second problem. The decays of B mesons seem to break the universality of fermion interactions: indications for this breaking have emerged over the years, so it is not new.

The measurement result involves various sources of error, and one can estimate the probability that the outcome is due to a random fluctuation. The number of standard deviations tells how far the measurement result is from the maximum of the probability distribution, the standard deviation - essentially the width of the distribution - serving as the unit. For instance, 4 standard deviations corresponds to a probability of about 0.006 per cent that the result is a random fluctuation. For 5 standard deviations the probability is about 0.00006 per cent; this is regarded as the discovery limit.
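For the record, here is a small sketch converting standard deviations to the corresponding Gaussian tail probability; the two-sided convention is used here, and since conventions vary between analyses, the numbers are indicative:

```python
# Two-sided Gaussian tail probability for n standard deviations.
from math import erfc, sqrt

for n_sigma in (2, 3, 4, 5):
    p = erfc(n_sigma / sqrt(2))   # probability of a fluctuation this large
    print(f"{n_sigma} sigma: {100 * p:.1g} per cent")
# 4 sigma -> ~0.006 per cent; 5 sigma -> ~0.00006 per cent (discovery limit)
```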

2. Theoretical uncertainties

There are also theoretical uncertainties related to the calculation of the magnetic moment. There are 3 contributions: electroweak, QCD, and hadronic. The electroweak and QCD corrections are "easily" calculable. The hadronic contributions are difficult to estimate since perturbative QCD does not apply at hadronic energies. There are groups claiming that their estimate of the hadronic contributions produces a prediction consistent with the Fermilab finding and with the earlier findings.

The prediction based on the experimentally deduced R ratio, characterizing the rate for the decay of a virtual photon to a quark pair, allows one to estimate the hadronic contribution, and it gives a prediction which is in conflict with the experimental findings. On the other hand, the calculations based on lattice QCD give a result consistent with the experimental value (see this). Should one trust experiment or theory?

3. Is a wider perspective needed?

In my opinion, one should see the problem from a bigger perspective than as a question about how accurate the standard model is.

  1. The standard model does not explain fermion families. Also GUTs fail in this respect: the mass ratios of fermions vary over a range spanning 11 orders of magnitude. This is not a small gauge symmetry breaking but something totally different: mass scale is the appropriate notion, and the p-adic length scale hypothesis provides it.
  2. One must also challenge the belief that lattice QCD can describe low energy hadron physics. There might be much deeper problems than the inability to compute the hadronic contributions to g-2. Perturbative QCD describes only high energy interactions, and QCD might exist only in the perturbative sense. The fact is that a theory of low energy hadron physics is virtually non-existent. Saying this aloud of course irritates lattice QCD professionals, but the reduction of QCD to thermodynamics in Euclidian space-time looks implausible to me. There are deep problems with the Wick rotation.

    For instance, the massless dispersion relation E^2 - p^2 = 0 in M^4 translates to E^2 + p^2 = 0 in E^4: massless fields disappear completely since one has only the zero mode E = 0, p = 0 (see the sketch after this list). There are similar problems with the massless Dirac equation. For the massive case the situation is not as bad as this. There is also the strong CP problem caused by instantons, and a problem with the multiplication of spinor degrees of freedom, since the 4-D lattice with periodic boundary conditions has the topology of a 4-torus and allows 16 spinor structures.

    Quarks explain only a few per cent of the hadron mass, just as ordinary matter explains only a few per cent of the mass in cosmology. Hadron physics might therefore involve something totally new, and the color interaction could differ from a genuine gauge interaction.
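The Wick rotation problem for the massless dispersion relation can be written out explicitly:

```latex
E^2 - \mathbf{p}^2 = 0 \quad \text{in } M^4
\qquad \longrightarrow \qquad
E^2 + \mathbf{p}^2 = 0 \quad \text{in } E^4 ,
```

and since the Euclidean condition is a sum of squares, its only real solution is the zero mode E = 0, p = 0: the propagating massless modes are lost.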

4. What can TGD say about the family replication phenomenon?

In the TGD framework, the topological explanation of the family replication phenomenon, identifying partonic 2-surfaces as fundamental building blocks of elementary particles, provides the needed understanding and predicts 3 different fermion generations corresponding to the 3 lowest genera: sphere, torus, and sphere with two handles (see this).

Conformal Z_2 symmetry for partonic 2-surfaces is present for the lowest 3 genera but not for the higher ones, for which one must talk about many-handle states with a continuous mass spectrum. p-Adic thermodynamics allows one to estimate the masses of the new bosons by simple scaling arguments and the Mersenne prime hypothesis.
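To make the scaling argument concrete: the p-adic mass scale is proportional to p^(-1/2) with p ≈ 2^k, so masses scale by powers of √2. The assignment of the ordinary weak bosons to the Mersenne prime M_89 and of a higher generation to M_79 in the sketch below is an illustrative assumption:

```python
# p-adic mass scale ~ p^(-1/2), p ≈ 2^k  =>  m(k2) = m(k1) * 2^((k1 - k2)/2).
# The Mersenne assignments below (M_89 for W/Z, M_79 for a higher generation)
# are assumptions made for illustration.
m_W = 80.4  # W boson mass, GeV

def scaled_mass(m_ref: float, k_ref: int, k: int) -> float:
    """Scale a mass from the p-adic scale p ~ 2^k_ref to p ~ 2^k."""
    return m_ref * 2 ** ((k_ref - k) / 2)

print(f"{scaled_mass(m_W, 89, 79):.0f} GeV")  # factor 2^5 = 32, about 2.6 TeV
```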

In the TGD framework the two findings can be seen as indications of the failure of lepton universality. Besides the 3 light fermion generations, TGD also predicts 3 generations for electroweak bosons, gluons, and Higgs. The higher generations are more massive than the ordinary weak bosons, and the p-adic length scale hypothesis allows one to estimate their masses.

The couplings of the lightest boson generation to fermions obey fermion universality (they are identical), but the couplings of the 2 higher generations cannot do so, since the charge matrices of the 3 generations must be orthogonal to each other. This predicts a breaking of fermion universality which, in the quantum field theory approximation, comes from the loops coupling fermions to the 2 higher boson generations.

This prediction provides a test for the TGD based topological view of the family replication phenomenon in terms of the genus of the partonic 2-surface: the partonic 2-surface can be a sphere, a torus, or a sphere with two handles. TGD also explains why the higher generations are experimentally absent.
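The orthogonality constraint above is easy to illustrate with hypothetical charge matrices in 3-generation space: if the lightest generation couples via a matrix proportional to the identity, the two remaining matrices must be traceless to be orthogonal to it, and traceless couplings necessarily treat the generations differently. The specific matrices below are illustrative, not derived from TGD:

```python
import numpy as np

# Hypothetical generation-space charge matrices, normalized to Tr(Qi Qj) = delta_ij.
# Q1 ~ identity gives universal couplings; orthogonality forces Q2 and Q3 to be
# traceless and hence generation-dependent.
Q1 = np.diag([1, 1, 1]) / np.sqrt(3)
Q2 = np.diag([1, -1, 0]) / np.sqrt(2)
Q3 = np.diag([1, 1, -2]) / np.sqrt(6)

Q = (Q1, Q2, Q3)
gram = np.array([[np.trace(A @ B) for B in Q] for A in Q])
print(np.round(gram, 12))   # identity matrix: the three matrices are orthonormal
```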

5. What does TGD say about low energy hadron physics?

There is also the question of whether QCD catches all aspects of strong interactions. In TGD, color magnetic flux tubes carry Kaehler magnetic energy and volume energy parametrized by a length scale dependent cosmological constant, so that a connection with cosmology indeed emerges. The reconnections of U-shaped flux tubes give rise to the TGD counterparts of the meson exchanges of old-fashioned hadron physics. See this.

The color group need not be a gauge group but could be analogous to a Kac-Moody group or a Yangian group (only non-negative conformal weights). In the TGD framework, SU(3) at the level of M^4×CP_2 is not a gauge symmetry but acts as isometries of CP_2, and fermions carry color not as an analog of spin but as an angular momentum like quantum number. At the level of complexified M^8, SU(3) is a subgroup of G_2 acting as octonion automorphisms and defines a Yangian replacing the local gauge group.

For the TGD based model see this and this.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Friday, April 09, 2021

EEG and the structure of magnetosphere

Roughly 15 years ago I proposed the idea that Earth's magnetosphere (MS) could serve as a sensory canvas in the sense that biological systems, in particular the vertebrate brain, could have sensory representations realized at the "personal" magnetic body (MB) closely associated with the MS of the Earth. EEG would make communications to and control by the MB possible.

At that time I did not yet have the idea about the number theoretical realization of the hierarchy of Planck constants h_eff = n×h_0 in the framework of adelic physics, fusing the physics of sensory experience and cognition. This hierarchy is crucial for understanding the basic aspects of living matter such as metabolism, coherence in long scales, correlates of cognition, and even evolution.

Also the concept of zero energy ontology (ZEO), forming now the basis of quantum TGD, was missing, although there was already the idea about communication to the past using negative energy signals. ZEO is now in a central role in the understanding of self-organization - not only the biological one. The new view about time, predicting that time reversal occurs in ordinary state function reductions (SFRs), allows one to understand homeostasis as self-organized quantum criticality.

For these reasons it is interesting to consider the notion of sensory canvas from the new perspective. This article discusses, besides the earlier ideas about the MS, also the proposal that it is possible to assign EEG bands to the regions of the MS via a correspondence between EEG frequency and the distance of the region from Earth, as the back-of-the-envelope sketch below illustrates. Also the idea that the structure of the MS could be a fractal analog of the vertebrate body is tested quantitatively by comparing the various scales involved.
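If an EEG frequency f is assigned to the distance of roughly one wavelength λ = c/f - the choice of the wavelength as the relevant scale is an assumption of this sketch, not the article's detailed model - the EEG bands map to magnetospheric distances as follows:

```python
# Map EEG band frequencies to distances via wavelength = c/f, in Earth radii.
# Using the wavelength as the relevant distance is an assumption of this sketch.
c = 2.998e8     # speed of light, m/s
R_E = 6.371e6   # Earth radius, m

for band, f in [("delta", 2.0), ("theta", 6.0), ("alpha", 10.0), ("beta", 20.0)]:
    print(f"{band:>5} ({f:4.1f} Hz): {c / f / R_E:5.1f} Earth radii")
```

The 10 Hz alpha band lands at roughly 5 Earth radii, well inside the magnetosphere, while the delta band reaches the scale of the magnetotail.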

See the article EEG and the structure of magnetosphere or the chapter with the same title.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.


Three alternative generalizations of Nottale's hypothesis in TGD framework

Nottale's gravitational Planck constant ℏ_gr = GMm/v_0 contains the velocity parameter v_0 as its only parameter. In the perturbative expansion of the scattering amplitudes, β_0 = v_0/c appears in the role of the fine structure constant.

There is however a problem.

  1. The model for the effects of ELF radiation on the vertebrate brain, inspired by a generalization of Nottale's hypothesis obtained by replacing the total mass M in the case of Earth by M_D ≈ 10^-4 M_E, suggests that in this case the dark particles involved couple only to a part of the mass, identifiable as the dark mass M_D.
  2. Since only GM appears in the basic formulas, an alternative option is that the value of G is reduced to G_D. This conforms with the fact that in the TGD framework the CP_2 length is the fundamental parameter: G is a prediction of the theory and can therefore vary.
  3. A further option is that the parameter β_0 = v_0/c ≤ 1 is variable and equals β_0 = 1 or a value not much smaller than 1, say β_0 = 1/2.
These three options are critically discussed and compared, and a numerical sketch is given below. The cautious conclusion is that the third option is the most plausible one.
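Here is a numerical sketch comparing the three options for an Earth-proton pair. The value M_D ≈ 10^-4 M_E comes from the text; the β_0 used for the first two options is an illustrative placeholder:

```python
# hbar_gr = G*M*m / v_0 with v_0 = beta0 * c, evaluated for an Earth-proton pair.
G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34
M_E, m_p = 5.972e24, 1.673e-27

def hbar_gr(M, beta0, G_eff=G, m=m_p):
    return G_eff * M * m / (beta0 * c)

beta0 = 1e-3  # illustrative placeholder for options 1 and 2
print(f"option 1 (M -> 1e-4 M_E): {hbar_gr(1e-4 * M_E, beta0) / hbar:.1e} hbar")
print(f"option 2 (G -> 1e-4 G):   {hbar_gr(M_E, beta0, G_eff=1e-4 * G) / hbar:.1e} hbar")
print(f"option 3 (beta0 = 1):     {hbar_gr(M_E, 1.0) / hbar:.1e} hbar")
print(f"option 3 (beta0 = 1/2):   {hbar_gr(M_E, 0.5) / hbar:.1e} hbar")
```

As the first two lines show, options 1 and 2 are numerically indistinguishable: only the product GM enters the formula.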

See the article Three alternative generalizations of Nottale's hypothesis in TGD framework or the chapter About the Nottale's formula for hgr and the relation between Planck length and CP2 length.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD. 


Friday, April 02, 2021

Does Goedel's incompleteness theorem hold true for reals?

I have wondered many times whether the incompleteness theorem extends to the real numbers, which are usually the stuff of physics. There is a very nice discussion of this point here. Strongly recommended.

The first-order theories of the real numbers and of algebraically closed number fields, such as the complex numbers and the algebraic numbers, are complete: all truths are provable. If physics is based on complex numbers or algebraic numbers, Goedel's theorem has no direct implications for physics. This however implies that the integers cannot be characterized using the axiomatics of these number fields: if this were the case, Goedel's incompleteness theorem would not hold true for integer arithmetics. One can also say that the Goedel numbers of unprovable theorems are not expressible as natural numbers but are more general reals or complex numbers.

Since the algebraic numbers are complete in this sense, a good guess is that algebraic numbers label all true statements about integer arithmetics, and also about the arithmetics of algebraic integers in extensions of rationals.

In TGD, adelic physics defines the correlates for cognition. Adeles form a hierarchy labelled by algebraic extensions of rationals (perhaps also extensions involving roots of e, since e^p is an ordinary p-adic number). These are not complete, and Goedel's incompleteness theorem applies to them. Only at the never achievable limit of algebraic numbers does the system become complete. This strongly suggests a generalization of Turing's view about computation by replacing integer arithmetics with a hierarchy of arithmetics of algebraic integers associated with extensions of rationals, as the toy sketch below illustrates. See this article.
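As a toy illustration of what an "arithmetic of algebraic integers" means computationally, here is a sketch of arithmetic in Z[√2], the algebraic integers of the extension Q(√2); it is meant only to make the idea of a hierarchy of arithmetics concrete:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Zsqrt2:
    """Algebraic integer a + b*sqrt(2) in the extension Q(sqrt(2))."""
    a: int
    b: int

    def __add__(self, other):
        return Zsqrt2(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a + b sqrt2)(c + d sqrt2) = (ac + 2bd) + (ad + bc) sqrt2
        return Zsqrt2(self.a * other.a + 2 * self.b * other.b,
                      self.a * other.b + self.b * other.a)

x = Zsqrt2(1, 1)   # 1 + sqrt(2)
print(x * x)       # Zsqrt2(a=3, b=2), i.e. (1 + sqrt(2))^2 = 3 + 2*sqrt(2)
```

Each extension of rationals defines its own such arithmetic, and the hierarchy of extensions gives the hierarchy of computations alluded to above.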

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.