https://matpitka.blogspot.com/2015/01/

Wednesday, January 28, 2015

Maintenance problem for the Earth's magnetic field

In Science Daily there was an interesting popular article about what might be called the maintenance problem for the Earth's magnetic field, which has very important functions such as serving as a shield against cosmic rays, which are very dangerous for life.

The understanding of the Earth's magnetic field, call it BE for short, is indeed still far from complete. One problem is to understand why it can remain stable at all. The idea is that the convective heat flow from the core of the Earth provides the energy needed to compensate for the dissipation. Also the understanding of the orientation reversals of BE is poor, although I remember that numerical simulations can reproduce them.

The popular article explained work by Zhang, Cohen and Haule in which the problem related to the maintenance of the convective flow is claimed to be solved. If the conductivity and thus the heat conductivity in the core is low enough, heat conduction is replaced with convection; this creates a flow of charge too, and one obtains a convective roll pattern which gives rise to the current taking care that BE is preserved. The problem is that the conductivity of the metal core is too high. The proposal is that an improved model for the conductivity taking into account electron-electron scattering cures the problem. Knowing how hype-prone science communication is nowadays, I would not take this claim as the final truth.

The problem requires study of Maxwell's equations (for explicit equations see this).

  1. The first basic equation for BE is Faraday's law stating that the time derivative of BE equals the negative curl (rotor) of the electric field. This is true in TGD too, as is also the equation stating that there are no magnetic monopoles. In TGD the CP2 topology however allows monopole fluxes, which can exist without any generating currents. The second basic equation is Ohm's law saying that the current is proportional to the electric field: the proportionality constant is the conductivity σ.

    Together these equations give a partial differential equation for BE containing a diffusion term proportional to the Laplacian of BE with a coefficient inversely proportional to the conductivity. Since finite conductivity means dissipation of energy, one can expect that in the absence of energy feed the current and the magnetic field gradually disappear. According to a naive estimate this would take only a few thousand years (see the back-of-the-envelope sketch after this list). This does not of course happen. Note however that the polarity of BE can change in a time scale varying from .1 My to 50 My.

  2. Energy is needed to maintain BE and the current generating it. The energy source would be the heat flowing from the Earth's core to the surface. If the conductivity and thus the thermal conductivity (the electron current carries also energy and thus heat) in the interior is not too high, the diffusion of heat proportional to the conductivity is not efficient enough to carry out the heat, convection sets in, and the material begins to boil. The boiling together with the Earth's rotation causes a rolling flow of charge above the core around the Earth, and this convective current generates BE and keeps it alive.

  3. The problem has been that the electron conductivity in the metal core is too high to allow convective currents. The authors of the paper claim that the existing model for the electronic conductivity in metals, involving only electron-atom scattering, must be supplemented by an additional contribution from electron-electron scattering, and that this term cures the situation. It improves the situation, but one can still remain skeptical about whether this term is really enough.

    What worries me too is that the direction of the magnetic axis can differ a lot from that of the rotation axis of the Earth. How should one understand this difference? Also the apparent randomness of the orientation reversals looks strange.
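
The following back-of-the-envelope sketch (in Python, with illustrative values chosen by me; the conductivity of the core in particular is uncertain) makes the decay estimate of item 1 concrete: without convection the diffusion term alone damps the slowest mode of the field in a time of order mu0*sigma*L^2/pi^2, which comes out geologically short, of order 10^4 years for the values used.

```python
import math

# Ohmic decay estimate for a magnetic field in a conducting sphere.
# Induction equation: dB/dt = curl(v x B) + (1/(mu0*sigma)) * laplacian(B).
# Without the convection term the slowest (dipole) mode decays in
# tau ~ mu0 * sigma * L^2 / pi^2.  Values below are illustrative assumptions.

mu0 = 4 * math.pi * 1e-7    # vacuum permeability, T*m/A
sigma = 5e5                 # assumed conductivity of the core, S/m
L = 3.48e6                  # outer core radius, m

tau = mu0 * sigma * L**2 / math.pi**2          # seconds
print(f"Ohmic decay time ~ {tau:.2e} s ~ {tau / 3.15e7:.0f} years")
```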

What does this have to do with TGD? The basic problem is still to understand the orientation reversals, and it is not quite clear that even the stability problem has been solved completely.
  1. TGD differs from Maxwell's theory in that monopole fluxes are possible and are realized as flux tubes whose cross section is a closed 2-surface carrying magnetic charge. Also flux sheets carrying monopole flux are possible. These flux quanta would also carry dark matter.

    Could the Earth's magnetic field contain also a dark contribution, call it BD, coming from monopole fluxes? Could the interaction of visible matter with dark matter be essential for maintaining BE and for its orientation reversals? Could Magnetic Mother Gaia do the orientation reversals "intentionally" in order not to lose the magnetic shield against cosmic radiation?

  2. The essential point is that monopole fluxes require no currents to generate them. This contribution would give an approximately topologically quantized variant of a dipole field with flux quanta, which could be either flux tubes or sheets, and the return flux would be along the magnetic axis. Suppose that sheets are in question. There would also be a corresponding electric field E = v× BD at the flux sheets according to the dynamo mechanism; v would be the rotation velocity of the dark particles.

    There would be a radial Lorentz force F = q v× BD driving charged dark particles radially outwards if the sign of the velocity is correct (see the sketch after this list). Could the presence of the Lorentz force causing a dark current help to initiate a radial convective current of ordinary charged particles, bringing hot matter to the surface and cooled matter downwards, and in this manner give rise to convective heat transfer?

  3. What about changes of the polarity of the Earth's magnetic flux quanta? Could they be induced by changes of the direction of the dark magnetic field at the dark flux sheets (say)? If the flux sheets of the monopole field carry a current of rotating dark charged particles, the rotation direction changes as the flux quantum changes its orientation. This guarantees that a minimal convective flux remains radial and directed towards the surface. In a full orientation reversal the angular momentum would change to its negative, and by angular momentum conservation the increment of angular momentum would go to the ordinary matter, perhaps mostly to the ordinary electrons, and generate electron currents having twice the original dark angular momentum. After this the ordinary electrons would start to dissipate the newly inherited angular momentum again.

  4. What could induce the changes of the orientation of BD? Could the two contributions to the total magnetic field be regarded as two magnetic dipoles having a dipole interaction realized as a torque proportional to the cross product of the dipole moments? This would cause orientation changes of the dipoles and thus of the fields. The torque vanishes if the dipoles are parallel, and the dipoles would gradually become parallel by dissipation for BE. This does not seem to be the mechanism.

    Could it be that the dynamics of the dark magnetic flux quanta is purely quantal and induces the dynamics of BE by angular momentum conservation? As the strength of BE becomes too weak to shield the Earth from cosmic radiation, Magnetic Mother Gaia takes the lead and turns her magnetic body to a new orientation, which by angular momentum conservation forces the ordinary electrons to rotate around the new magnetic axis, and a much brisker BE is regenerated in the new direction. Magnetic Mother Gaia would take good care of her prodigal son!
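
The geometry assumed in item 2 above is easy to check numerically. The minimal sketch below (field strength, position, and charge are purely illustrative assumptions of mine) verifies that for a charge co-rotating with the Earth in an axial dark field BD the force q v× BD indeed points radially at the equator.

```python
import numpy as np

# Geometry check: co-rotation velocity v = omega x r is azimuthal, an axial field
# B_D gives F = q v x B_D pointing radially at the equator.  Illustrative values only.

omega = np.array([0.0, 0.0, 7.29e-5])   # Earth's rotation vector, rad/s
r = np.array([6.0e6, 0.0, 0.0])         # point in the equatorial plane, m
B_D = np.array([0.0, 0.0, 5e-5])        # assumed axial dark field, tesla
q = 1.602e-19                           # proton charge, C

v = np.cross(omega, r)        # azimuthal velocity of a co-rotating charge
F = q * np.cross(v, B_D)      # Lorentz force, radial here (along +/- x)

print("v =", v, "m/s")
print("F =", F, "N")
```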

See the chapter About Strange Effects Related to Rotating Magnetic Systems of "TGD and Fringe Physics" or the article The maintenance problem for Earth's magnetic field.

Tuesday, January 27, 2015

Biochemical communications as a prerequisite for dark photon communications?

In Quantum Biology, coherence and decoherence there have been innumerable links to various hot topics in biology about which I know virtually nothing. I have managed to catch only some key notions like behavior, nutrients, nutrigenomics, nutrigenetic signaling, pheromones, hormones, and ecology. For instance, nutrients are found to have an epigenetic effect on gene expression: they affect behavior! Pheromones have effects on behavior. I understand that molecular biologists try to reduce these effects to chemical communications and biochemical pathways.

I cannot of course say anything interesting about this horribly complex molecular biology except that I believe that something immensely important could be missing: the notion of the magnetic body carrying dark matter and controlling also biochemistry. My attempts to understand rely on a conceptual framework in which the triple (magnetic body (MB), biological body (BB), environment) replaces the pair (BB, environment) of the usual approach. MBs are the intentional agents affecting other MBs or BBs and being affected by them. So I must try to understand the biological concepts in terms of these notions, which I dare to regard as physical.

The first thing I can do is to assign my pet notions to words like behavior, nutrient, pheromone, hormone, etc., and try to understand whether these horrible biochemical complexities could reflect something very simple at a deeper level.

  1. The TGD Universe obeys Zero Energy Ontology (ZEO) predicting that the basic objects can be regarded as 4-dimensional surfaces associated with pairs of 3-surfaces at the opposite boundaries of causal diamonds (CDs), which by strong form of holography can be reduced to pairs of collections of partonic 2-surfaces and their 4-D tangent space data. The basic objects remain non-local in time, however, and behavior is assigned to the time evolution of the magnetic body (MB).

    Quantum self-organisation replaces this 4-D magnetic body with a new one in each quantum jump, so that our geometric future and past are not fixed but evolve towards an asymptotic self-organization pattern. In ZEO quantum jumps define sequences of state function reductions at a fixed boundary of the CD; these are analogous to repeated measurements, which however affect the part of the zero energy state associated with the opposite boundary while keeping the part at the fixed boundary unchanged, whereas in ordinary ontology the entire state would remain unaffected. The acts of free will have as quantum counterparts the quantum jumps in which the state function reduction occurs at the opposite boundary of the CD and the arrow of geometric time changes at that particular level in the hierarchy of CDs. Self corresponds to a sequence of state function reductions at a fixed boundary and dies when the boundary changes. There is an entire hierarchy of selves corresponding to the hierarchy of CDs, and sub-selves correspond to the mental images of the self.

  2. A behavior pattern having as its correlate a 4-D magnetic body inside a given CD is a very general notion: DNA replication, transcription, translation, biochemical pathways, associations in the nervous system, our behaviors, etc. would all be induced by behavior patterns represented by 4-D magnetic bodies in the appropriate time and length scales.

  3. If an MB wants to affect behavior it must affect the MB at the lower level or the BB directly. Reconnection of U-shaped magnetic flux tubes to generate a double flux tube connection is the basic mechanism for this and is identifiable as a correlate for directed attention. Stable reconnection requires that the cyclotron frequencies of the dark charged particles at the flux tubes, and therefore also the strengths of the magnetic fields and the thicknesses of the tubes, are the same for the reconnected U-shaped flux tubes. The signalling is thus based on dark photons, which can transform to ordinary photons identified as bio-photons. Dark photons can in this manner affect biochemistry since their energy spectrum is in the energy range of the excitations of biomolecules (ranging from .5 eV (metabolic energy quantum) to visible and UV); a small numerical illustration of this bookkeeping is given after this list.

    In the general situation several frequencies are involved and serve as a kind of password. The model for musical harmony and genetic code in terms of bio-harmony, relying on icosahedral Hamilton's cycles, predicts that DNA codons and amino acids correspond to 3-chords defining what might be called bio-harmony: in fact, 256 different bio-harmonies are predicted. The corresponding frequencies can be in the range of audible frequencies, and it is quite possible that a music of dark photons is realized in biology. Each molecule could correspond to and produce its own collection of chords, maybe even a melody, somewhat like the characters in the operas of Wagner!

  4. This process would operate at all levels. Basic biomolecules are scanning their environment using these U-shaped flux tubes and reconnecting. The immune system tries to detect invader molecules using reconnection followed by a resonant exchange of dark photons, followed by the reduction of heff shortening the length of the flux tube and bringing the unlucky invader near the immune soldier to be mercilessly destroyed. This scanning can be done also with positive intentions: DNA replication, transcription to mRNA, translation, etc. are examples of this.

  5. This picture leads to an interpretation of what happens when a biomolecule attaches to a receptor. The biomolecule - say a neurotransmitter or hormone - is the end of a (potential) communication line: a plug formed by the U-shaped flux tube. When it attaches to the receptor, the owner of the receptor is plugged into the web and can send and receive dark photons resonantly. Therefore biochemical communications are at a deeper level not yet communications but only the delivery of plugs making possible real communication by dark photons.
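
A minimal sketch of the resonance bookkeeping mentioned in item 3. The field strength, the ion, and the target energy below are my own illustrative assumptions; the point is only to show how the cyclotron frequency and the required heff/h scaling are related.

```python
import math

# Cyclotron frequency of an ion and the h_eff/h needed for the dark photon
# energy E = h_eff * f_c to reach biomolecular transition energies (~0.5 eV ... few eV).

q_e = 1.602e-19     # elementary charge, C
h = 6.626e-34       # Planck constant, J*s
eV = 1.602e-19      # J per eV

def cyclotron_frequency(charge_units, mass_amu, B_tesla):
    """f_c = q*B / (2*pi*m)."""
    m = mass_amu * 1.661e-27
    return charge_units * q_e * B_tesla / (2 * math.pi * m)

B = 2e-5                               # assumed flux tube field strength, tesla
f_Ca = cyclotron_frequency(2, 40, B)   # Ca2+ as an example ion
E_target = 1.0 * eV                    # assumed biomolecular excitation energy

n = E_target / (h * f_Ca)              # required h_eff/h
print(f"f_c(Ca2+) ~ {f_Ca:.1f} Hz, required h_eff/h ~ {n:.1e}")
```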

In the following some more or less random comments inspired by this proposal.
  1. States have both armies and diplomats. The immune system is the counterpart of an army trying to detect invaders and destroy them. There must also be a system trying to find potential friends and collaborators. We indeed co-operate with bacteria, and the significance of this aspect seems to be increasingly realized. For instance, in Quantum Biology, coherence and decoherence there is a link to an article about the collaboration of an exotic sea animal with micro-organisms: the animal actively builds connections to the micro-organisms, and here too the reconnection mechanism is highly suggestive.

  2. Hormones are usually regarded as a purely chemical means of communication. I have proposed that they make possible communication using dark photon signals propagating along communication lines defined by flux tubes. The attachment of a biomolecule to a receptor in the cell membrane is for plugging in: the biomolecule in the receptor can be connected by a flux tube pair to quite specific biomolecules or magnetic bodies.

  3. The effects of psychedelics and entheogens such as the naturally occurring psilocybin (see also this) could involve even flux tube connections to distant civilizations or higher level conscious entities! This sounds of course totally outlandish but is not my original proposal;-). In ZEO the finite light velocity is not a problem since signals can travel also backwards in time. In this case the lengths of the flux tubes would be very long and extend to distant galaxies.

  4. Pheromones (see this link as an example) are like hormones except that they affect the behavior of another member of the same species (or more generally?) by inducing epigenetic influences. A female butterfly emits pheromones, the male receives them, and a connection by magnetic flux tube is established to the female's BB or MB. After that the male flies in the direction in which the connection becomes stronger. This mechanism is the same as the one used by birds in the TGD Universe to find their way from Africa to the same place in Northern Finland every year;-). Also food odors have epigenetic influences. The same plug-in model applies also to these effects: chemical signals are actually plugs connecting to the web and making possible signalling by dark photons.

  5. Also nutrients could fit nicely into this picture if they too are plugs connecting the BB to some MB rather than just a source of metabolic energy. To stay alive is to stay connected to the web;-). Magnetic zombies die soon! The MB in question could be not only the personal MB but also the dark MB of Mother Gaia, as I have suggested (see this). The explanation of the Pioneer and Flyby anomalies (see this) allows one to consider a concrete identification as an approximately spherical flux sheet carrying dark matter and having a radius equal to the Moon's orbital radius. The density of the dark matter would be universal and about .8 kg/m2. It would be approximately spherical and involve also a flux tube through the magnetic axis so that closed flux lines would result. This is of course just an innocent suggestion.

  6. Even mushrooms communicate via an underground network analogous to neural networks and formed by roots and mycelium in forests (see this), and magnetic flux tube networks could naturally be at the background.

To sum up, communications would be a crucial aspect of being an intelligent living system, and the proposal is that the magnetic body carrying dark matter and photons plays a key role in these communications.



Monday, January 26, 2015

Could sparticles have the same p-adic mass scale as particles and be dark matter in the TGD sense?

I already wrote about how sloppy thinking concerning symmetries can be seen as one of the reasons for the dead end of present-day theoretical physics, which crystallizes in the landscape and multiverse catastrophes. As I explained, the detailed realization of the TGD counterpart of SUSY has been a long-standing problem, although it has been clear that the realization of SUSY is in terms of right-handed neutrinos, and that the breaking mechanism of SUSY, and more generally that of particle massivation, relies on the mixing of massless fermions induced by the properties of the modified gamma matrices.

I attach below most of the earlier text about SUSY in TGD but as a considerably more detailed version. The main suggestion is that particles and sparticles have the same p-adic length scale, maybe even the same masses, but that a stability condition favors large heff for sparticles so that they correspond to dark matter in the TGD sense. One of the implications is that sparticles could play a role in biology.

The LHC results suggest that MSSM does not become visible at LHC energies. This does not exclude more complex scenarios pushing the simplest N=1 SUSY to higher energies, but the number of true believers is decreasing. Something is definitely wrong and one must be ready to consider more complex options or a totally new view about SUSY.

What is the situation in TGD? Here I must admit that I am still fighting to gain an understanding of SUSY in the TGD framework. That I can still imagine several scenarios shows that I have not yet completely understood the problem and am working hard to avoid falling into the sin of sloppying myself. In the following I summarize the situation as it seems just now.

  1. In the TGD framework N=1 SUSY is excluded since B and L are conserved separately and imbedding space spinors are not Majorana spinors. The possible analog of space-time SUSY should be a remnant of a much larger super-conformal symmetry, in which the Clifford algebra generated by fermionic oscillator operators gives rise also to the Clifford algebra generated by the gamma matrices of the "world of classical worlds" (WCW) and is assignable to string world sheets. This algebra is indeed part of the infinite-D super-conformal algebra behind quantum TGD. One can construct explicitly the conserved super-conformal charges accompanying ordinary charges, and one obtains something analogous to an N=∞ super algebra. This SUSY is however badly broken by electroweak interactions.

  2. The localization of induced spinors to string world sheets emerges from the condition that electromagnetic charge is well-defined for the modes of the induced spinor fields. There is however an exception: the covariantly constant right-handed neutrino spinor νR can be de-localized along the entire space-time surface. The right-handed neutrino has no couplings to electroweak fields. It couples however to the left-handed neutrino by induced gamma matrices, except when it is covariantly constant. Note that the standard model does not predict νR, but its existence is necessary if neutrinos develop Dirac mass. νR is indeed something which must be considered carefully in any generalization of the standard model.

Could covariantly constant right handed neutrinos generate SUSY?

Could covariantly constant right-handed neutrino spinors generate exact N=2 SUSY? There are two spin directions for them, suggesting the analog of N=2 Poincare SUSY. Could these spin directions correspond to right-handed neutrino and antineutrino? This SUSY would not look like Poincare SUSY, for which the anticommutator of super generators would be proportional to four-momentum. The problem is that the four-momentum vanishes for covariantly constant spinors! Does this mean that the sparticles generated by covariantly constant νR are zero norm states and represent super gauge degrees of freedom? This might well be the case, although I have considered also alternative scenarios.

What about non-covariantly constant right-handed neutrinos?

Both the imbedding space spinor harmonics and the modified Dirac equation have also right-handed neutrino spinor modes that are not constant in M4. If these are responsible for SUSY, then SUSY is broken.

  1. Consider first the situation at the space-time level. Both the induced gamma matrices and their generalizations to modified gamma matrices, defined as contractions of the imbedding space gamma matrices with the canonical momentum currents of Kähler action, are superpositions of M4 and CP2 parts. This gives rise to the mixing of right-handed and left-handed neutrinos. Note that non-covariantly constant right-handed neutrinos must be localized at string world sheets.

    This in turn leads to neutrino massivation and SUSY breaking. A given particle would be accompanied by sparticles containing a varying number of right-handed neutrinos and antineutrinos localized at partonic 2-surfaces.

  2. One can consider also SUSY breaking at the imbedding space level. The ground states of the representations of the extended conformal algebras are constructed in terms of spinor harmonics of the imbedding space, and for them the addition of a right-handed neutrino with non-vanishing four-momentum would make sense. But the non-vanishing four-momentum means that the members of the super-multiplet cannot have the same masses. This is one manner to state what SUSY breaking is.

What can one say about the masses of sparticles?

The simplest form of massivation would be that all members of the super-multiplet obey the same mass formula but that the p-adic length scales associated with them are different. This could allow very heavy sparticles. What fixes the p-adic mass scales of sparticles? If this scale were the CP2 mass scale, SUSY would be experimentally unreachable. The estimate below does not support this option.

One can even consider the possibility that SUSY breaking makes sparticles unstable against a phase transition to their dark variants with heff = n×h. Sparticles could have the same mass but be non-observable as dark matter not appearing in the same vertices as ordinary matter! Geometrically the addition of a right-handed neutrino to the state would induce a many-sheeted covering in this case, with the right-handed neutrino perhaps associated with a different space-time sheet of the covering.

This idea need not be as outlandish as it looks at first.

  1. The generation of a many-sheeted covering has an interpretation in terms of the breaking of conformal invariance. The sub-algebra for which conformal weights are n-multiples of integers becomes the algebra of conformal gauge transformations, and the remaining conformal generators do not represent gauge degrees of freedom anymore. They could however still represent conserved conformal charges.

  2. This generalization of conformal symmetry breaking gives rise to an infinite number of fractal hierarchies formed by sub-algebras of the conformal algebra; it is also something new and a fruit of an attempt to avoid sloppy thinking. The breaking of conformal symmetry is indeed expected in the massivation related to SUSY breaking.
The following poor man's estimate supports the idea about dark sfermions and the view that sfermions cannot be very heavy.
  1. The neutrino mixing rate should correspond to the mass scale of neutrinos, known to be in the eV range for the ordinary value of Planck constant. For heff/h = n it is reduced by a factor 1/n when the mass is kept constant. Hence sfermions could be stabilized by making them dark.

  2. A very rough order-of-magnitude estimate for the sfermion mass scale is obtained from the Uncertainty Principle: the particle mass should be higher than its decay rate (in natural units). Therefore an estimate for the decay rate of a sfermion gives a lower bound for its mass scale.

  3. Assume that the transformation νR→ νL makes the sfermion unstable against decay to a fermion and an ordinary neutrino. If so, the decay rate would be dictated by the mixing rate and therefore by the neutrino mass scale for the ordinary value of Planck constant. Particles and sparticles would have the same p-adic mass scale. A large heff could however make the sfermion dark, stable, and non-observable.

A rough model for the neutrino mixing in TGD framework

The mixing of right- and left-handed neutrinos would be the basic mechanism in the decays of sfermions. The mixing mechanism is a mystery in the standard model framework, but in TGD it is implied by both the induced and the modified gamma matrices. The following argument tries to capture what is essential in this process.

  1. Conformal invariance requires that the string ends at the wormhole throats, at which fermions are localized, are light-like curves. In fact, light-likeness gives rise to Virasoro conditions.

  2. Mixing is described by a vertex residing at the partonic 2-surface at which two partonic orbits join. The localization of fermions to string boundaries reduces the problem to one completely analogous to that of a point particle coupled to an external gauge field. What is new is that the orbit of the particle has a corner at the partonic 2-surface. The corner breaks conformal invariance since one cannot say that the curve is light-like at it. At the corner the neutrino transforms from a right-handed to a left-handed one.

  3. In complete analogy with the Ψbar γt At Ψ vertex for a point-like particle with spin in an external field, the amplitude describing the νR→ νL transition involves at the vertex matrix elements of the form νbarR Γt(CP2) Z0t νL built from the CP2 part of the modified gamma matrix and the classical Z0 field.

    How is Γt identified? The modified gamma matrices associated with the interior need not be well-defined at the light-like surface and light-like curve. On the basis of the weak form of electric-magnetic duality the modified gamma matrix corresponds to the canonical momentum density associated with the Chern-Simons term of Kähler action. This gamma matrix contains only the CP2 part.

The following provides a more detailed view.
  1. Let us denote by ΓtCP2(in/out) the CP2 part of the modified gamma matrix at the string end at the partonic 2-surface, and by Z0t the value of the Z0 gauge potential along the boundary of the string world sheet. The direction of the string line in imbedding space changes at the partonic 2-surface. The question is what happens to the modified Dirac action at the vertex.

  2. For incoming and outgoing lines the equation

    D(in/out)Ψ(in/out) = pk(in/out)γk Ψ(in/out) ,

    where the modified Dirac operator is D(in/out) = Γt(in/out)Dt, is assumed. νR corresponds to "in" and νL to "out". The equation implies that the lines correspond to massless M4 Dirac propagators, and one obtains something resembling ordinary perturbation theory.

    It also implies that the residue integration over fermionic internal momenta gives as a residue massless fermion lines with non-physical helicities, as one can expect in the twistor approach. For physical particles the four-momenta are massless but in the complex sense, and the imaginary part comes from the classical four-momenta assignable to the lines of the generalized Feynman diagram possessing Euclidian signature of the induced metric, so that the square root of the metric determinant differs by an imaginary unit from that in the Minkowskian regions.

  3. In the vertex D(in/out) could act on Ψ(out/in), and the natural idea is that the νR→ νL mixing is due to this, so that it would be described by the classical weak current couplings νbarR ΓtCP2(out)Z0t(in)νL and νbarR ΓtCP2(in)Z0t(out)νL.

To get some idea about the orders of magnitude, assume that the CP2 projection of the string boundary is a geodesic circle, thus describable as Φ = ωt, where Φ is the angle coordinate of the circle and t the Minkowski time coordinate. The contribution of CP2 to the induced metric component gtt is Δgtt = -R^2ω^2.
  1. In the first approximation the string end is a light-like curve in Minkowski space, meaning that the CP2 contribution to the induced metric vanishes. Neutrino mixing vanishes in this limit.

  2. For a non-vanishing value of ωR mixing occurs, and the order of magnitude for the mixing rate and the neutrino mass is expected to be R ∼ ω and m ∼ ℏω. The p-adic length scale hypothesis and the experimental value of the neutrino mass allow one to estimate that m corresponds to a p-adic mass scale of order eV, so that the corresponding p-adic prime p could be p ≈ 2^167. Note that k=167 defines the largest of the four Gaussian Mersennes MG,k = (1+i)^k - 1 appearing in the length scale range 10 nm - 2.5 μm. Hence the decay rate for the ordinary Planck constant would be of order R ∼ 10^14/s, but a large value of Planck constant could reduce it dramatically. In living matter reductions by a factor 10^-12 can be considered (see the numerical sketch after this list).
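
A crude numerical version of this estimate, and of the poor man's estimate earlier in the posting, is given below. The 0.1 eV mass scale and the value n = 10^12 are my own illustrative assumptions.

```python
# The mixing/decay rate corresponding to a neutrino mass scale m is R ~ m*c^2/hbar;
# scaling the Planck constant to h_eff = n*h reduces the rate by a factor 1/n.

hbar = 1.055e-34    # J*s
eV = 1.602e-19      # J per eV

m_nu = 0.1 * eV                  # assumed neutrino mass scale (as energy)
R_ordinary = m_nu / hbar         # rate for the ordinary Planck constant, 1/s
n = 1e12                         # assumed h_eff/h in living matter
R_dark = R_ordinary / n          # rate for the dark sfermion

print(f"R(ordinary) ~ {R_ordinary:.1e}/s, R(dark) ~ {R_dark:.1e}/s")
```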

See also the pdf article What went wrong with symmetries?.

Sunday, January 25, 2015

Individual nucleons inside nuclei do not behave according to predictions

Individual nucleons do not behave in nuclei as the existing theory predicts (see the popular article). This is the conclusion reached by an international team of scientists which has published its findings as an article in Phys. Rev. Letters.

I am not a nuclear physicist but have proposed what I call the nuclear string model. Despite this I try to understand what has been found and what the nuclear string model can say about the findings.

Background and results

There are many models of atomic nuclei, and each of them explains some aspects of the nucleus. The nucleus can be modelled as a rigid body or as a kind of quantum liquid. In the prevailing mean field approach the presence of the other nucleons is described in terms of a potential function, and one calculates the states of individual nucleons in this potential using the Schrödinger equation. It is essential that the nucleons are assumed to be independent.

The model taking the potential function to be that of a harmonic oscillator is surprisingly successful, but one must introduce corrections such as spin-orbit coupling in order to understand the energy spectrum. In this approach the notion of nuclear shell emerges. In atomic physics and chemistry the closed shells do not participate in the interaction, and the outermost shell, characterized by valence, dictates to a high degree the chemical properties of the atom. Valence is positive if the outermost shell contains particles and negative if some of them are lacking. Something similar is to be expected also now. In this case full shells correspond to magic numbers for protons and neutrons separately (note that protons and neutrons seem to behave rather independently, something highly non-trivial!). Nuclei with valence +1 or -1 would correspond to almost magic nuclei.

One generally accepted correction to the harmonic oscillator model is inspired by the assumption that heavier nuclei can be described as a kind of blob of condensed matter obeying an equation of state allowing one to introduce notions like acoustic waves and surface waves. The nucleon at the unfilled shell would reside at the surface of this blob. The blob has vibrational excitations characterized by multipolarity (spherical harmonics characterized by angular momentum quantum numbers) and by the radial part of the oscillation amplitude. These excitations give rise to analogs of surface waves in water. Valence nucleons interact with the oscillations, and this affects the energy levels of the valence nucleons. The predictions of this model are calculable.

The team has studied almost doubly magic nuclei with valence equal to +1 or -1, calculated the effects on the energy levels of the valence nucleon, and found that the observed effects are significantly smaller than the predicted ones. This finding challenges the mean field approach, or the idea that the nucleus can be idealized as a condensed matter like system, or both.

Nuclear string model

In the TGD framework the ordinary model of the nucleus is replaced with what I call the nuclear string model. The basic ideas of the nuclear string model are the following.

  1. Nuclei consist of string like objects: protons and neutrons connected by color magnetic flux tubes form string like objects, perhaps separately for protons and for neutrons. The color magnetic flux tubes would be meson-like objects and could even carry net color. They are either neutral (the quark and antiquark at the ends of the flux tube have opposite charges) or carry em charge.

    This predicts a large number of exotic states. The exotic states cannot be distinguished chemically from the isotopes of the nucleus. The energy scale of the excitations could be in the keV range: in this case their existence could explain the annual variation of nuclear decay rates, which seems to be caused by X rays from the Sun. A second possibility is that the energy scale is the MeV scale of nuclear energies.

    This would be new nuclear physics and perhaps relevant also to cold fusion. The energy scale would derive from the string tension of the color magnetic flux tube. The lengths of the color magnetic flux tubes corresponding to the keV scale would be rather long and would correspond to the color magnetic bodies of the nucleons. If this is the case, then the color magnetic energy of the system would depend only weakly on the positions of the nucleons of the string inside the nuclear volume.

    The presence of long flux tubes might allow one to understand the anomalous finding that the charge radius of the proton is smaller than predicted. u and d quarks are known to be light and have masses in the range 5-20 MeV. The TGD based model for elementary particles (see this) suggests that quarks correspond to closed flux tubes consisting of two portions at parallel space-time sheets, with ends connected by wormhole contacts and with a monopole magnetic flux rotating in the tube. The Uncertainty Principle suggests that the length of the flux tube structure is of the order of the Compton length of the quark. The constituents of the proton would be larger than the proton itself! The paradox disappears if the Compton length is assigned with the distance between the two wormhole contacts associated with the quark, which are rather near to each other, this distance being much shorter than the flux tube.

    Flux tubes with Compton lengths corresponding to 10 keV photon energy would however be 3 orders of magnitude longer (10 nm). This could be due to scaling by heff/h ≈ 10^3. These flux tubes could also correspond to the flux tubes connecting neighboring nucleons of nuclear strings. The dark magnetic flux tubes of this length associated with neighboring nuclei could reconnect and bind the nuclei to form lattice like structures. This process, and thus dark nuclear physics, could play a key role in the formation of condensed matter phases, as it is proposed to do also in living matter.

  2. These strings of nucleons could topologically condense to larger magnetic flux tubes but could still touch also the nuclear space-time sheet, as suggested by the success of the harmonic oscillator model. In biological length scales the assumption that the effective Planck constant heff = n×h characterizing the dark matter phase equals the gravitational Planck constant hgr = GMm/v0, where v0 is a parameter with dimensions of velocity, implies that cyclotron energies are universal (no dependence on the particle mass m; see the sketch after this list), but also implies that particles with different masses correspond to different values of the effective Planck constant, so that a living system would perform spectroscopy putting particles (elementary particles, atoms, ions, molecules, ...) neatly at different dark space-time sheets! If the nucleons inside nuclei are dark in this sense, protons and neutrons would be at different flux tubes since their masses are slightly different.

  3. The nucleus could consist of several - possibly knotted - closed flux tubes containing some number of nucleons each. An attractive hypothesis is that these flux tubes correspond to nuclear shells, so that full flux tubes would correspond to full shells and define separate units. In the semiclassical approximation this would mean that the nuclear string is localized at the surface of a sphere.

  4. The nuclear string defines a closed non-intersecting curve going through the vertices of a polyhedron with n vertices, known as a Hamilton cycle. If the color magnetic flux tubes are long, it is convenient to consider the curve defined by the line segments connecting the neighboring nucleons of the nuclear string.

    The notion of Hamilton cycle is well-defined for any graph and thus makes sense for any polyhedron. It is enough that the cycle is consistent with the underlying graph structure allowing one to say which vertices are nearest neighbours (they need not be neighbours in the metric sense but only in the sense of homology, that is as ends of the same edge).

    In the case of Platonic solids the rotational symmetries preserving the Platonic solid generate a finite number of Hamilton cycles of the same shape from a given one, and it is natural to define Hamilton cycles as equivalence classes of cycles with the same shape. For instance, for the icosahedron one has 17 Hamilton cycles, and 11 of the cycles have a symmetry group Zn, n ∈ {6,4,2}; the other members of each class are obtained from them by rotations. In this case one can however say that the independent particle approximation is given up and one considers equilibrium configurations analogous to those of molecules. The nuclear string however orders the nucleons and brings in additional information. Hamilton cycles make sense also for deformations of the icosahedron since it is only the homological nearness that matters. Note however that the allowed deformations of a metric Hamilton cycle must be such that the edges do not intersect: in other words, the deformation of the nuclear string is not self-intersecting.

  5. If the nucleons can perform only small oscillations around the vertices of a polyhedron, the independent particle assumption fails badly. One would however have a collective wave function for the orientations of the polyhedron. In this case Platonic solids or their homological generalizations define good candidates for full shells.
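
The mass cancellation behind the universality claim of item 2 is easy to make explicit. In the sketch below (all numerical values are illustrative assumptions of mine) hbar_gr = GMm/v0 grows linearly with the particle mass while the cyclotron frequency falls as 1/m, so the cyclotron energy comes out the same for every particle species.

```python
# Cyclotron energy with hbar_gr = G*M*m/v0: E = hbar_gr * (q*B/m) = G*M*q*B/v0,
# independent of the particle mass m.  Illustrative values only.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M = 5.97e24            # Earth mass as the assumed source, kg
v0 = 1.0e5             # assumed velocity parameter, m/s
B = 2e-5               # assumed magnetic field, tesla
q = 1.602e-19          # elementary charge, C
eV = 1.602e-19

for name, m in [("proton", 1.67e-27), ("Ca-40 ion", 40 * 1.66e-27)]:
    hbar_gr = G * M * m / v0          # gravitational Planck constant
    omega_c = q * B / m               # cyclotron angular frequency
    E = hbar_gr * omega_c             # the mass m cancels
    print(f"{name:10s}: E_cyclotron = {E / eV:.3e} eV")
```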

How does the nuclear string model relate to the shell model?

In the mean field approximation particles move independently in a potential describing the effects of the other nucleons. The basis for N-nucleon wave functions can be constructed as products of those associated with individual nucleons. Under what conditions is the nuclear string model consistent with the independent particle approach?

  1. At the classical level the independent motion of the nucleons of the nuclear string (along elliptic orbits in the harmonic oscillator approximation) would give rise to a rather complex motion of the nuclear string, leading to self-intersections unless the flux tubes have a much longer length scale than the nucleus. In this case the nucleus would be like a seed from which the flux tubes emerge like a plant, self-intersections could be avoided, but the motion of the nucleons could induce local braiding of the strands emanating from the nucleons. This is indeed what has been assumed. Note that the (say) U-shaped flux tubes connecting neighboring nucleons could reconnect with the similar tubes associated with other nuclei so that the motions of nucleons would give rise to genuine global braiding.

  2. Harmonic oscillator states would induce a wave function in the space of string configurations having an interpretation as Hamilton cycles associated with a polyhedron with N vertices whose positions can vary, also in the radial direction, although the semiclassical shell model would force the particles to the same radius. TGD allows one to consider a collective localization at spherical shells: this would be a rather long range correlation but consistent with the spirit of the shell model. A more general approximation would be localization to a union of spherical shells associated with the maxima of the radial wave function.

  3. In the independent particle model the basis wave functions are products. This is not consistent with the assumption that the nucleons arrange to form a string unless the nearest neighbour nucleons of the string can have an arbitrary angular distance along the sphere: this would hold true exactly in the limit of vanishing string tension.

    The longer the angular distance, the higher the color magnetic energy of the string. This energy would give rise to correlations inducing the mixing of harmonic oscillator wave functions. This would be the minimal breaking of the independent particle approximation and would describe a possibly new kind of nuclear force between neighboring nucleons of the nuclear string as a color magnetic force.

    If the color magnetic interaction corresponds to the MeV scale, the length scale of the flux tubes is the electron's Compton length - even in this case considerably longer than the nuclear radius - and the independent particle approximation would not be badly broken. In this case the interpretation in terms of the strong force might make sense. Even for flux tubes with length of the order of the Compton length of u and d quarks, the flux tubes are much longer than the distance between nucleons.

    If the energy scale of the exotic nuclei is 1-10 keV, as the variation of the nuclear decay rates seemingly induced by the variations of the X ray flux from the Sun suggests, the color magnetic energy would be rather small and the independent particle approximation even better than in the previous case. This is expected to be the case if the color magnetic flux tubes correspond to the length scale assignable to the 1-10 keV scale and are thus so long that the positions of the nucleons inside the nucleus do not matter. The 10 keV scale would in fact correspond to a photon wavelength of about 1 Angstrom - the size of an atom - so that a new interaction between nuclear and atomic physics is predicted. Note that the classical and quantal pictures are consistent with each other.

Semiclassical considerations

One can consider the situation also semi-classically.

  1. Nuclear shells correspond in the Bohr model based on the harmonic oscillator potential to spheres with radii fixed by Bohr's quantization rules. Wave functions are indeed concentrated around the classical radius, but for principal quantum number n one obtains n+1 local maxima (see this). The wave function at a given shell would be localized at n+1 surfaces rather than at a single surface, which is definitely a non-classical aspect. The probability density however concentrates mostly at the shell with the largest radius, so that for large values of n the semiclassical approximation becomes better.

    One can of course ask whether this picture contains a deeper seed of truth expressible in terms of space-time topology. This would conform with the TGD based idea that matter resides on geometric shells: this idea is suggested already by the model for the final state of a star predicting that mass is concentrated on a shell. In many-sheeted space-time one expects an onion-like structure made of these shells.

    The TGD based proposal is that in the solar system planets would be accompanied by dark matter shells of this kind with radii predicted approximately by Bohr rules. The TGD based explanation for the Pioneer and Flyby anomalies (see this) predicts the same surface density of dark matter at these shells as deduced for the effective surface density of dark matter in the case of the galactic nucleus. Of course, nucleons inside nuclei cannot correspond to dark matter unless the value of heff/h = n is small. Otherwise the size of the nucleus would be too large.

  2. In the semiclassical approximation the radii of the spheres at which the vertices of the polyhedron are located would correspond to the radii of nuclear shells. An approximation in which one treats the angular degrees of freedom quantally using the independent particle model and the radial degree of freedom collectively looks reasonable; it would allow one to keep the rotational symmetries but would mean giving up the additional symmetries making it possible to solve the harmonic oscillator model exactly. With this assumption nuclear strings would reside at spheres.

Could magic numbers be understood in terms of Platonic solids?

The harmonic oscillator model predicts the numbers of nucleons for magic nuclei as sums of the numbers of nucleons for the full shells involved, but the predictions are not quite correct. One can however modify the model to get the observed magic numbers. Could these numbers be consistent with the idea that a full shell corresponds to a Platonic solid such that the closed nuclear string, which can connect only neighboring vertices, goes through its vertices without intersecting itself?

  1. Curves of this kind are known as Hamilton cycles, and icosahedral and tetrahedral Platonic cycles play a key role in the TGD inspired vision about bio-harmony, predicting that DNA sequences have an interpretation as sequences of 3-chords of what I call bio-harmony realizing the genetic code (see this).

    One can also consider replacing the metric Platonic solid with a combinatorial object in which neighboring vertices are defined to be the ends of the same edge, which can be rather long. This option is consistent with the independent particle model in the angular degrees of freedom.

  2. If the polyhedron is a Platonic solid (cube, octahedron, tetrahedron, icosahedron, dodecahedron), the number of nucleons at a given shell would equal the number of vertices of the Platonic solid. One can of course consider more complex scenarios. One could consider adding nucleons also to the centers of edges and faces, and even superposing different Platonic solids associated with the same sphere. The same Platonic solid could also appear in several scaled variants.

  3. One could consider building the nuclei by adding new spherical layers gradually and assuming that the nucleons are at the vertices (one could consider also putting them at the centers of the faces). The lowest magic numbers are 2, 8, 20, 28, 50, 82, 126, 184 and are reproduced if the shells have n = 2, 6, 12, 8, 22, 32, 44, 58 nucleons (the successive differences; see the snippet after this list). In the standard approach one can say that each oscillator wave function corresponds to two spin directions, so that the proper number to consider would be m = n/2. The values of m would be m = 1, 3, 6, 4, 11, 16, 22, 29. For nuclear strings n is the correct number if nuclear strings are not allowed to intersect themselves, so that a single point of the string cannot contain two nucleons. Also, protons and neutrons can be assumed to belong to different nuclear strings.
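
The shell occupation numbers quoted above are just the successive differences of the magic numbers, as the following trivial check shows.

```python
# Successive differences of the magic numbers give the shell occupations n;
# m = n/2 would be the count if each site carried two spin states.

magic = [2, 8, 20, 28, 50, 82, 126, 184]
n = [magic[0]] + [b - a for a, b in zip(magic, magic[1:])]
m = [k // 2 for k in n]

print("n =", n)   # [2, 6, 12, 8, 22, 32, 44, 58]
print("m =", m)   # [1, 3, 6, 4, 11, 16, 22, 29]
```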

Could one understand the numbers n in terms of Platonic solids?

  1. n=2 would correspond to a line segment with 2 vertices, n=6 to the octahedron, n=12 to the icosahedron, and n=8 to the cube. Note that the tetrahedron, the only self-dual Platonic solid, predicting n=4, and the dodecahedron with n=20 are missing from the list. (A brute-force check of Hamilton cycle counts for the smallest Platonic graphs is sketched after this list.)

  2. After this the situation does not look simple: the dodecahedron would predict n=20 instead of n=22 = 4+6+12. An interpretation in terms of a composite tetrahedron + icosahedron could be considered. The tetrahedron would contain nucleons both at its vertices (4) and at the midpoints of its edges (6), and the icosahedron at its vertices (12). This looks however rather tricky, and the general model does not of course predict Platonic solids.
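
To make the notion of a Hamilton cycle on a Platonic graph concrete, here is a minimal brute-force counter for the smallest Platonic graphs (the vertex labellings are my own; the icosahedron would need a less naive search, and reproducing the 17 equivalence classes mentioned earlier would also require a reduction modulo rotations).

```python
from itertools import permutations

# Count undirected Hamilton cycles on a graph given by its edge list.
def hamilton_cycles(n, edges):
    adj = {i: set() for i in range(n)}
    for a, b in edges:
        adj[a].add(b); adj[b].add(a)
    count = 0
    # fix vertex 0 as the starting point so rotations of a cycle are not recounted
    for perm in permutations(range(1, n)):
        cycle = (0,) + perm
        if all(cycle[(i + 1) % n] in adj[cycle[i]] for i in range(n)):
            count += 1
    return count // 2   # each undirected cycle was found in both directions

tetrahedron = [(0,1),(0,2),(0,3),(1,2),(1,3),(2,3)]
cube = [(0,1),(1,2),(2,3),(3,0),(4,5),(5,6),(6,7),(7,4),(0,4),(1,5),(2,6),(3,7)]
# octahedron: every pair of the 6 vertices is joined except the 3 antipodal pairs
octahedron = [(a,b) for a in range(6) for b in range(a+1,6) if b != 5 - a]

print("tetrahedron:", hamilton_cycles(4, tetrahedron))
print("octahedron :", hamilton_cycles(6, octahedron))
print("cube       :", hamilton_cycles(8, cube))
```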

These findings would suggest that the independent particle model is not a good approximation for light nuclei, for which a model as a molecule-like entity with rather rigid positions of nucleons can be considered if the Platonic solids are taken as metric objects.

The experimental findings from the TGD point of view

On the basis of the experimental findings it is far from clear whether one can model nuclei as objects characterized by continuous nucleon densities and obeying some thermodynamical equation of state from which the dynamics describing the oscillations of the nucleon densities can be deduced.

  1. Suppose that nuclear shells make sense in the TGD framework also as geometric objects, that is as (say) spherical space-time sheets containing the nuclear string, for which the nucleons at the vertices behave independently in the angular degrees of freedom. In this kind of model the approximation as a condensed matter blob is not the first thing that comes to mind. It would be like modelling the solar system by replacing the planets with a planet density and considering oscillations of this density.

  2. If the shell contains only a single particle, the collective wave function for the radius of the sphere associated with the shell coincides with the single particle wave function. In this case one cannot say that the particle is at the surface of the nucleus.

  3. There is no direct interaction with the oscillations of the full shell in the lowest order since the shells correspond to different space-time sheets. The interaction is only in terms of potential functions assignable to the large space-time sheet.

See also the pdf article Individual nucleons inside nuclei do not behave according to predictions.

Saturday, January 24, 2015

What went wrong with symmetries?

Theoretical physics is in a deep crisis. This is not bad at all. Crisis eventually forces one to challenge the existing beliefs. Crisis also gives hope of profound changes. In physical systems criticality means sensitivity, long range fluctuations and long range correlations, and this makes phase transitions possible. In the TGD framework life emerges at criticality!

The crisis of theoretical physics has many aspects. The crisis relates closely to the sociology of science and to the "only game in town" attitude. The prevailing materialistic philosophy of science combined with naive length scale reductionism forms part of the sad story. The seeds of the crisis were sown already at the birth of quantum mechanics. The fathers of quantum theory were well aware that quantum measurement theory is the Achilles heel of the newborn quantum theory, but later the pragmatically thinking theoreticians labelled the questioning of the basic concepts as "philosophy" not meant for a respectable physicist.

The current quantum measurement theory is just a collection of rules, and the observer still remains an outsider. In my view the proper formulation of quantum measurement theory requires making the observer a part of the system. This means that physics must be extended to a theory of consciousness.

This raises several fundamental challenges and questions. How to define "self" as a conscious entity? How to resolve the conflict between two causalities: that of field equations and that of "free will"? What is the relationship between the geometric time of the physicist and experienced time? How is the arrow of time determined, and is it always the same? The evidence that living matter is a macroscopic quantum system is accumulating: is a generalization of quantum theory required to describe such systems? What about dark matter: can we understand it in the framework of existing quantum theory? This list could be continued.

In the following I will not consider this aspect more but restrict the consideration to an important key notion of present-day theoretical physics, namely symmetries. Physical theories nowadays rely on postulates about symmetries, and there are many who say that quantum theory reduces almost totally to group representation theory. There are refined mathematical tools, such as Noether's theorem, making it possible to derive the implications of symmetries in quantum theory. These technical tools are extremely useful, but it seems that methodology has replaced critical thought.

By this I mean that the real nature of the various symmetries has not been considered seriously enough and that this is one of the basic reasons for the recent dead end. In the following I describe what I see as the mistakes due to sloppy thinking (maybe "sloppying" might be a shorthand for it) and discuss briefly the TGD based solution of the problems involved.

This sloppiness manifests itself already in general relativity; in the standard model there is no unification of color and electroweak symmetries and their different character is not understood; the GUT approach is based on a naive extension of the gauge group and makes problematic predictions; supersymmetry in its standard form, predicted to become visible at LHC energies, is now strongly disfavoured experimentally; and superstring models led to the landscape catastrophe - what is left is the AdS/CFT correspondence, which has not led to victories. Could it be that also conformal invariance should be reconsidered seriously? A non-trivial generalization to the 4-D context is highly desirable, so that the 10-D bulk would be replaced by 4-D space-time in the counterpart of AdS/CFT duality.

Energy problem of GRT

Energy and momentum are not well-defined notions in General Relativity. The Poincare symmetry of flat Minkowski space is lost and one cannot apply Noether's theorem, so that the identification of classical conserved charges is lost and one can talk only about the local conservation guaranteed by Einstein's equations realizing Equivalence Principle in a weak form.

In quantum theory this kind of situation is highly unsatisfactory since the Uncertainty Principle means that momentum eigenstates are delocalized. This is sloppy thinking, and the fact that quantization is to a high extent representation theory for symmetry groups might well explain the failure of the attempts to quantize general relativity.

TGD was born as a reaction to the challenge of constructing a Poincare invariant theory of gravitation. The identification of space-times as 4-surfaces of some higher-dimensional space of the form H=M4× S lifts the Poincare symmetries from the space-time level to the level of the imbedding space H.

In this framework GRT space-time is an approximate macroscopic description obtained by replacing the space-time sheets of many-sheeted space-time with a single piece of M4, which is slightly curved. Gravitational fields - deviations of the induced metric from the Minkowski metric - are replaced with their sum over the various sheets. The same applies to gauge potentials. Einstein's equations express the remnants of Poincare symmetry for the GRT space-time obtained in this manner.

In superstring models one actually considers 10-D Minkowski space, so that the lifting of symmetries is possible. Also the compactification (say to M4× C with C a Calabi-Yau space) still has Poincare symmetries. But after that one has 10-D gravitation and the same problems that one wanted to solve by introducing strings! A school example of sloppying!

Is color symmetry really understood?

Many colleagues tend to think that the standard model is a closed chapter of theoretical physics. This is a further example of sloppy thinking.

  1. The standard model gauge group is a product of the color and electroweak groups, which are totally independent. The analogy with Maxwell's theory is obvious: electric and magnetic fields were independent notions, and only after Maxwell and Einstein could they be seen as parts of a single tensor representing the gauge field.

  2. QCD and electroweak interactions differ in a crucial manner. Color symmetry is exact (no Higgs fields in QCD) whereas electroweak symmetry is broken, and QCD is asymptotically free unlike the electroweak interactions. In QCD color confinement takes place at low energies and is still poorly understood.

Again the TGD approach suggests a solution to these problems in terms of the induced gauge field concept and a more refined view about QCD color.
  1. S=CP2 has the color group SU(3) as isometries and the electroweak gauge group as holonomies: hence CP2 unifies these symmetries just like Maxwell's theory unified electric and magnetic fields. Note that the choice H= M4× CP2 is not ad hoc: its factors are the only 4-D spaces allowing twistor spaces with Kähler structure.

  2. One can understand also the different nature of these symmetries. The color group represents exact symmetries so that symmetry breaking should not take place. Holonomies are tangent space symmetries, broken already at the level of the CP2 geometry, and do not therefore give rise to genuine Noether symmetries. One can however assign broken electroweak gauge symmetries to the holonomies.

    The isometry group defines a Kac-Moody algebra in quantum TGD, and the color group acts as a Kac-Moody group rather than a gauge group. The difference is very delicate since only the central extension of the Kac-Moody algebra distinguishes it from a gauge algebra.

  3. Color is not a spin-like quantum number as in QCD; rather, colored states correspond to color partial waves in CP2. Both leptons and quarks allow colored excitations, which are however expected to be very heavy.

Is Higgs mechanism only a parameterization of particle masses?

The discovery of Higgs at LHC was a very important step of progress, but it did not prove the Higgs mechanism as a mechanism of massivation, as sloppy thinkers believe. Fermion masses are not a prediction of the theory: they are put in by hand by assuming that the Higgs couplings to fermions are proportional to the fermion masses. It might well be that the Higgs vacuum expectation value is the unique quantum field theoretic representation of particle massivation, but that the QFT approach cannot predict the masses, and that the understanding of massivation requires transcending QFT by describing particles as extended objects. String models were the first step in this direction, but one step was not enough.

In the TGD framework a more radical generalization is performed. The point-like particle is replaced with a 3-surface, and particle massivation is described in terms of p-adic thermodynamics, which relies on very general assumptions: a non-trivial generalization of 2-D conformal invariance to the 4-D context (to be discussed later), the p-adic length scale hypothesis, and the mapping of the predictions for p-adic mass squared to real mass squared by what I call canonical identification. In this framework the Higgs vacuum expectation value parametrizes the QFT limit already described and is calculable from the generalized Feynman diagrammatics.
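
For the reader unfamiliar with the last notion: canonical identification maps p-adic numbers to reals, schematically (the variants actually used in the mass calculations differ in details)

$$I:\; x=\sum_{n\geq 0} x_n p^{n} \;\mapsto\; \sum_{n\geq 0} x_n p^{-n},\qquad x_n\in\{0,\dots,p-1\},$$

so that a p-adically small mass squared maps to a small real mass squared.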

GUT approach as more sloppy thinking

After the successes of the standard model the naive guess was that a theory of everything could be constructed by a simple trick: extend the gauge group to a larger group containing the standard model gauge group as a subgroup. One can do this, and there is a refined machinery allowing one to deduce particle multiplets, effective actions, beta functions, etc. There exists of course an infinite variety of Lie groups, and an endless variety of GUTs has been proposed.

The view of the Universe provided by GUTs looks rather weird.

  1. Above the weak mass scale there should be a huge desert of 14 orders of magnitude containing no new physics! This is like claiming that the world ends at my backyard.

  2. Only the sum of baryon and lepton numbers would be conserved, and the proton would be unstable. The experimental lower limit for the proton lifetime has however been steadily increasing, and all GUTs derived from superstring models require fine tuning to keep the proton alive.

  3. The standard model gauge group seems to be all that is needed: there are no indications for a larger gauge group. Fermion families seem to be copies of each other with different mass scales. The mass scales of these fermions differ dramatically, and forcing them into multiplets of a single gauge group could also be sloppy thinking. One would expect that the masses differ by simple numerical factors, but they do not.

From the TGD viewpoint the GUT approach is unnecessary.
  1. In TGD quarks and leptons correspond to different chiralities of imbedding space spinors. 8-D chiral invariance implies that quark and lepton numbers are separately conserved, so that the proton does not decay, at least not in the manner predicted by GUTs.
    The CP2 mass scale is of the same order of magnitude as the mass scale assigned to the superheavy additional gauge bosons mediating proton decay.

  2. The family replication phenomenon does not require an extension of the gauge group, since fermion families correspond to different topologies of the partonic 2-surfaces representing fundamental particles (genus-generation correspondence). Note that the orbits of partonic 2-surfaces correspond to light-like 3-surfaces at which the induced metric changes its signature from Euclidian to Minkowskian: these surfaces, or equivalently the 4-surfaces with Euclidian signature, can be regarded as lines of generalized Feynman diagrams. The three lowest genera are special in the sense that they always allow Z2 as a global conformal symmetry, whereas higher genera allow this symmetry only in the case of hyper-elliptic surfaces: this leads to an explanation for the experimental absence of higher genera. Higher genera could more naturally be many-particle states with a continuum mass spectrum, with the handles taking the role of particles.

  3. The p-adic length scale hypothesis, emerging naturally in the TGD framework, allows one to understand the mass ratios of fermions, which look very un-natural if different fermion families are assumed to be related by gauge symmetries.

Supersymmetry in crisis

Supersymmetry is a very beautiful generalization of the ordinary symmetry concept: the Lie algebra is given a grading such that the ordinary Lie algebra generators are accompanied by super-generators, which transform in some representation of the Lie algebra and for which commutators are replaced with anti-commutators. In the case of the Poincare group the super-generators would transform like spinors. Clifford algebras are actually super-algebras: gamma matrices anti-commute to the metric tensor and transform like vectors under the vielbein group (SO(n) in Euclidian signature). In supersymmetric gauge theories one introduces super-translations anti-commuting to ordinary translations.
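
In standard notation (conventions differ in details) the two statements read

$$\{\gamma^{a},\gamma^{b}\} = 2\eta^{ab}\,\mathbf{1},\qquad \{Q_{\alpha},\bar{Q}_{\dot\beta}\} = 2\,\sigma^{\mu}_{\alpha\dot\beta}\,P_{\mu},$$

the first for a Clifford algebra and the second for the N=1 super-Poincare algebra, in which the anticommutator of the super-generators closes on the translation generators.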

Supersymmetry algebras defined in this manner are characterized by the number of super-generators, and in the simplest situation their number is one: one speaks about N=1 SUSY and, in this case, the minimal supersymmetric extension of the standard model (MSSM). These models are the most studied because they are the simplest ones. They have however the strange property that the spinors generating SUSY are Majorana spinors, real in a well-defined sense unlike Dirac spinors. This implies that fermion number is conserved only modulo two: this has not been observed experimentally. A second problem is that the proposed mechanisms for the breaking of SUSY do not look feasible.
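
For reference, the Majorana condition reads in a common convention

$$\psi = \psi^{c} = C\,\bar{\psi}^{T},$$

with C the charge conjugation matrix; it identifies the fermion with its antiparticle, which is why fermion number remains defined only modulo two.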

LHC results suggest that MSSM does not become visible at LHC energies. This does not exclude more complex scenarios pushing the simplest N=1 SUSY to higher energies, but the number of true believers is decreasing. Something is definitely wrong, and one must be ready to consider more complex options or a totally new view about SUSY.

What is the situation in TGD? Here I must admit that I am still fighting to gain understanding of SUSY in the TGD framework. That I can still imagine several scenarios shows that I have not yet completely understood the problem and am working hard to avoid falling into the sin of sloppy thinking myself. In the following I summarize the situation as it seems just now.

  1. In the TGD framework N=1 SUSY is excluded since B and L are conserved separately and imbedding space spinors are not Majorana spinors. The possible analog of space-time SUSY should be a remnant of a much larger super-conformal symmetry, in which the Clifford algebra generated by fermionic oscillator operators gives rise also to the Clifford algebra generated by the gamma matrices of the "world of classical worlds" (WCW) and is assignable to string world sheets. This algebra is indeed part of the infinite-D super-conformal algebra behind quantum TGD. One can construct explicitly the conserved super-conformal charges accompanying the ordinary charges, and one obtains something analogous to an N=∞ super algebra. This SUSY is however badly broken by electroweak interactions.

  2. The localization of the induced spinors to string world sheets emerges from the condition that electromagnetic charge is well-defined for the modes of the induced spinor fields. There is however an exception: the covariantly constant right-handed neutrino spinor νR can be de-localized along the entire space-time surface. The right-handed neutrino has no couplings to electroweak fields. It couples however to the left-handed neutrino by induced gamma matrices, except when it is covariantly constant. Note that the standard model does not predict νR, but its existence is necessary if neutrinos develop Dirac mass. νR is indeed something which must be considered carefully in any generalization of the standard model.

Could covariantly constant right-handed neutrino spinors generate exact N=2 SUSY? There are two spin directions for them, suggesting the analog of N=2 Poincare SUSY. Could these spin directions correspond to right-handed neutrino and antineutrino? This SUSY would not look like Poincare SUSY, for which the anticommutator of the super-generators would be proportional to four-momentum. The problem is that the four-momentum vanishes for covariantly constant spinors! Does this mean that the sparticles generated by covariantly constant νR are zero norm states and represent super gauge degrees of freedom? This might well be the case, although I have considered also alternative scenarios.

Both the imbedding space spinor harmonics and the modified Dirac equation also have right-handed neutrino spinor modes that are not constant in M4. If these are responsible for SUSY, then SUSY is broken.

  1. Consider first the situation at the space-time level. Both the induced gamma matrices and their generalizations to modified gamma matrices, defined as contractions of the imbedding space gamma matrices with the canonical momentum currents for Kähler action, are superpositions of M4 and CP2 parts. This gives rise to the mixing of right-handed and left-handed neutrinos. Note that non-covariantly constant right-handed neutrinos must be localized at string world sheets.

    This in turn leads to neutrino massivation and SUSY breaking. A given particle would be accompanied by sparticles containing a varying number of right-handed neutrinos and antineutrinos localized at partonic 2-surfaces.

  2. One can consider also the SUSY breaking at the imbedding space level. The ground states of the representations of the extended conformal algebras are constructed in terms of spinor harmonics of the imbedding space, and for them the addition of a right-handed neutrino with non-vanishing four-momentum would make sense. But the non-vanishing four-momentum means that the members of the super-multiplet cannot have the same masses. This is one manner to state what SUSY breaking is.

  3. The simplest form of massivation would be that all members of the super-multiplet obey the same mass formula but that the p-adic length scales associated with them are different. This could allow very heavy sparticles. What fixes the p-adic mass scales of sparticles? If this scale is the CP2 mass scale, SUSY would be experimentally unreachable.

  4. One can even consider the possibility that SUSY breaking makes sparticles unstable against a phase transition to their dark variants with heff = n× h. Sparticles could have the same mass but be non-observable as dark matter not appearing in the same vertices as ordinary matter! Geometrically the addition of a right-handed neutrino to the state would in this case induce a many-sheeted covering, with the right-handed neutrino perhaps associated with a different space-time sheet of the covering.

    This idea need not be as outlandish as it looks at first. The generation of a many-sheeted covering has an interpretation in terms of the breaking of conformal invariance. The sub-algebra for which conformal weights are multiples of n becomes the algebra of gauge conformal transformations, and the remaining conformal generators do not represent gauge degrees of freedom anymore. They could however still correspond to conserved conformal charges.

    This generalization of conformal symmetry breaking gives rise to an infinite number of fractal hierarchies formed by sub-algebras of the conformal algebra and is also something new, a fruit of an attempt to avoid sloppy thinking. The breaking of conformal symmetry is indeed expected in the massivation related to the SUSY breaking.

Have we been thinking sloppily also about super-conformal symmetries?

Superstring models were once seen as the only possible candidate for the TOE. By looking at the proceedings of string theory conferences one sees that the age of superstrings is over. The landscape problem and the multiverse do not give much hope of a predictive theory, and the only defence for superstring models is that they are the only game in town. Superstring gurus do not know about competing scenarios, but this is no wonder given the fact that publishing competing scenarios has been impossible since superstrings have indeed been the only game in town! One of the very few almost-predictions of superstring theory was N=1 SUSY at LHC, and it seems that it is already now excluded at LHC energies.

AdS/CFT correspondence is a mathematical outcome inspired by superstring models. One of its variants states that there is a duality between a conformal theory in M4, appearing as the boundary of 5-D AdS, and string theory in the 10-D space AdS5× S5. A more general duality would be between a conformal theory in Mn and the 10-D space AdSn+1× S9-n. For n=2 the CFT would be a conformal theory in 2-D Minkowski space, for which the conformal symmetries (actually their hypercomplex variant) form an infinite-D group. The duality has an interpretation in terms of holography, but the notion of holography is much more general than AdS/CFT.
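
For n=2 the infinite dimensionality is easy to see explicitly: in light-cone coordinates u = t-x, v = t+x of M2 the conformal (hypercomplex) transformations are simply

$$u\to f(u),\qquad v\to g(v),$$

with f and g arbitrary smooth monotonic functions, so that one obtains two commuting Virasoro-type algebras.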

AdS/CFT has been applied to nuclear physics but nothing sensational has been discovered. AdS/CFT has also been tried as an explanation of the finding that what was expected to be QCD plasma behaves very differently. The first findings came from RHIC for heavy ion collisions, and LHC has found that the strange effects appear already in proton-heavy ion collisions. Essentially a deviation from QCD predictions is in question, and in a regime where QCD should be a good description. AdS/CFT has not been a success. AdS/CFT is now applied also to condensed matter physics. At least hitherto no dramatic successes have been reported.

This leads one to ask whether sloppy thinking should be blamed again. AdS/CFT is mathematically rather sound and well-tested, but is the notion of conformal invariance behind it really the one that applies to real world physics?

  1. In the TGD framework the ordinary conformal invariance is generalized so that it becomes a 4-D one: of course, the ordinary finite-dimensional conformal group of M4 is not in question. The basic observation is that light-like 3-surfaces are metrically 2-dimensional and that this leads to a generalization of conformal transformations. One can locally express a light-like 3-surface as X2× R, and what happens is that the conformal transformations of X2 are localized with respect to the light-like coordinate of R. The light-like orbits of partonic 2-surfaces carrying elementary particle quantum numbers would have this extended conformal invariance.

  2. This is not all. In zero energy ontology (ZEO) the diamond-like intersections of future and past directed light-cones, causal diamonds (CDs), are the basic objects. The space-time surfaces having 3-D ends at the boundaries of a CD are the basic dynamical units. The boundaries of a CD are pieces of δ M4+/-× CP2. The boundary δ M4+/- = S2×R+ is a light-like 3-surface and thus allows a huge extension of conformal symmetries, with the complex coordinate of S2 and the light-like radial coordinate playing the roles of the complex coordinate of ordinary conformal symmetry.

    Besides this there is a further analog of conformal symmetry. The symplectic transformations of δ M4+/-× CP2 can be regarded as symplectic transformations of S2× CP2 localized with respect to the light-like coordinate of R+, which defines the analog of the complex coordinate z. Hence in the TGD Universe a gigantic extension of the conformal symmetry of superstring models applies.

  3. Even these extended symmetries extend further to a multi-local Yangian variant (the loci correspond to partonic 2-surfaces at the boundaries of CD). Yangian symmetry is very closely related to quantum groups, studied for decades but again without serious consideration of the question "Why quantum groups?". The hazy belief has been that they somehow emerge at the Planck length scale, which itself is a hazy notion based solely on dimensional analysis and involving the Planck constant and Newton's constant characterizing macroscopic gravitation.

    In the TGD framework hyper-finite factors of type II1 emerge naturally at the level of WCW, since the fermionic Fock space provides a canonical representation for them, and their inclusions provide an elegant description of finite measurement resolution: the included algebra generates states which are not experimentally distinguishable from the original state.

  4. Against this background it is astonishing that AdS/CFT duality has a very simple generalization in the TGD framework and emerges from a generalization of General Coordinate Invariance (GCI) implying holography. The strong form of GCI postulates that either the space-like 3-surfaces at the ends of causal diamonds or the light-like orbits of partonic 2-surfaces can be taken as the 3-surfaces defining the WCW: this is just gauge fixing for general coordinate invariance. If this is true, then partonic 2-surfaces and their 4-D tangent space data at the boundaries of CD must code for physics. One would have a strong form of holography. This might be too much to require: string world sheets carrying induced spinor fields are present, and it might be that they cannot be reduced to data at partonic 2-surfaces.

    In any case, for this duality the 10-D space of AdS/CFT duality would be replaced with the space-time surface, and Mn would be replaced with the light-like parton orbits and/or the space-like 3-surfaces at the ends of CD. Surprisingly, this holography would be very much like holography in its original form!

For references see the chapter TGD and M-theory of "Overview about TGD" or the article What went wrong with symmetries?.

Wednesday, January 21, 2015

TGD explanation of the anomalous decay of Higgs to τ-μ pair and anomalies of B meson decays

Lubos mentions a 2.5 sigma anomaly (that is, something not to be taken too seriously) in the decay of Higgs to a τ-μ pair or its charge conjugate, not allowed by the standard model. Lubos mentions a model explaining the anomaly, and also other anomalies related to semileptonic decays of the neutral B meson, in terms of a double Higgs sector and gauged Lμ-Lτ symmetry. In a more recent posting Lubos mentions another paper explaining the anomaly in terms of a frighteningly complex E6 gauge model inspired by heterotic strings.

TGD suggests however an amazingly simple explanation of the τ-μ anomaly in terms of neutrino mixing.
As a matter of fact, after writing the first hasty summary of the childishly simple idea discussed below (but still managing to make mistakes;-)), I became skeptical: perhaps I have misunderstood what is meant by the anomaly. Perhaps the production of τ-μ pairs is not the anomaly after all. Perhaps the anomaly is the deviation from the prediction based on the model below. It however seems that my hasty interpretation was correct. This brings to my mind a dirty joke about string theorists, told only at late hours when superstring theorists have already gone to bed. How many superstring theorists does it take to change a light bulb? Two. The first one holds the light bulb and the second one rotates the multiverse.

Model for the h→ μ-τc anomaly in terms of neutrino mixing

In my humble opinion both models mentioned by Lubos are highly artificial and bring in a lot of new parameters, since new particles are introduced. Also a direct Yukawa coupling of Higgs to the τ-μ pair is assumed. This would however break universality, since the lepton numbers of the charged lepton generations would not be conserved. This does not look attractive, and one can ask whether allowing neutrinos to transform to each other by mixing, known to occur, could be enough to explain the findings, assuming that there are no primary flavor changing currents and without introducing any new particles or new parameters. In the hadronic sector the mixing of D type quarks indeed explains this kind of decays producing a charged quark pair of, say, type cuc. In the TGD framework, where CKM mixing reduces to topological mixing of the topologies of partonic 2-surfaces, this option is especially attractive.

  1. In the standard model neutrinos are massless and have no direct coupling to Higgs. Neutrinos are however known to have non-vanishing masses, and neutrino mixing analogous to CKM mixing is also known to occur. Neutrino mixing is enough to induce the anomalous decays, and the rate is predicted completely in terms of the neutrino mixing parameters and known standard physics parameters, so that for a professional it should be easy to do the little computer calculations needed to kill the model;-).

  2. In the absence of flavor changing currents only WLiνj vertices can produce the anomaly. The decay h→ μ-τc or its charge conjugate would proceed through several diagrams, but the lowest order diagram comes from the decay of Higgs to a W pair. If the Higgs vacuum expectation value is non-vanishing, as in the standard model, then Higgs could decay to a virtual W+W- pair decaying to a τμ pair by neutrino exchange. Decay to a Z pair does not produce the desired final state, in accordance with the absence of flavor changing neutral currents in the standard model. A triangle diagram would describe the decay (a schematic form of the amplitude is written out after this list). Any lepton pair is possible as a final state. Neutrino mixing would occur in either W vertex. The rates for the decays to different lepton pairs differ due to the different mass values of the leptons, which are however rather small using the Higgs mass as a scale. Therefore decays to all lepton pairs are expected.

  3. At higher order Higgs could decay to a lepton pair, which transforms by neutrino exchange to a W pair, which in turn decays by neutrino exchange to a lepton pair. As a special case one obtains box diagrams in which Higgs decays to a τ pair, which transforms preferentially by ντ exchange to a W+W- pair, which decays by τ neutrino exchange to a μ-τc pair. The analog of the CKM mixing parameter for neutrinos would appear in either of the upper vertices of the box. Note that a Z0 pair as intermediate state does not contribute, since neutral flavor changing currents are absent.
The proposed mechanism should be at work in any generalization of the standard model claiming to explain neutrino masses and their mixing without flavor changing neutral currents. If the observed anomaly differs from this prediction, one can start to search for new physics explanations, but before that brane constructions in the multiverse are perhaps not the best possible strategy.
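
To make the parameter content of the lowest order (triangle) contribution of item 2 explicit, its schematic form is

$$\mathcal{A}(h\to\tau^{-}\mu^{+}) \;\propto\; \sum_{i} U_{\tau i}\,U^{*}_{\mu i}\;F(m_{\nu_i};\,m_W,\,m_h),$$

where U denotes the neutrino (PMNS) mixing matrix entering at the W vertices and F is the loop function for the h→ W+W- triangle with the neutrino mass eigenstate νi exchanged. This is only meant to display which parameters enter (the mixing matrix and known standard model couplings and masses); it is not a computation.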

What about the anomalies related to B meson decays?

The model that Lubos refers to tries to explain also the anomalies related to semileptonic decays of the neutral B meson. Neutrino mixing is certainly not a natural candidate if one wants to explain the 2.5 sigma anomalies reported for the decays of B meson to K meson plus muon pair. Lubos has a nice posting about the surprisingly many anomalies related to the leptonic, pion, and kaon decays of the neutral B meson. Tommaso Dorigo tells about 4-sigma evidence for new physics in rare B meson decays. There is also an anomaly related to the decay of the neutral B meson to a muon pair reported by Jester. In the latter case the decay can proceed via a W or Higgs pair as intermediate state. The coupling h→ bsc resulting through CKM mixing for quarks, by the same mechanism as in the case of leptons, must have been taken into account since it is a standard model process.

TGD predicts M89 hadron physics as a p-adically scaled up variant of ordinary M107 hadron physics, with the hadron mass scale scaled up by a factor 512, which corresponds to LHC energies. Could it be that the loops involve also quarks of M89 hadron physics? A quantitative modelling would require a precise formulation of the phase transition changing the p-adic prime characterizing quarks and gluons.
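
The factor 512 comes from the p-adic mass scale relation, stated here only schematically: mass scales are proportional to 1/sqrt(p), and Mersenne primes Mk = 2^k-1 are very close to 2^k, so that

$$\frac{m(M_{89})}{m(M_{107})} \simeq \sqrt{\frac{M_{107}}{M_{89}}} \simeq 2^{(107-89)/2} = 2^{9} = 512.$$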

One can however ask whether one might understand these anomalies qualitatively in a simple manner in the TGD framework. Since both leptons and quarks are involved, the anomaly must be related to W-quark couplings. If M89 physics is there, there must be radiatively generated couplings representing the decay of W to a pair of an ordinary M107 quark and an M89 quark. A quark of M89 hadron physics appearing as a quark exchange between W+ and W- in the box diagram would affect the rates of B meson decays to kaon and pion. This would affect also the semileptonic decays, since the photon or Z decaying to a lepton pair could be emitted from the M89 quark.

But doesn't Higgs vacuum expectation vanish in TGD?

While polishing this posting I discovered an objection against the TGD approach that I had not noticed earlier. This objection allows one to clarify the TGD based view about particles, so I discuss it here.

  1. In the standard model the decay of Higgs to gauge bosons is described quite well by the lowest order diagrams, and the decay amplitude is proportional to the Higgs vacuum expectation. In TGD p-adic mass calculations describe fermion massivation and the Higgs vacuum expectation vanishes at the fundamental level, but it must make sense at the QFT limit of TGD, which involves the replacement of the many-sheeted space-time with a single slightly curved region of Minkowski space defining GRT space-time. Various gauge fields are sums of the induced gauge fields at the sheets.

  2. Note that the decays of Higgs to W pairs, with a rate predicted in good approximation by the lowest order diagrams involving the Higgs vacuum expectation, have been observed. Hence the Higgs vacuum expectation must appear as a calculable parameter in the TGD approach based on generalized Feynman diagrams. In this approach the vertices of Feynman diagrams are replaced with 3-D vertices describing the splitting of a 3-D surface, in particular that of the partonic 2-surfaces associated with it, carrying elementary particle quantum numbers by strong form of holography. The condition that em charge is well-defined requires that the modes of the induced spinor fields are localized at string world sheets at which the induced W fields vanish. Also the induced Z fields should vanish above the weak scale at string world sheets. Thus the description of the decays reduces at the microscopic level to a string model, with strings moving in space-time and having their boundaries at wormhole contacts, the boundaries having an interpretation as world lines of fundamental point-like fermions.

  3. Elementary particles are constructed as pairs of wormhole contacts with throats carrying effective Kähler magnetic charge. The monopole flux runs along the first space-time sheet, flows to the second space-time sheet along a wormhole contact, and returns back along the second space-time sheet and through the first wormhole contact, so that a closed magnetic flux tube is obtained. Both sheets carry string world sheets, and their ends at the light-like orbits of the wormhole throats are carriers of fermion number.

  4. This description gives non-vanishing amplitudes for the decays of Higgs to gauge boson pairs and fermion pairs. Also the couplings of gauge bosons to fermions can be calculated from this description, so that both the gauge coupling strengths and the Weinberg angle are predicted. The non-vanishing value of the coupling of Higgs to gauge bosons defines the Higgs vacuum expectation, which can be used at the gauge theory limit. The breaking of weak gauge symmetry reflects the fact that the weak gauge group acts as holonomies of CP2 and is not a genuine symmetry of the action. Since weak gauge bosons correspond classically to gauge potentials, the natural conjecture is that the couplings are consistent with gauge symmetry.

  5. Massivation of particles follows from the fact that physical particles are composites of massless fundamental fermions whose light-like momenta are in general non-parallel. It seems however possible to regard particles as massless in the 8-D sense. At the classical level this is realized rather elegantly: Minkowskian and Euclidian regions both give a contribution to the four-momentum, and the contribution from the lines of generalized Feynman diagrams is imaginary due to the Euclidian signature of the induced metric. This gives rise to complex momenta, and the twistor approach suggests that these momenta are light-like in the complex sense, which allows the real mass squared to be non-vanishing (see the short note after this list). Also the massivation of light particles could be described in this manner.

    This description would conform with M8-H duality at the momentum space level: at the imbedding space level one would have color representations, and at the space-time level representations of SO(4) associated with the mass squared = constant sphere in Euclidian 3-space: this would correspond to the SU(2)L×SU(2)R dynamical symmetry group of low energy hadron physics.
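
To spell out the statement about complex light-like momenta made in item 5: writing p = pR + i pI, the complex light-likeness condition gives

$$p\cdot p = 0 \;\Leftrightarrow\; p_R^2 - p_I^2 = 0 \;\;\text{and}\;\; p_R\cdot p_I = 0,$$

so that the real part of the momentum can well have a non-vanishing mass squared m^2 = p_R^2 = p_I^2.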

See the chapter New Particle Physics Predicted by TGD: Part I or the article Some comments about τμ anomaly of Higgs decays and anomalies of B meson decays.

Monday, January 19, 2015

Answers to some frequently occurring questions

Philip Cook formulated in his email some questions, which in my opinion are to the point, and I thought I could put the questions and answers on the blog. I attach the core part of my response in a slightly edited form. There are also some small additions.


I am a proponent of what I call TGD (Topological Geometrodynamics), which could be seen as a generalisation of superstring models obtained by replacing strings with 3-D surfaces representing 3-spaces at the microscopic level: space-time is a surface in the 8-D space H=M4xCP2. TGD could also be seen as a solution to the energy problem of General Relativity: the notion of four-momentum is ill-defined in GRT since the symmetries of the empty Minkowski space of special relativity are lost, so that Noether's Theorem does not apply (NT says that every infinitesimal symmetry corresponds to a conserved quantity).

I am not involved with tetryonic geometry, although I have proposed a model correlating Platonic (icosahedral and tetrahedral) geometries, music (the 12-note scale corresponds to the 12 icosahedral vertices), and the genetic code (the 20 amino acids correspond to the 20 icosahedral faces and the 64 DNA codons to 3-chords of the bio-harmony). The model correctly predicts the vertebrate genetic code and also an alternative code realized in biology involving the 21st and 22nd amino acids.

One obtains rather beautiful sounding artificial chord music as random sequences of chords of the bio-harmonies, assuming that subsequent chords have at least one note in common. An exciting possibility is that this rule could apply to real DNA sequences: this is a testable prediction. There are 256 different bio-harmonies obtained as combinations of 3 basic harmonies of different types, each defined by an icosahedral Hamiltonian cycle whose cyclic symmetry group Zn, n=6,4,2 (hexagon, square, line segment), defines the type of the harmony: the number of cycles is 11+11=22 when one counts also the direction in which one goes through the cycle. A single step along a cycle corresponds to a quint. These harmonies are rather complex since there are 64 basic 3-chords (rock'n roll harmony has three chords, whereas the simplest songs for children make do with two chords;-)).
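
Purely as an illustration of the adjacency rule (consecutive chords must share at least one note), here is a toy sketch in Python; the chord set is an arbitrary placeholder and has nothing to do with the 64 chords of an actual icosahedral bio-harmony.

    import random

    # Toy chord set: an arbitrary placeholder, NOT a real icosahedral bio-harmony.
    CHORDS = [
        ("C", "E", "G"), ("E", "G", "B"), ("G", "B", "D"),
        ("A", "C", "E"), ("D", "F", "A"), ("F", "A", "C"),
    ]

    def random_progression(chords, length, rng=random):
        """Random chord sequence in which consecutive chords share at least one note."""
        sequence = [rng.choice(chords)]
        while len(sequence) < length:
            # Chords sharing at least one note with the previous chord.
            candidates = [c for c in chords
                          if c != sequence[-1] and set(c) & set(sequence[-1])]
            if not candidates:
                break  # dead end: no chord shares a note with the previous one
            sequence.append(rng.choice(candidates))
        return sequence

    print(random_progression(CHORDS, 8))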

Below are brief answers to the questions.

  1. How do you cover the tachyon?

    Tachyon does not belong to the spectrum of real particles in TGD. Tachyonic ground states are possible for elementary particles but physical states are non-tachyonic.

  2. How do you cover the graviton?

    The graviton is one particular particle: all the particles predicted by TGD have a similar topological and geometric structure of a string-like object (a closed magnetic flux tube carrying monopole flux) and are accompanied by a string world sheet, now imbedded inside the 4-D space-time surface.

  3. How do you handle color from QCD?

    Color corresponds to the isometries of the CP2 factor of the 8-D imbedding space H= M4xCP2 in which space-times are 4-surfaces. TGD predicts that also leptons can have colored excitations and that they could be light. Color is in TGD like orbital angular momentum, whereas in QCD it is like spin. Standard model symmetries are geometrized in terms of the isometry and vielbein groups of H. Weak gauge potentials are projections of the components of the CP2 spinor connection, and color gauge potentials are projections of the CP2 Killing vector fields.

    The many-sheeted space-time is approximated by GRT space-time by replacing the sheets with a single slightly curved region of M4; this gives the ordinary gauge potentials as sums of the induced spinor connections of the sheets. Also the metric of GRT space-time is obtained in an analogous manner from the metrics of the space-time sheets.

  4. When you describe the proton and neutron, the fact that the number of quanta in the base tetrahedra can be more - does this imply that there are heavier versions of protons and neutrons?

    My model of proton and neutron does not involve tetrahedra. It involves color magnetic flux tubes carrying most of the rest mass of the baryon, while the current quark masses are very low. I have considered also a dual description based on static quarks giving most of the baryon mass. A static valence quark would correspond to a current quark plus the associated flux tube.

  5. What is your view on Dark Matter?

    Dark matter in the TGD framework corresponds to a hierarchy of phases of ordinary matter in which the Planck constant heff = n× h is an integer multiple of the ordinary Planck constant. These phases are macroscopically quantum coherent for large values of heff and play a central role in biology by making macroscopic quantum coherence possible, since quantum scales are scaled up by n = heff/h. Dark matter would make living matter living.

    Galactic dark matter does not form a large 3-D halo; instead one has necklace-like structures: a thread representing a cosmic string (magnetic flux tube), carrying dark energy as magnetic energy and containing galaxies like pearls in a necklace. The constant velocity spectrum for distant stars follows automatically from the gravitational potential of the cosmic string; the motion along the direction of the cosmic string is free (this is a specific prediction).

    A model for the Pioneer and Flyby anomalies allows one to conclude that dark matter can be concentrated on 2-D surfaces with a universal density of .8 kg/m2. The estimate for the effective surface density in the galactic nucleus is essentially the same. TGD also makes predictions: for instance, in the Earth-Moon system the analog of the Pioneer anomaly should be observed. TGD explains all findings (that I know of) appearing as anomalies in CDM and MOND models.

  6. What is your view on Dark Energy and the Cosmological Constant?

    Dark energy corresponds to the magnetic energy of magnetic flux quanta (tubes and sheets: space-time sheets of a particular kind) carrying dark matter as large heff phases. The cosmological constant provides an effective description at the GRT limit of TGD, which I briefly summarised above.

    At a more microscopic level the negative "pressure" associated with magnetic flux tubes allows one to understand the accelerated expansion. There is an accelerating expansion also during the TGD counterpart of the inflationary phase, identifiable as a transition period from a phase in which one has a gas of string-like objects (2-D M4 projection) to a phase in which one has space-time sheets (4-D M4 projection).

    Periods of accelerating expansion would quite generally correspond to periods of criticality, and critical cosmologies imbeddable in H are unique apart from their duration.

  7. How do you cover the anyon as proposed for hidden variables by people such as Bryan Sanctuary from McGill University (Youtube videos kIysfrByZHE, vXRq1QTrtSg, ABnIvcvn2bc, PcwQWa-rV9Y)?

    I have proposed a model of anyons and the FQHE (fractional quantum Hall effect) based on dark variants of electrons, which thus have a non-standard Planck constant heff = n×h, explaining the fractional charges.

  8. In 'A Hidden Dimension, Clifford Algebra, and Centauro Events - 0703.0053v1.pdf' by Carl Brannen the particle 'Binon' is described. Is this the same as the quanta that you refer to?

    No. TGD predicts a hierarchy of scaled up versions of standard model physics, both weak and QCD type. There are also leptonic variants of QCD type physics, for which there exists evidence from anomalies discovered in heavy ion collisions already in the seventies. These copies are labelled by Mersenne primes Mn = 2^n - 1. Ordinary hadron physics corresponds to M107, and at LHC energies a copy labelled by M89 should be discovered, so this year is particularly exciting at LHC. There is already now evidence for M89 hadron physics, for instance the anomalous behaviour of what was thought to be QCD plasma (already at RHIC and later at LHC). The mass scale of the hadrons of this physics is roughly 512 times that of ordinary hadrons. In Centauro events hadrons of M89 hadron physics would be created and would decay to ordinary hadrons.

  9. What is your attitude to Bohm Pilot Wave theory?

    Pilot wave theory was an attempt to save the determinism of classical physics. This was very natural at the time when pilot wave theory was proposed. It is however also ironic, since Bohm took consciousness seriously and one might have expected that he would accept free will.

    The idea was to reproduce by the pilot wave approach the basic (and classically extremely strange looking) facts of quantum measurement theory. The attempt failed. The approach also has fatal mathematical problems: the non-linear structure involved in the separation of the modulus and phase of the Schroedinger amplitude produces horrible singularities in quantum field theory, and fermionic statistics is an insurmountable problem.