Tuesday, January 31, 2017

The future of physics according to Nima Arkani-Hamed

Nima Arkani-Hamed has a series of popular talks, 7 hours altogether (!), about the future of physics. The talks are excellent and I recommend them to anyone who wants to get a grasp of what is happening at the forefront. Nima believes that after a century of rather conservative applications of the ideas of quantum theory and relativity we are at the verge of a real revolution. Not a revolution in the social sense, with some idiot becoming a new leader taking the world into chaos, but a revolution which conforms with what we already know and provides totally new and unexpected insights. He calls this attitude conservative radicalism. I emphasize this notion, and it is sad to see what radical conservatism is doing to the world just now.

Revolutions must begin as solutions to profound problems. Nima emphasizes two key problems of present-day theoretical physics: he believes that space-time is doomed, and he formulates the second basic problem as the question "Why a macroscopic Universe?".

One can identify the basic problems in very many manners, and I have somewhat different identifications. I would add to the list of really big problems also those of neuroscience, biology, and consciousness. There is also a profound problem in the sociology of science due to the uncritical belief in naive length scale reductionism and materialism. To me the tragic story of superstring models is convincing proof that length scale reductionism is dead and it is time to bring fractality to fundamental physics.

I will compare Nima's problems to some of my personal problems in the following. Maybe the reader is inspired to identify his or her own personal problems;-).

Space-time is Doomed

Nima believes that the notion of space-time is useful but redundant: it is doomed, we must get rid of it.

  1. One can assign to this idea holography, which is a very nice notion: holography assumes that physics can be described in terms of lower-dimensional basic entities, which would be the 3-D surfaces determining space-time.

  2. Emergent space-time is a stronger, now fashionable, view, which unfortunately seems unable to avoid circular arguments: one starts from 2-D surfaces to get 3-space, but the starting point already implicitly assumes 3-space.

  3. Holography would be even stronger in the TGD framework. Adelic physics predicts that 2-D (rather than 3-D) data combined with a number theoretical discretization of the space-time surface in H=M4× CP2 determine the space-time surface and the quantum states in zero energy ontology (ZEO).

Nima would like to replace spacetime with twistors.

  1. The twistor Grassmann approach has turned out to be extremely powerful and simplifies dramatically the calculations in supersymmetric gauge theories, but it has its problems: particles should be massless. There is also the closely related problem of infrared cutoff. String models and TGD suggest the idea that masslessness should be generalized: in TGD all particles would be massless in H=M4× CP2 but massive in M4. This requires a generalization of the twistor approach.

  2. Twistors are a notion very tightly rooted in space-time geometry - especially that of M4 - and I find it very difficult to get rid of space-time this way. Twistor space can be seen as a bundle structure with space-time as base space, so that twistors become rather ad hoc objects if one forgets the space-time. A second problem is that twistors work nicely only for empty Minkowski space. This leads to problems in the twistorialization of gravity.

I too believe that something is indeed redundant, but that it is not space-time. Rather, I would doom the idea of space-time and classical particles as independent entities. I see classical particles as space-time quanta, pieces of space-time identified as 4-D surfaces. Also classical fields decompose into space-time quanta - the notion of field body comes out of this.

The topologically simple Einsteinian space-time would be replaced with a topologically extremely complex object in all scales: many-sheeted space-time as a surface in a certain 8-D imbedding space H. H=M4× CP2 is given and extremely simple and explains standard model physics. Also the dynamics of the space-time surface is extremely simple locally. Globally the situation is totally different. This view changes entirely our interpretation of what we see: we would see the wild topology of the space-time surface as the various objects of the external world with our bare eyes!

An important point: this revolutionary reinterpretation is not possible without lifting the symmetries of Special Relativity from space-time to the imbedding space H=M4× CP2: symmetries move the space-time surface in H rather than points of space-time inside space-time. The GRT view of space-time is far too rigid to allow particle physics. H and the entire TGD were motivated by the energy problem of GRT, which to me is a big problem.

What about twistors in TGD? One cannot replace space-time with twistors. At the classical level one can however replace the space-time surface with its 6-D twistor bundle having space-time as base space. This gives rise to the twistor lift of TGD - possibly only for H=M4× CP2 (!!) - and leads to very powerful and correct predictions allowing one to understand how the Planck length and the cosmological constant emerge from TGD. The point is that M4 and CP2 are the only 4-D spaces allowing a twistor space with Kähler structure. TGD is mathematically completely unique, as it is also physically. The huge Yangian symmetries related to the twistor amplitudes discovered by Nima and others generalize in TGD and give in ZEO excellent hopes for the construction of scattering amplitudes as representations of the super-symplectic Yangian.

Why macroscopic space-time?

In GRT based cosmology it is difficult to understand why a macroscopic space-time rather than only Planck length sized objects should exist. I too see this as a real problem.

To have all possible scales one would need something scale invariant at the fundamental level. In GRT the abstract 4-D space-time cannot provide it. In TGD the imbedding space M4× CP2 does. M4 has infinite size, and one can scale the size of space-time surfaces in M4 up and down. This means in particular that one obtains macroscopic space-time.

A more refined formulation is in terms of zero energy ontology (ZEO).

  1. In ZEO causal diamonds of the form CD× CP2 are the key objects. CD is the intersection of future and past directed light-cones of M4; a Penrose diagram is a good illustration. A CD represents a kind of perceptive field for a conscious entity.

  2. The twistor lift implies that the action determining the space-time surface contains a volume term (cosmological constant) and that all space-time surfaces as preferred extremals of the action are minimal surface extremals of the so-called Kähler action. The action would be infinite for an infinitely sized space-time surface.

  3. A CD however has finite size and the action remains finite: hence ZEO is forced by the twistor lift. CDs form a fractal scale hierarchy. The cosmological constant Λ obeys discrete coupling constant evolution like all coupling strengths (so that radiative corrections vanish). Λ is inversely proportional to the p-adic length scale squared and becomes small in long p-adic scales, so that space-time surfaces inside arbitrarily large CDs having finite action become possible (see the schematic formulas after this list). One can have macroscopic space-time. By the way, particle mass scales define one fundamental problem of standard physics, and the p-adic length scale hypothesis, which I conjecture to follow from adelic physics, would solve this problem.

  4. In cosmology there is also the problem of why the background temperature is so exactly constant although the distant regions at very early times have not been able to communicate with each other in order to reach thermodynamical equilibrium. There is also the problem of dark matter and energy. In the TGD framework these problems are solved by the hierarchy of Planck constants heff/h=n emerging from adelic physics and implying quantum coherence - in particular gravitational coherence - in all scales. The entire Universe would be like a living organism in this picture.
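
In formulas, the scaling claimed in item 3 reads schematically as follows (my consolidation, not taken from the post; ℓ denotes some fundamental length scale, and p ≃ 2^k is the p-adic length scale hypothesis as I understand it):

```latex
\Lambda(p) \;\propto\; \frac{1}{L_p^{2}}, \qquad
L_p \;\propto\; \sqrt{p}\,\ell, \qquad
p \;\simeq\; 2^{k}
```

For large p-adic primes p the cosmological constant becomes small, so arbitrarily large CDs can carry finite action.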

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Sunday, January 29, 2017

Narcissism, ethics, and adeles

The recent situation in the USA and in the World - we cannot isolate ourselves from the rest of the world - forces one to ask questions about ethics and morals. Are Good and Evil only illusions, as the materialist would say, proposing that we stop worrying about Trump and concentrate on business? The materialist could make this more precise by continuing in a slightly irritated tone that there are no free choices - this would be against the laws of physics. A non-materialist like me would however continue to ask whether the choices between Good and Evil are real after all, so that we are responsible for our deeds.

Narcissism and Good and Evil

This comment evolved from a discussion about narcissism as one particular personality disorder and as an explanation for the behavior of Mr. Trump. Psychiatrists tell us that there are people suffering from personality disorders, and this explains why they can behave in such an asocial manner. History is full of examples of this kind of person - consider only Stalin, Hitler, and later dictators. And almost as a rule people react too late to what is happening. A narcissist as the leader of a nation is the worst thing that can happen (even for the narcissist himself) since people in a crisis situation take care only of their own survival and turn their coats to survive. Also the typical portrait of a terrorist might be a person with a personality disorder: the terrorist/psychopath/sadist/narcissist has not received love as a child and decides to take revenge. This seems to be true quite often. This view is humane in that it sees these persons as patients.

But is the narcissist only a victim of a brain disorder or of childhood abuse, or can he actually choose between Good and Evil? At my age one has encountered this question several times. Furthermore, if one builds something which one might call a theory of consciousness, one cannot avoid questions about Life and Death and Good and Evil.

These questions are encountered also in the TGD framework and have gradually transformed into mathematical questions, so that it becomes possible to propose plausible answers. So: do we live in the best possible world or not? And is the world becoming better all the time - or in the long run at least - as both biological and cultural evolution would suggest? Compare only life in the Middle Ages to our life. Something positive has definitely occurred.

What do the roots of Nth order polynomials have to do with ethics?

In the TGD view about the Universe and consciousness, conscious information plays a key role. The problem of the consciousness theorist is that in standard physics there is no direct measure for information: only entropy has a measure, and the second law provides a rather gloomy future perspective. Even the existence of life seems to be impossible and is seen as a gigantic thermodynamical fluctuation - perhaps the most implausible hypothesis proposed in the documented history of mankind.

The good number theoretic news is that in p-adic and adelic physics one can speak about negentropic entanglement (NE) as a correlate for conscious information, and entanglement negentropy measures the amount of NE. Could this bring ethics and morals to the realm of mathematics? A good deed would increase NE and a bad deed reduce it. NE would also be a correlate for love and positive feelings, so that good deeds would be done from love.

The basic principle would be Negentropy Maximization Principle (NMP), stating that the amount of conscious information measured by negentropy increases, so that the Universe evolves, gradually becoming a better place. But there are two options. NMP could be true in an absolute sense so that the negentropy gain in a quantum jump would always be maximal: we would live not only in the best possible world, as Leibniz believed, but in a world becoming even better quantum jump by quantum jump! This looks too good.

NMP could also be true in a statistical sense only. Although there would be drawbacks, the situation would improve in the long run. This option looks more realistic: NMP would be analogous to the second law but consistent with it. This option would allow one to speak about ethics and morals. NMP in this sense would allow us to do also stupid and cruel things.

The statistical view about NMP is probably correct in the TGD Universe and can be formulated number theoretically in terms of adelic physics, which is a fusion of real number based physics describing matter and of physics based on p-adic number fields describing cognition. One can also speak about evolution and actually reduce it to the growth of the complexity of the extension of rationals determining a given level in the hierarchy of adeles defining the evolutionary level of the system.

A simple concretization of a given extension of rationals is in terms of the roots of polynomials of degree N with rational coefficients. There are N roots, and the so-called Galois group leaves the rationals fixed and permutes these roots with each other. Let n denote the order of the Galois group (the number of its elements). Since n is a positive integer, it necessarily increases in a statistical sense quantum jump by quantum jump: this is like a random walk on the positive half-line and leads gradually farther away from the origin. In the long run the extension, and therefore also the classical and quantum Universe, would become more and more complex. This would be evolution. NMP would follow from adelic physics; it need not be postulated separately. This was a rather recent pleasant surprise in the middle of unpleasant surprises from the world of politics.
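
The random walk argument is easy to check numerically. Here is a minimal sketch (Python; the ±1 step rule and the reflecting barrier at n=1 are my illustrative choices, not anything derived from TGD):

```python
import random

def mean_order(n_steps: int, trials: int = 10_000) -> float:
    """Average final n for a random walk on the positive integers."""
    total = 0
    for _ in range(trials):
        n = 1
        for _ in range(n_steps):
            n += random.choice((-1, 1))   # a quantum jump can decrease n...
            n = max(n, 1)                 # ...but a group order stays >= 1
        total += n
    return total / trials

for steps in (10, 100, 1000):
    print(steps, mean_order(steps))  # mean n grows roughly like sqrt(steps)
```

Individual jumps can reduce n, but the mean drifts away from the origin: this is the statistical sense in which complexity, and with it maximal negentropy, increases.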

So: ethics and Nth order polynomials seem to have something in common! The world is full of surprises, and this is especially true for the world of mathematics!

Adelic physics also explains dark matter

Evolution would be reduced to a statistical increase of the maximal possible NE - quantum jump by quantum jump, things are bound to get better. Also a connection with the hierarchy of Planck constants, central in the TGD inspired physics of living matter, emerges. The action of the Galois group on the number theoretic discretization of the space-time surface - consisting of points with preferred coordinates in the extension of rationals defining the adele - defines the space-time surface as an n-sheeted covering. From the beginning it was clear that the effective Planck constant heff/h=n must correspond to the number of sheets of a covering defined in some manner. In adelic physics heff/h=n, labelling the levels of a hierarchy of dark matters as phases of ordinary matter, corresponds extremely naturally to the order of the Galois group of the extension. Dark matter would be the basic prediction of adelic physics! One can also understand the favored p-adic primes in the framework of adelic physics, but this requires more information about p-adic numbers and extensions of rationals.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Saturday, January 28, 2017

Time crystals, macroscopic quantum coherence, and adelic physics

Time crystals were proposed by Frank Wilczek in 2012. The idea is that there is a periodic collective motion, so that one can see the system as an analog of a 3-D crystal with time appearing as the fourth lattice dimension. One can learn more about real-life time crystals (see this).

The first crystal was created by Moore et al (see http://tinyurl.com/js2h6b4) and involved magnetization. By adding a periodic driving force it was possible to generate spin flips inducing a collective spin flip as a kind of domino effect. The surprise was that the period was twice the original period, and small changes of the driving frequency did not affect the period. One had something more than a forced oscillation - a genuine time crystal. The period of the driving force - the Floquet period - was 74-75 μs and the system was measured for N=100 Floquet periods, or about 7.4-7.5 milliseconds (1 ms happens to be of the same order of magnitude as the duration of a nerve pulse). I failed to find a comment about the size of the system. With quantum biological intuition I would guess something like the size of a large neuron: about 100 micrometers.

The second law does not favor time crystals. The time in which single particle motions are thermalized is expected to be rather short. In the case of condensed matter systems the time scale would not be much longer than that of a typical atomic transition. The rate for the 2P → 1S transition of the hydrogen atom gives the general idea. The decay rate is proportional to ω^3 d^2, where ω = ΔE/hbar is the frequency corresponding to the energy difference between the states and d is the dipole moment, of order e a0 with a0 the Bohr radius, so that the rate is proportional to the fine structure constant α ∼ 1/137. The average lifetime as the inverse of the decay rate is 1.6 ns and is expected to give a general order of magnitude estimate.
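
As a sanity check of the 1.6 ns figure, here is a minimal numerical sketch of the standard dipole formula (Python; the matrix element ⟨1s|z|2p⟩ = (128√2/243) a0 is the textbook value, assumed here rather than taken from the post):

```python
# Gamma = omega^3 d^2 / (3 pi eps0 hbar c^3) for the hydrogen 2P -> 1S decay
from math import pi
from scipy.constants import e, hbar, c, epsilon_0, physical_constants

a0 = physical_constants['Bohr radius'][0]                     # m
Ry = physical_constants['Rydberg constant times hc in J'][0]  # J

omega = 0.75 * Ry / hbar              # Lyman-alpha: E = (1 - 1/4) Ry
d = e * (128 * 2**0.5 / 243) * a0     # dipole matrix element, about 0.74 e a0
gamma = omega**3 * d**2 / (3 * pi * epsilon_0 * hbar * c**3)
print(f"2P lifetime = {1e9 / gamma:.2f} ns")   # ~1.60 ns
```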

The proposal is that the systems in question emerge in non-equilibrium thermodynamics, which indeed predicts a master-slave hierarchy of time and length scales, with masters providing the slowly changing background in which the slaves are forced to move. I am not enough of a specialist to express any strong opinions about the thermodynamical explanation.

What does TGD say about the situation?

  1. So-called Anderson localization is believed to accompany time crystals. In the TGD framework this translates to the fusion of the 3-surfaces corresponding to particles into a single large 3-surface consisting of particle 3-surfaces glued together by magnetic flux tubes. One can say that a relative localization of particles occurs and they more or less lose their relative translational degrees of freedom. This effect occurs always when bound states are formed and would happen already for the hydrogen atom.

    The TGD vision would actually solve a fundamental problem of QED caused by the assumption that proton and electron behave as independent point-like particles: QED predicts a lot of non-existing quantum states since the Bethe-Salpeter equation assumes degrees of freedom which do not actually exist. Single particle descriptions (Schrödinger equation and Dirac equation), treating proton and electron geometrically as effectively a single particle with reduced mass (rather than as independent particles), give an excellent description, whereas QED, which was thought to be something more precise, fails. Quite generally, bound states are not properly understood in QFTs. The color confinement problem is a second example of this: usually it is believed that the failure is solely due to the fact that the color interaction is strong, but the real reason might be much deeper.

  2. In the TGD Universe time crystals would be many-particle systems consisting of a collection of 3-surfaces connected by magnetic flux tubes (a tensor network in terms of condensed matter complexity theory). The magnetic flux tubes would carry dark matter in the TGD sense, having heff/h=n increasing the quantal scales - both spatial and temporal - so that one could have time crystals in long scales.

    Biology could provide basic examples. For instance, EEG resonance frequencies could be associated with time crystals assignable to the magnetic body of the brain carrying dark matter with large heff/h=n - so large that the dark photon energy E = heff × f would correspond to an energy above thermal energy (a numerical illustration follows this list). If bio-photons result from phase transitions heff/h=n → 1, the energy would be in the visible-UV range. These frequencies would in turn drive the visible matter in the brain and force it to oscillate coherently.


  3. The time crystals claimed by the groups of Monroe and Lukin to have been created in the laboratory demand a feed of energy (see this), unlike the time crystals proposed by Wilczek. The finding is consistent with the TGD based model. In TGD the generation of a large heff phase demands energy. The reason is that the energies of states increase with heff. For instance, atomic binding energies decrease as 1/heff^2. In quantum biology this requires a feed of metabolic energy. Also now the interpretation would be analogous.

  4. The standard physics view relies on non-equilibrium thermodynamics, whereas the TGD view about time crystals relies on dark matter and the hierarchy of Planck constants, in turn implied by adelic physics, suggested to provide a coherent description fusing real physics as physics of matter and various p-adic physics as physics of cognition.

    Number theoretical universality (NTU) leads to the notion of adelic space-time surface (monadic manifold) involving a discretization in an extension of rationals defining a particular level in the hierarchy of adeles defining an evolutionary hierarchy. heff/h=n has been identified from the beginning as the dimension of a poly-sheeted covering assignable to the space-time surface. The action of the Galois group of the extension indeed gives rise to a covering space. The number n of sheets would be the order of the Galois group, implying heff/h=n, which is bound to increase during evolution so that the complexity increases.

    Indeed, since n is a positive integer, evolution is analogous to diffusion on a half-line and n unavoidably increases in the long run, just as a particle diffuses farther away from the origin (by looking at what gradually happens near the paper basket one understands what this means). The increase of n implies the increase of the maximal negentropy and thus of negentropy. Negentropy Maximization Principle (NMP) follows from adelic physics alone and there is no need to postulate it separately. Things get better in the long run although we do not live in the best possible world, as Leibniz, who first proposed the notion of monad, believed!
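
To put a number on the dark photon claim in item 2: how large must n = heff/h be for a dark photon at an EEG frequency to carry a visible-light energy? A back-of-the-envelope sketch (Python; the 10 Hz alpha frequency and the 2 eV photon energy are my illustrative choices):

```python
from scipy.constants import h, e

f = 10.0             # Hz, an EEG alpha-band frequency
E_visible = 2.0 * e  # J, roughly the red end of the visible spectrum
n = E_visible / (h * f)
print(f"required n = h_eff/h ~ {n:.1e}")  # ~5e13
```

So n of order 10^13-10^14 would be needed; the resulting 2 eV is indeed far above the thermal energy of about 0.03 eV at body temperature.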


Adelic physics also allows a strong grasp of metabolism and bio-catalysis - the key elements of biology.
  1. Why would metabolic energy be needed? The intuitive answer is that evolution requires it and that evolution corresponds to the increase of n=heff/h. To see the answer to the question, notice that the energy scale for the bound states of an atom is proportional to 1/h^2, and for a dark atom to 1/heff^2 ∝ 1/n^2 (do not confuse this n with the integer n labelling the states of the hydrogen atom!).

    Dark atoms have smaller binding energies, and their creation by a phase transition increasing the value of n demands a feed of energy - metabolic energy! If the metabolic energy feed stops, n is gradually reduced. What is remarkable is that the scale of atomic binding energies decreases with n only in dimension D=3. In other dimensions it increases, and in D=4 one cannot even speak of bound states! Life based on metabolism seems to make sense only in spatial dimension D=3 (see the sketch after this list). Note however that there are also other quantum states than atomic states, with a different dependence of energy on heff.

  2. One can also understand bio-catalysis. In the simplest situation three molecules - the catalyst and the two reactants - meet in the reaction. Already this meeting demands a heff reducing phase transition scaling down the length of some flux tubes connecting the molecules together, so that they are drawn together.

    At least in the catalyst molecule some atom(s) would be in a state with some n>1, and in the reaction n would be reduced, liberating binding energy. This energy would help the reactants to overcome the potential wall making the reaction slow, so that the reaction would proceed swiftly. After this they would return the binding energy to the catalyst molecule. The catalyst would serve as a matchmaker helping the shy potential lovers to overcome the barrier. Note again that this is possible only in D=3!
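
A minimal numerical sketch of the scaling in item 1, using the hydrogen ground state as the example (Python; the example atom and the values of n are my choices):

```python
E_B1 = 13.6  # eV, ordinary (n = h_eff/h = 1) hydrogen ground-state binding energy

def dark_binding_energy(n: int) -> float:
    """Binding energy of a dark atom with h_eff/h = n, scaling as 1/n^2."""
    return E_B1 / n**2

for n in (2, 3, 6):
    cost = E_B1 - dark_binding_energy(n)  # energy feed needed to make the atom dark
    print(f"n={n}: binding {dark_binding_energy(n):5.2f} eV, cost {cost:5.2f} eV")
```

Read backwards, the same numbers give the binding energy liberated in the n-reducing transition that the catalysis mechanism of item 2 invokes.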

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Wednesday, January 25, 2017

NMP and self

The preparation of an article about the number theoretic aspects of TGD forced me to go through various related ideas and led to a considerable integration of the ideas. In this note, ideas related directly to consciousness and cognition are discussed.

  1. The adelic approach strongly suggests the reduction of NMP to number theoretic physics, somewhat like the second law reduces to probability theory. The dimension of the extension of rationals, characterizing the hierarchy level of physics and defining an observable measured in state function reductions, is positive and can only increase in a statistical sense. Therefore the maximal value of entanglement negentropy increases as new entangling number theoretic degrees of freedom emerge. heff/h=n, identifiable as the order of the Galois group of the extension (or a factor of it), characterizes the number of these degrees of freedom for a given space-time surface as the number of its sheets.

  2. State function reduction has hitherto been assumed to correspond always to a measurement of the density matrix, which can be seen as a reaction of a subsystem to its environment. This makes perfect sense at the space-time level. Higher level measurements occur however at the level of WCW and correspond to a localization to some sector of WCW determining for instance the quantization axes of various quantum numbers. Even the measurement of heff/h=n would measure the order of the Galois group and force a localization to an extension with a Galois group of this order. These measurements cannot correspond to measurements of a density matrix, since different WCW sectors cannot entangle by WCW locality. This finding will be discussed in the following.

Evolution of NMP

The view about Negentropy Maximization Principle (NMP) has co-evolved with the notion of self and I have considered many variants of NMP.

  1. The original formulation of NMP was in positive energy ontology and made the same predictions as standard quantum measurement theory. The new element was that the density matrix of a subsystem defines the fundamental observable and the system goes to its eigenstate in state function reduction. As found, the localizations to WCW sectors define what might be called self-measurements, identifiable as active volitions rather than reactions.

  2. In p-adic physics one can assign to rational and even algebraic entanglement probabilities a number theoretical entanglement negentropy (NEN) satisfying the same basic axioms as the ordinary Shannon entropy but having negative values and therefore having an interpretation as information. The definition of the (real valued) p-adic negentropy reads S_p = -∑ P_k log(|P_k|_p), where |.|_p denotes the p-adic norm. The news is that N_p = -S_p can be positive, and is positive for rational entanglement probabilities. The real entanglement entropy S is always non-negative. (A computational illustration follows this list.)

    NMP would force the generation of negentropic entanglement (NE) and stabilize it. The NE resources of the Universe - one might call them Akashic records - would steadily increase.

  3. A decisive step of progress was the realization that NTU forces all states in adelic physics to have entanglement coefficients in some extension of rationals inducing a finite-D extension of p-adic numbers. The same entanglement can be characterized by the real entropy S and the p-adic negentropies N_p, which can be positive. One can define also the total p-adic negentropy N = ∑_p N_p over all p, and the total negentropy N_tot = N - S.

    For rational entanglement probabilities it is easy to demonstrate that a generalization of the adelic theorem holds true: N_tot = N - S = 0. NMP based on N_tot rather than N would not say anything about rational entanglement. For extensions of rationals it is easy to find that N - S > 0 is possible if the entanglement probabilities are of the form X_i/n with |X_i|_p = 1 and n an integer. Should one identify the total negentropy as the difference N_tot = N - S, or as N_tot = N?

    Irrespective of the answer, large p-adic negentropy seems to force large real entropy: this nicely correlates with the paradoxical finding that living systems tend to be entropic although one would expect just the opposite. This relates in a very interesting manner to the work of Jeremy England. The negentropy would be cognitive negentropy and not visible to ordinary physics.

  4. The latest step in the evolution of ideas about NMP was the question whether NMP follows from number theory alone, just as the second law follows from probability theory! This irritates the theoretician's ego but is a victory for the theory. The dimension n of the extension is a positive integer and cannot but grow in a statistical sense in evolution! One expects that the maximal value of negentropy (defined as N - S) must increase with n, so negentropy must increase in the long run.
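
The definitions in items 2 and 3 are concrete enough to compute with. A small sketch (Python; the example probabilities are my own choice) checking that N - S = 0 for rational entanglement probabilities:

```python
from fractions import Fraction
from math import log

def v_p(q: Fraction, p: int) -> int:
    """p-adic valuation: power of p in q, negative if p divides the denominator."""
    v, num, den = 0, q.numerator, q.denominator
    while num % p == 0:
        num //= p; v += 1
    while den % p == 0:
        den //= p; v -= 1
    return v

def N_p(probs, p):
    # |P_k|_p = p^(-v_p(P_k)), so N_p = -S_p = -sum_k P_k v_p(P_k) log p
    return -sum(P * v_p(P, p) * log(p) for P in probs)

probs = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]
S = -sum(P * log(P) for P in probs)     # ordinary Shannon entropy
N = sum(N_p(probs, p) for p in (2, 3))  # only primes in the denominators contribute
print(N, S, N - S)                      # N - S = 0, as stated above
```

Each p-adic piece N_p is positive, yet the pieces sum exactly to the Shannon entropy, so N_tot = N - S vanishes for rational probabilities, in accordance with item 3.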

Number theoretic entanglement can be stable

Number theoretical Shannon entropy can serve as a measure for genuine information assignable to a pair of entangled systems. Entanglement with coefficients in the extension is always negentropic if the entanglement negentropy comes from the p-adic sectors only. It can be negentropic if negentropy is defined as the difference of the p-adic negentropy and the real entropy.

The diagonalized density matrix need not belong to the algebraic extension, since the probabilities defining its diagonal elements are eigenvalues of the density matrix - roots of an Nth order polynomial, which in the generic case requires an N-dimensional algebraic extension of rationals. One can argue that since diagonalization is not possible, also the state function reduction selecting one of the eigenstates is impossible unless a phase transition increasing the dimension of the algebraic extension occurs simultaneously. This kind of NE could give rise to cognitive entanglement.
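
A toy example of this mechanism (Python/sympy; the matrix is my own choice): a density matrix with rational entries whose eigenvalues nevertheless live in the quadratic extension Q(√13), so that diagonalization - and hence the reduction - would require extending the number field first.

```python
from sympy import Matrix, Rational

rho = Matrix([[Rational(2, 3), Rational(1, 4)],
              [Rational(1, 4), Rational(1, 3)]])
print(rho.trace())      # 1, as a density matrix requires
print(rho.eigenvals())  # {1/2 - sqrt(13)/12: 1, 1/2 + sqrt(13)/12: 1}
```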

There is also a special kind of NE, which can result if one requires that the density matrix serves as a universal observable in state function reduction. The outcome of the reduction must be an eigenspace of the density matrix, which is a projector to this subspace acting as the identity matrix inside it. This kind of NE allows all unitarily related bases as eigenstate bases (the unitary transformations must belong to the algebraic extension). This kind of NE could serve as a correlate for "enlightened" states of consciousness. Schrödinger's cat would be in this kind of state, stably in a superposition of dead and alive, and a state basis obtained by unitary rotation from this basis would be equally good. One can say that there are no discriminations in this state, and this is what is claimed about "enlightened" states too.

The vision about number theoretical evolution suggests that NMP forces the generation of NE resources as NE assignable to the "passive" boundary of the CD, for which no changes occur during the sequence of state function reductions defining self. It would define the unchanging self as negentropy resources, which could be regarded as a kind of Akashic records. During the next "re-incarnation", after the first reduction to the opposite boundary of the CD, the NE associated with the reduced state would serve as the new Akashic records for the time reversed self. If NMP reduces to the statistical increase of heff/h=n, the conscious information content of the Universe increases in a statistical sense. In the best possible world of SNMP it would increase steadily.

Does NMP reduce to number theory?

The heretic question that emerged quite recently is whether NMP is actually needed at all! Is NMP a separate principle, or could NMP be reduced to mere number theory? Consider first the possibility that NMP is not needed at all as a separate principle.

  1. The value of heff/h=n should increase in evolution by phase transitions increasing the dimension of the extension of rationals. heff/h=n has been identified as the number of sheets of some kind of covering space. The Galois group of the extension acts on the number theoretic discretizations of the monadic surface, and the orbit defines a covering space. Suppose n is the number of sheets of this covering and thus the order of the Galois group of the extension of rationals, or a factor of it.

  2. It has been already noticed that the "big" state function reductions giving rise to death and reincarnation of self could correspond to a measurement of n=heff/h implied by the measurement of the extension of the rationals defining the adeles. The statistical increase of n follows automatically and implies a statistical increase of the maximal entanglement negentropy. Entanglement negentropy increases in a statistical sense.

    The resulting world would not be the best possible one, unlike for a strong form of NMP demanding that negentropy increases in "big" state function reductions. n can also decrease temporarily, and such decreases seem to be needed. In the TGD inspired model of bio-catalysis the phase transition reducing the value of n for the magnetic flux tubes connecting reacting bio-molecules allows them to find each other in the molecular soup. This would be crucial for understanding processes like DNA replication and transcription.

  3. A state function reduction corresponding to a measurement of the density matrix could occur to an eigenstate/eigenspace of the density matrix only if the corresponding eigenvalue and eigenstate/eigenspace are expressible using numbers in the extension of rationals defining the adele considered. In the generic case these numbers belong to an N-dimensional extension of the original extension. This can make the entanglement stable with respect to the measurements of the density matrix.

    A phase transition to an extension of the extension containing these coefficients would be required to make the reduction possible. A step in number theoretic evolution would occur. Also entanglement of the measured state pairs with those of a measuring system containing the extension of the extension would make the reduction possible. Negentropy could be reduced, but the higher-D extension would provide potential for more negentropic entanglement, and NMP would hold true in a statistical sense.

  4. If one has a higher-D eigenspace of the density matrix, the p-adic negentropy is largest for the entire subspace, while the sum of the real and p-adic negentropies vanishes for all of the subspaces. For negentropy identified as the total p-adic negentropy, SNMP would select the entire subspace and NMP would indeed say something explicit about negentropy.

Or is NMP needed as a separate principle?

Hitherto I have postulated NMP as a separate principle. The strong form of NMP (SNMP) states that negentropy does not decrease in "big" state function reductions corresponding to the death and re-incarnation of self.

One can however argue that SNMP is not realistic. SNMP would force the Universe to be the best possible one, and this does not seem to be the case. Also ethically responsible free will would be very restricted, since the self would be forced always to do the best deed, that is, to maximally increase the negentropy serving as the information resources of the Universe. Giving up a separate NMP altogether would allow one to have also "Good" and "Evil".

This forces one to consider what I have christened the weak form of NMP (WNMP). Instead of the maximal dimension corresponding to an N-dimensional projector, the self can choose also lower-dimensional sub-spaces, and a 1-D sub-space corresponds to the vanishing entanglement and negentropy assumed in standard quantum measurement theory. As a matter of fact, this can also lead to a larger negentropy gain, since negentropy depends strongly on the largest power of p dividing the dimension of the resulting eigen sub-space of the density matrix. This could apply also to the purely number theoretical reduction of NMP.

WNMP suggests how to understand the notions of Good and Evil. The various choices in the state function reduction would correspond to a Boolean algebra, which suggests an interpretation in terms of what might be called emotional intelligence. It also turns out that one can understand how the p-adic length scale hypothesis - actually its generalization - emerges from WNMP.

  1. One can start from ordinary quantum entanglement. It corresponds to a superposition of pairs of states. One member of the pair corresponds to the internal state of the self and the other to a state of the external world or the biological body of the self. In negentropic quantum entanglement each member is replaced with a pair of sub-spaces of the state spaces of self and external world. The dimension of the sub-space depends on which pair is in question. In the state function reduction one of these pairs is selected and a deed is done. How to make some of these deeds good and some bad? Recall that WNMP allows only the possibility to generate NE but does not force it. WNMP would be like God allowing the possibility to do good but not forcing good deeds.

    The self can choose any sub-space of the subspace defined by a k ≤ N-dimensional projector, and a 1-D subspace corresponds to the standard quantum measurement. For k=1 the state function reduction leads to vanishing negentropy and to a separation of the self and the target of the action. Negentropy does not increase in this action and the self is isolated from the target: a kind of price for sin.

    For the maximal dimension of this sub-space the negentropy gain is maximal. This deed would be good, and by the proposed criterion NE corresponds to a conscious experience with positive emotional coloring. Interestingly, there are 2^k - 1 possible choices, which is almost the dimension of the Boolean algebra consisting of k independent bits. The excluded option corresponds to the 0-dimensional sub-space - the empty set in the set theoretic realization of Boolean algebra (see the sketch after this list). This could relate directly to the fermionic oscillator operators defining the basis of Boolean algebra - here the Fock vacuum would be the excluded state. The deed in this sense would be a choice of how loving the attention towards a system of the external world is.

  2. A map of different choices of k-dimensional sub-spaces to k-fermion states is suggestive. The realization of logic in terms of emotions of different degrees of positivity would be mapped to many-fermion states - perhaps zero energy states with vanishing total fermion number. State function reductions to k-dimensional spaces would be mapped to k-fermion states: quantum jumps to quantum states!

    The problem brings to mind quantum classical correspondence in quantum measurement theory. The direction of the pointer of the measurement apparatus (in a very metaphorical sense) corresponds to the outcome of the state function reduction, which is now a 1-D subspace. For an ordinary measurement the pointer has k positions. Now it must have 2^k - 1 positions. To the discrete space of k pointer positions one must assign the fermionic Clifford algebra of second quantized fermionic oscillator operators. The hierarchy of Planck constants and dark matter suggest the realization: replace the pointer with its k-sheeted space-time covering and consider zero energy states made of pairs of k-fermion states at the sheets of the covering. Dark matter would therefore be necessary for cognition. The role of the fermions would be to "mark" the k space-time sheets of the covering.
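
The counting in item 1 is elementary but worth making explicit. A small sketch (Python; the encoding of subsets as occupation-number strings is my illustration of the suggested fermionic map):

```python
from itertools import combinations

k = 3
eigenstates = range(k)
choices = [s for r in range(1, k + 1) for s in combinations(eigenstates, r)]
print(len(choices), 2**k - 1)  # 7 7: all non-empty subsets

for subset in choices:
    occupation = ''.join('1' if i in subset else '0' for i in eigenstates)
    print(subset, '->', occupation)  # the Fock vacuum '000' never appears
```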

The cautious conclusion is that NMP as a separate principle is not necessary and follows in a statistical sense from the unavoidable increase of n=heff/h, identified as the dimension of the extension of rationals defining the adeles, if this extension or at least the dimension of its Galois group is observable.

For details see the chapter Negentropy Maximization Principle or the article Re-examination of the basic notions of TGD inspired theory of consciousness.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

WCW and the notion of intentional free will

The preparation of an article about the number theoretic aspects of TGD forced me to go through various related ideas and led to a considerable integration of the ideas. In this note, ideas related directly to consciousness and cognition are discussed.

  1. The adelic approach strongly suggests the reduction of NMP to number theoretic physics, somewhat like the second law reduces to probability theory. The dimension of the extension of rationals, characterizing the hierarchy level of physics and defining an observable measured in state function reductions, is positive and can only increase in a statistical sense. Therefore the maximal value of entanglement negentropy increases as new entangling number theoretic degrees of freedom emerge. heff/h=n, identifiable as the order of the Galois group of the extension (or a factor of it), characterizes the number of these degrees of freedom for a given space-time surface as the number of its sheets.

  2. State function reduction has hitherto been assumed to correspond always to a measurement of the density matrix, which can be seen as a reaction of a subsystem to its environment. This makes perfect sense at the space-time level. Higher level measurements occur however at the level of WCW and correspond to a localization to some sector of WCW determining for instance the quantization axes of various quantum numbers. Even the measurement of heff/h=n would measure the order of the Galois group and force a localization to an extension with a Galois group of this order. These measurements cannot correspond to measurements of a density matrix, since different WCW sectors cannot entangle by WCW locality. This finding will be discussed in the following.

The notion of self can be seen as a generalization of the poorly defined notion of observer in quantum physics. In the following I take the role of the skeptic trying to be as critical as possible.

The original definition of self was as a subsystem able to remain unentangled under the state function reductions associated with subsequent quantum jumps. The density matrix was assumed to define the universal observable. Note that a density matrix which is a power series of a product of matrices representing commuting observables has in the generic case eigenstates which are simultaneous eigenstates of all the observables. A second aspect of self was assumed to be the integration of subsequent quantum jumps into a coherent whole giving rise to the experienced flow of time.

The precise identification of self allowing one to understand both of these aspects turned out to be a difficult problem. I became aware of the solution of the problem in terms of zero energy ontology (ZEO) only rather recently (2014).

  1. Self corresponds to a sequence of quantum jumps integrating to a single unit as in the original proposal, but these quantum jumps correspond to state function reductions to a fixed boundary of a causal diamond (CD) leaving the corresponding parts of the zero energy states invariant - "small" state function reductions. The parts of the zero energy states at the second boundary of the CD change, and even the position of the tip of the opposite boundary changes: one actually has a wave function over the positions of the second boundary (CD sizes, roughly) and this wave function changes. In positive energy ontology these repeated state function reductions would have no effect on the state (Zeno effect), but in the TGD framework a change occurs at the second boundary and gives rise to the experienced flow of time and its arrow, and to self: self is a generalized Zeno effect.

  2. The first quantum jump to the opposite boundary corresponds to the act of "free will" or the birth of a re-incarnated self. Hence the act of "free will" changes the arrow of psychological time at some level of the hierarchy of CDs. The first reduction to the opposite boundary of the CD means the "death" of self and the "re-incarnation" of a time-reversed self at the opposite boundary, at which the temporal distance between the tips of the CD increases in the opposite direction. The sequence of selves and time reversed selves is analogous to a cosmic expansion for the CD. The repeated birth and death of mental images could correspond to this sequence at the level of sub-selves.

  3. This allows one to understand the relationship between subjective and geometric time and how the arrow and flow of clock time (psychological time) emerge. The average distance between the tips of the CD increases as long as state function reductions occur repeatedly at the fixed boundary: the situation is analogous to diffusion. The localization of the contents of conscious experience to the boundary of the CD gives rise to the illusion that the universe is 3-dimensional. The possibility of memories, made possible by the hierarchy of CDs, demonstrates that this is not the case. Self is simply the sequence of state function reductions at the same boundary of the CD remaining fixed, and the lifetime of self is the total growth of the average temporal distance between the tips of the CD.

One can identify several rather abstract state function reductions selecting a sector of WCW.
  1. There are quantum measurements inducing localization in the moduli space of CDs with the passive boundary and the states at it fixed. In particular, a localization in the moduli characterizing the Lorentz transform of the upper tip of the CD would be measured. The measured moduli characterize also the analog of a symplectic form in M4 strongly suggested by the twistor lift of TGD - that is, the rest system (time axis) and the spin quantization axis. Of course, also other kinds of reductions are possible.

  2. Also a localization to an extension of rationals defining the adeles should occur. Could the value of n=heff/h be an observable? The value of n for a given space-time surface at the active boundary of the CD could be identified as the order of the smallest Galois group containing all the Galois groups assignable to the 3-surfaces at the boundary. The superposition of space-time surfaces would not be an eigenstate of n at the active boundary unless a localization occurs. It is not obvious whether this is consistent with a fixed value of n at the passive boundary.

    The measured value of n could be larger or smaller than the value of n at the passive boundary of the CD, but in a statistical sense n would increase by the analogy with diffusion on the half-line defined by the non-negative integers. The distance from the origin unavoidably increases in a statistical sense. This would imply evolution as an increase of the maximal value of negentropy and the generation of quantum coherence in increasingly longer scales.

  3. A further abstract choice corresponds to the replacement of the roles of the active and passive boundary of the CD, changing the arrow of clock time and corresponding to the death of self and re-incarnation as a time-reversed self.

Can one assume that these measurements reduce to measurements of a density matrix of either entangled system, as assumed in the earlier formulation of NMP, or should one allow both options? This question actually applies to all quantum measurements and leads to fundamental philosophical questions unavoidable in all consciousness theories.
  1. Do all measurements involve entanglement between the moduli or extensions of two CDs, reduced in the measurement of the density matrix? Non-diagonal entanglement would allow final states which are not eigenstates of the moduli or of n: this looks strange. This could also lead to an infinite regress, since it seems that one must assume an endless hierarchy of entangled CDs, so that the reduction sequence would proceed from top to bottom. It looks natural to regard a single CD as a sub-Universe.

    For instance, if a selection of the quantization axes of color hypercharge and isospin (a localization in the twistor space of CP2) is involved, one would have an outcome corresponding to a quantum superposition of measurements with different color quantization axes!

    Going philosophical, one can also argue that the measurement of the density matrix is only a reaction to the environment and does not allow intentional free will.

  2. Can one assume that a mere localization in the moduli space or to an extension of rationals (producing an eigenstate of n) takes place for a fixed CD - a kind of self measurement possible even for an unentangled system? If there is entanglement in these degrees of freedom between two systems (say CDs), it would be reduced in these self measurements, but the outcome would not be an eigenstate of the density matrix. An interpretation as a realization of intention would be appropriate.

  3. If one allows both options, the interpretation would be that state function reduction as a measurement of the density matrix is only a reaction to the environment, whereas self-measurement represents a realization of intention.

  4. Self measurements would occur at a higher level, say as a selection of quantization axes, a localization in the moduli space of CDs, or a selection of the extension of rationals. A possible general rule is that measurements at the space-time level are reactions, as measurements of the density matrix, whereas a selection of a sector of WCW would be an intentional action. This is because formally the quantum states at the level of WCW are, as modes of a classical WCW spinor field, single particle states. Entanglement between different sectors of WCW is not possible.

  5. If the selections of sectors of WCW at the active boundary of the CD commute with the observables whose eigenstates appear at the passive boundary (briefly, passive observables) - meaning that time reversal commutes with them - they can occur repeatedly during the reduction sequence, and self as a generalized Zeno effect makes sense.

    If the selections of WCW sectors at the active boundary do not commute with the passive observables, then volition as a choice of a sector of WCW must change the arrow of time. Libet's findings show that conscious choice induces neural activity a fraction of a second before the conscious choice. This would imply the correspondences: "big" measurement changing the arrow of time - self-measurement at the level of WCW - intentional action, and "small" measurement - measurement at the space-time level - reaction.

    Self as a generalized Zeno effect makes sense only if there are active observables commuting with the passive observables. If the passive observables form a maximal set, new active observables commuting with them must emerge. The increase of the size of the extension of rationals might generate them by expanding the state space, so that self would survive only as long as it evolves. Self would die and re-incarnate when it could not generate any new observables, commuting with those assignable to the active boundary, to be measured. From personal experience I can say that ageing is basically the loss of the ability to make new choices. When all possible choices are made, all observables are measured or self-measured, and it is time to start again.

    Otherwise there would be only a single unitary time evolution followed by a reduction to the opposite boundary. This makes sense only if the sequence of "big" reductions for sub-selves can give rise to the time flow experienced by the self: the birth and death of mental images would give rise to the flow of time of the self.

The overall conclusion is that the notion of WCW is necessary to understand intentional free will. One must distinguish between measurements at the WCW level as localizations, which do not involve a measurement of the density matrix, and measurements at the space-time level reducible to measurements of the density matrix (taking the density matrix to be a function of a product of commuting observables, one can measure all these observables simultaneously by measuring the density matrix). WCW localizations correspond to intentional actions - say a decision fixing the quantization axis of spin - and space-time reductions correspond to state function reductions at the level of matter. By reading Krishnamurti I learned that eastern philosophies make a sharp distinction between behavior as mere reactivity and behavior as intentional action which is not a reaction. Furthermore, death and reincarnation happen when the self has made all possible choices.

For details see the chapter Negentropy Maximization Principle or the article Re-examination of the basic notions of TGD inspired theory of consciousness.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Progress in adelic TGD

The preparation of an article about the number theoretic aspects of TGD forced me to go through various related ideas and led to a considerable integration of the ideas. In this note, ideas related directly to adelic TGD are discussed.

  1. Both the hierarchy of Planck constants and the preferred p-adic primes are now understood number theoretically.

  2. The realization of number theoretical universality (NTU) for the functional integral seems like a formidable problem, but the special features of the functional integral make the challenge tractable. NTU of the functional integral is indeed suggested by the need to describe also cognition quantally.

  3. Strong form of holography (SH) is now understood. 2-D surfaces (string world sheets and possibly partonic 2-surfaces) are not quite enough: also a number theoretic discretization of the space-time surface is required. This allows one to understand the distinction between imagination in terms of p-adic space-time surfaces and reality in terms of real space-time surfaces. The number of imaginations is much larger than that of realities (p-adic pseudo-constants).

  4. The localization of spinor modes to string world sheets can be understood only as effective: this resolves several interpretational problems. These spinors give all the information needed to construct the 4-D spinor modes. Also the 2-D modified Dirac action and area action are enough to construct scattering amplitudes once the number theoretic discretization of the space-time surface responsible for dark matter is given. This means an enormous simplification of the theory.

Galois group of number theoretic discretization and hierarchy of Planck constants

Simple arguments lead to the identification of heff/h=n as a factor of the order of Galois group of extension of rationals.

  1. The strongest form of NTU would require that the allowed points of the imbedding space belonging to an extension of rationals are mapped as such to the corresponding extensions of p-adic number fields (no canonical identification). At the imbedding space level this correspondence would be extremely discontinuous. The "spines" of space-time surfaces would however contain only a subset of the points of the extension, and a natural resolution length scale could emerge and prevent the wild fluctuations. This could also be seen as a reason why space-time surfaces must be 4-D. The fact that the curve x^n + y^n = z^n has no nontrivial rational points for n>2 raises the hope that the resolution scale could emerge spontaneously.

  2. The notion of monadic geometry - discussed in detail here - would realize this idea. Define first a number theoretic discretization of the imbedding space in terms of points whose coordinates, in a group theoretically preferred coordinate system, belong to the extension of rationals considered. One can say that these algebraic points are in the intersection of reality and various p-adicities. Overlapping open sets assigned with this discretization define in the real sector a covering by open sets. In the p-adic sector the compact-open topology allows one to assign with each point the 8th Cartesian power of the algebraic extension of p-adic numbers. These compact open sets define analogs of the monads of Leibniz, and the p-adic variants of the field equations make sense inside them.

    The monadic manifold structure of H is induced on space-time surfaces containing a discrete subset of points in the algebraic discretization, with the field equations defining a continuation to a space-time surface in a given number field, unique only in a finite measurement resolution. This approach would resolve the tension between continuity and symmetries in the p-adic-real correspondence: isometry groups would be replaced by their sub-groups with parameters in the extension of rationals considered, acting in the intersection of reality and p-adicities.

    The Galois group of the extension acts non-trivially on the "spines" of the space-time surfaces. Hence the number theoretical symmetries act as physical symmetries and define the orbit of a given space-time surface as a kind of covering space. The coverings assigned to the hierarchy of Planck constants would naturally correspond to Galois coverings, and dark matter would represent number theoretical physics.

    This would give rise to a kind of algebraic hierarchy of adelic 4-surfaces identifiable as an evolutionary hierarchy: the higher the dimension of the extension, the higher the evolutionary level.

  3. But how does quantum criticality relate to number theory and adelic physics? heff/h=n has been identified as the number of sheets of the space-time surface identified as a covering space of some kind. The number theoretic discretization defining the "spine" of a monadic space-time surface defines also a covering space, with the Galois group of the extension of rationals acting as the covering group. Could n be identifiable as the order of a sub-group of the Galois group?

    If this is the case, the proposed rule for heff changing phase transitions - stating that the reduction of n occurs to a factor of n - would translate to spontaneous symmetry breaking for the Galois group, and spontaneous symmetry breakings indeed accompany phase transitions.

Ramified primes as preferred primes for a given extension

The intuitive feeling is that the notion of preferred prime is something extremely deep, and to me the deepest thing I know is number theory. Does one end up with preferred primes in number theory? This question brought to my mind the notion of ramification of primes (more precisely, of prime ideals of a number field in its extension), which happens only for special primes in a given extension of a number field, say rationals. Ramification is completely analogous to the degeneracy of some roots of a polynomial and corresponds to criticality (Thom's catastrophe theory is one application). Could this be the mechanism assigning preferred prime(s) to a given elementary system, such as an elementary particle? I have not considered the role of ramified primes earlier although their hierarchy is highly relevant in the number theoretical vision about TGD.

  1. Stating it very roughly (I hope that mathematicians tolerate this sloppy language of a physicist): as one goes from a number field K, say rationals Q, to its algebraic extension L, the original prime ideals in the so called integral closure over the integers of K decompose to products of prime ideals of L (prime ideal is a more rigorous manner to express primeness). Note that a general ideal is the analog of an integer.

    The integral closure of the integers of a number field K is defined as the set of elements which are roots of some monic polynomial x^n + a_(n-1)x^(n-1) + ... + a_0 with coefficients a_i that are integers of K. The integral closures of both K and L are considered. For instance, the integral closure of an algebraic extension of K over K is the extension itself. The integral closure of complex numbers over ordinary integers is the set of algebraic numbers.

    Prime ideals of K can be decomposed to products of prime ideals of L: P = ∏ Pi^(ei), where ei is the ramification index. If ei>1 holds true for some i, ramification occurs. The Pi:s in question are like coinciding roots of a polynomial, which in thermodynamics and Thom's catastrophe theory corresponds to criticality. Ramification could therefore be a natural aspect of quantum criticality, and ramified primes P are good candidates for preferred primes for a given extension of rationals (a small sketch after this list illustrates this for quadratic extensions). Note that ramification makes sense also for extensions of a given extension of rationals.

  2. A physical analogy for the decomposition of ideals to ideals of the extension is provided by the decomposition of hadrons to valence quarks: elementary particles become composites of more elementary particles in the extension. In the decomposition P = ∏ Pi^e(i) the physical analog of e(i) would be the number of elementary particles of type i in the state (see this). An unramified prime P would be analogous to a state with e fermions. A maximally ramified prime would be analogous to a Bose-Einstein condensate of e bosons. A general ramified prime would be analogous to an e-particle state containing both fermions and condensed bosons. This is of course just a formal analogy.

  3. There are two further basic notions related to ramification and characterizing it. The relative discriminant is the ideal of K divisible by exactly the ramified prime ideals of K, and the relative different for P is the ideal of L divisible by the ramified Pi:s (the product of the prime factors of P in L). These ideals represent the analogs of the product of the preferred primes P of K and of the primes Pi of L dividing them. These two ideals would characterize the ramification.
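
For quadratic extensions the ramified primes can be read off from the field discriminant, which makes the notion easy to demonstrate. A minimal sketch in Python, using only the standard fact that a rational prime ramifies in Q(sqrt(d)) exactly when it divides the discriminant (d for d = 1 mod 4, else 4d); the sample values of d are arbitrary:

    from sympy import primefactors

    def field_discriminant(d):
        # Discriminant of Q(sqrt(d)) for squarefree d.
        return d if d % 4 == 1 else 4 * d

    def ramified_primes(d):
        # A rational prime ramifies in Q(sqrt(d)) iff it divides the discriminant.
        return primefactors(abs(field_discriminant(d)))

    for d in (-1, 2, 5, -3, 10):
        print(d, ramified_primes(d))   # e.g. d=-1: [2], d=5: [5], d=10: [2, 5]
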
Ramified primes for preferred extensions as preferred p-adic primes?

In the TGD framework the extensions of rationals (see this) and of p-adic number fields (see this) are unavoidable and are interpreted as an evolutionary hierarchy: cosmological evolution would gradually proceed to more and more complex extensions. One can say that string world sheets and partonic 2-surfaces with the parameters of their defining functions in increasingly complex extensions of rationals emerge during evolution. Therefore ramifications and the preferred primes defined by them are unavoidable. For p-adic number fields the number of extensions is much smaller: for instance, for p>2 there are only 3 quadratic extensions.

How could ramification relate to p-adic and adelic physics and could it explain preferred primes?

  1. A maximally ramified p-adic prime P = Pi^e would be effectively replaced with its e:th root Pi in p-adicization; the same would apply to general ramified primes. Each unramified prime of K is replaced with e=[L:K] primes of L, whereas a ramified prime P is replaced with #{Pi}<e primes of L: the increase of algebraic dimension is smaller. An interesting question relates to the p-adic length scale: is the p-adic prime effectively replaced with the e:th root of the p-adic prime, Lp ∝ p^(1/2)L1 → p^(1/(2e))L1? The only physical option is that the p-adic temperature for P would be scaled down, Tp=1/n → 1/(ne), for its e:th root (for fermions serving as fundamental particles in TGD one actually has Tp=1). Could the lower temperature state be more stable and select the preferred primes as maximally ramified ones? What about general ramified primes?

  2. This need not be the whole story. Some algebraic extensions would be more favored than others, and the p-adic view about realizable imaginations could be involved. p-Adic pseudo-constants are expected to allow p-adic continuations of string world sheets and partonic 2-surfaces to 4-D preferred extremals with number theoretic discretization. For real continuations the situation is more difficult. For preferred extensions - and therefore for the corresponding ramified primes - the number of real continuations - realizable imaginations - would be especially large.

    The challenge would be to understand why primes near powers of 2 - and possibly also of other small primes - would be favored. Why would the number of realizable imaginations be especially large for them, so that they would be winners in the number theoretical fight for survival?
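
The primes in question are easy to list. A small Python sketch printing the primes nearest to powers of 2 (the range of k is arbitrary); the p-adic length scale hypothesis would select primes of this kind:

    from sympy import prevprime, nextprime

    for k in range(10, 16):
        below, above = prevprime(2**k), nextprime(2**k)
        print(k, 2**k, below, above)   # primes flanking 2^k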

NTU for functional integral

The number theoretical vision relies on NTU. In the fermionic sector NTU is necessary: one cannot speak about real and p-adic fermions as separate entities, and fermionic anti-commutation relations are indeed number theoretically universal.

What about NTU in the case of the functional integral? There are two opposite views.

  1. One can define p-adic variants of field equations without difficulties if the preferred extremals are minimal surface extremals of Kähler action, so that coupling constants do not appear in the solutions. If the extremal property is determined solely by analyticity properties, as various conjectures state, it makes sense independently of number field. Therefore there would be no need to continue the functional integral to the p-adic sectors. This is in accordance with the philosophy that thought cannot be put on a scale. This would also be the option favored by a pragmatist.

  2. A consciousness theorist might argue that also cognition and imagination allow a quantum description. If so, NTU should apply also to the functional integral over WCW (more precisely, over its sector defined by a CD) involved with the definition of scattering amplitudes.

The general vision involves some crucial observations.
  1. Only the expressions for the scattering amplitudes should satisfy NTU. This does not require that the functional integral itself satisfies NTU.

  2. Since the Gaussian and metric determinants cancel in the WCW Kähler metric, the contributions from maxima are proportional to the action exponentials exp(Sk) divided by ∑k exp(Sk). Loops vanish by quantum criticality.

  3. Scattering amplitudes can be defined as sums over the contributions from the maxima, which would also have stationary phase by the double extremal property made possible by the complex value of αK. These contributions are normalized by the vacuum amplitude.

    It is enough to require NTU for Xi = exp(Si)/∑k exp(Sk). This requires that Sk-Sl has the form q1 + q2 iπ + q3 log(n) with rational qi. The condition brings in mind homology theory, with the difference Sk-Sl playing the role of the boundary operation. NTU for both Sk and exp(Sk) would allow only values of the general form Sk = q1 + q2 iπ + q3 log(n), and this looks quite too strong a condition.

  4. If it is possible to express the 4-D action exponentials as a single 2-D exponential associated with the union of string world sheets, the vacuum functional disappears completely from consideration! There is only a sum over discretizations with the same effective action, and one obtains a purely combinatorial expression.
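
The statement that only the ratios Xi matter can be checked numerically. A minimal sketch in Python with invented complex action values Sk: the normalized weights are invariant under a common shift of all Sk, so they depend only on the differences Sk-Sl appearing in the NTU condition above.

    import numpy as np

    # Hypothetical complex action values S_k at the maxima.
    S = np.array([1.0 + 0.5j, 0.2 - 1.0j, -0.7 + 2.0j])

    def weights(S):
        E = np.exp(S - S.real.max())   # common shift for numerical stability
        return E / E.sum()

    # A common shift of all S_k leaves X_i = exp(S_i)/sum_k exp(S_k) invariant.
    assert np.allclose(weights(S), weights(S + (3.0 - 4.0j)))
    print(weights(S))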

See the chapter Unified Number Theoretic Vision or the article p-Adicization and adelic physics.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Symplectic structure for M4, CP breaking, matter-antimatter asymmetry, and electroweak symmetry breaking


The preparation of an article about number theoretic aspects of TGD forced me to go through various related ideas and led to a considerable integration of them. In this note the idea about the symplectic structure of M4 is discussed, although it is not directly related to the number theoretic aspects of TGD.
  1. Twistor lift of TGD suggests strongly a symmetry between M4 and CP2. In particular, M4 should have the analog of symplectic structure.
  2. It has already been noticed that this structure could allow one to understand both CP breaking and matter-antimatter asymmetry from first principles. A further study showed that it can also allow one to understand electroweak symmetry breaking.
Consider now the delicacies of this picture.
  1. Should one assign also to M4 the analog of symplectic structure giving an additional contribution to the induced Kähler form? The symmetry between M4 and CP2 suggests this, and this term could be highly relevant for the understanding of the observed CP breaking and matter-antimatter asymmetry. Poincare invariance is not lost since the needed moduli space for M4 Kähler forms would be the moduli space of CDs forced by ZEO in any case, and the M4 Kähler form would serve as the correlate for the fixing of the rest system and spin quantization axis in quantum measurement.

  2. Also induced spinor fields are present. The well-definedness of electromagnetic charge for the spinor modes forces in the generic case the localization of the modes of the induced spinor fields at string world sheets (and possibly partonic 2-surfaces) at which the induced charged weak gauge fields and possibly also the neutral Z0 gauge field vanish. The analogy with branes and super-symmetry forces one to consider two options.

    Option I: The fundamental action principle for space-time surfaces contains besides the 4-D action also a 2-D action assignable to string world sheets, whose topological part (magnetic flux) gives rise to a coupling term to the Kähler gauge potentials assignable to the 1-D boundaries of string world sheets, containing also a geodesic length part. Super-symplectic symmetry demands that the modified Dirac action has 1-, 2-, and 4-D parts: spinor modes would exist at string boundaries, at string world sheets, and in the space-time interior. A possible interpretation for the interior modes would be as generators of space-time super-symmetries.

    This option is not quite in the spirit of SH, and string tension appears as an additional parameter. Also the conservation of em charge forces the 2-D string world sheets to carry vanishing induced W fields, and this is in conflict with the existence of 4-D spinor modes unless they satisfy the same condition. This looks strange.

    Option II: Stringy action and its fermionic counterpart are effective actions only, justified by SH. In this case there are no problems of interpretation. SH requires only that the induced spinor fields at string world sheets determine them in the interior, much like the values of an analytic function at a curve determine it in an open set of the complex plane. At the level of quantum theory the scattering amplitudes should be determined by the data at string world sheets. If the induced W fields at string world sheets are vanishing, the mixing of different charge states in the interior of X4 would not make itself visible at the level of scattering amplitudes! In this case 4-D spinor modes do not define space-time super-symmetries.

    This option seems to be the only logical one. It is also the simplest and means that quantum TGD would reduce to a string model apart from the number theoretical discretization of the space-time surface, which brings in dark matter as heff/h=n phases with n identifiable as a factor of the order of the Galois group of an extension of rationals. This would also lead to adelic physics, predict preferred extensions, and identify the corresponding ramified primes as preferred p-adic primes.

  3. Why should the string world sheets coding for the effective action carry vanishing weak gauge fields? If M4 has the analog of Kähler structure, one can speak about Lagrangian sub-manifolds in the sense that the sum of the symplectic forms of M4 and CP2 projected to the Lagrangian sub-manifold vanishes (a toy check follows this list). Could the induced spinor fields for the effective action be localized to generalized Lagrangian sub-manifolds? This would allow both string world sheets and 4-D space-time surfaces, but SH would select 2-D Lagrangian manifolds. At the level of the effective action the theory would be incredibly simple.

    Induced spinor fields at string world sheets could obey the "dynamics of avoidance" in the sense that both the induced weak gauge fields W, Z0 and the induced Kähler form (to achieve this the U(1) gauge potential must be a sum of M4 and CP2 parts) would vanish in the regions carrying induced spinor fields. They would couple only to the induced em field (!) given by the vectorial R12 part of the CP2 spinor curvature for D=2,4. For D=1, at the boundaries of string world sheets, the coupling to gauge potentials would be non-trivial since the gauge potentials need not vanish there. The spinorial dynamics would be extremely simple and would conform with the vision about the breaking of the weak gauge group to the electromagnetic gauge group.

    The projections of the canonical currents of Kähler action to string world sheets would vanish, and the projections of the 4-D modified gamma matrices would define just the induced 2-D metric. If the induced metric of the space-time surface reduces to an orthogonal direct sum of the string world sheet metric and the metric acting in the normal space, the flow defined by the 4-D canonical momentum currents is parallel to the string world sheet. These conditions could define the "boundary" conditions at string world sheets for SH.
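
The Lagrangian condition itself is elementary linear algebra and can be illustrated with a toy example (the symplectic form and the plane below are hypothetical stand-ins for the sum of the M4 and CP2 symplectic forms):

    import numpy as np

    # omega = dx1^dy1 + dx2^dy2 in R^4, coordinates ordered (x1, y1, x2, y2).
    omega = np.zeros((4, 4))
    omega[0, 1], omega[1, 0] = 1, -1
    omega[2, 3], omega[3, 2] = 1, -1
    # The plane y1 = y2 = 0 is spanned by v and w.
    v = np.array([1, 0, 0, 0])
    w = np.array([0, 0, 1, 0])
    # omega restricted to the plane vanishes: the plane is Lagrangian.
    print(v @ omega @ w, v @ omega @ v, w @ omega @ w)   # 0 0 0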

To sum up, the notion of M4 symplectic structure is now on a rather firm basis both physically and mathematically.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Thursday, January 19, 2017

Does Monster have place in TGD Universe?

Lubos Motl tells about yet another attempt to save superstring theory. The title of the posting is A monstrously symmetric cousin of our heterotic Universe, and it tells about the work of Natalie M. Paquette, Daniel Persson, and Roberto Volpato (Stanford, Sweden, Italy). The newcomer is said to be a cousin of the heterotic string model with E8× E8 symmetry group.

The problem of these scenarios is that they have no connection to experimental reality. Heterotic string is extremely ugly mathematically, probably the ugliest object created by my respected colleagues in the history of theoretical physics. If nomen is omen, its Monster group based relative is at least equally ugly;-).

Speaking more seriously, the Monster group is a fascinating creature. In particular, there is no known way to represent its elements (what this strictly means is not quite clear to me). In any case, this looks strange! It would be nice to find a place also for the Monster in physics and also a physical representation of it to see what it really looks like;-).

If I remember correctly, the Galois groups acting as automorphisms of algebraic extensions of rationals involve all finite groups, also the Monster, as one learns from the Wikipedia article. This is why the Monster - democratically together with all other Galois groups - could have a place in the number theoretical physics of the TGD Universe.

  1. In TGD this would also mean the existence of a concrete representation. The Galois group - in the special case the Monster - would transform into each other the number theoretic discretizations - "spines" - of the space-time surface characterizing it as a monadic sub-manifold, and would give rise to an n-fold covering space, n the order of the Monster group. The "spine" of the space-time surface is a number theoretic discretization with points having, in preferred coordinates, values in an extension of rationals (an algebraic extension involving also powers of some root of e).

  2. Strong holography (SH) conjectures that one can construct the preferred extremal from data given at 2-D string world sheets and partonic 2-surfaces, provided one also fixes these discretizations with a finite pinary cutoff so that they do not have "too many" points. Number theoretic discretization would break SH, meaning that TGD does not fully reduce to a string theory like theory with everything coded by data at 2-surfaces.

  3. Intriguingly, also the effective Planck constant n= heff/h labelling dark matter as phases of ordinary matter in the TGD Universe corresponds to the number of sheets of the space-time surface regarded as a covering space!
    Could heff/h correspond to the order of the Galois group of the extension? If so, the physics of dark matter would be number theoretical physics and could reduce to a theory of finite groups appearing as Galois groups! This would be an extremely elegant climax for the story starting from the observation that radiation at EEG frequencies has quantal effects on vertebrate brain although the EEG photon energies in the standard quantum theory are too small by 10 orders of magnitude!

  4. Monster group would correspond to one particular phase of dark matter. It is the largest sporadic finite simple group (there are 26 (amazing!) sporadic groups) with n = 2^46 × 3^20 × 5^9 × 7^6 × 11^2 × 13^3 × 17 × 19 × 23 × 29 × 31 × 41 × 47 × 59 × 71 ∼ 8× 10^53 (a numerical check follows this list).

  5. The values of n= heff/h would be quite generally orders of finite groups or of factors of these. For the orders of finite groups see this and this. I vaguely remember that powers of 2, n = 2^k, are very strongly favoured as orders of finite groups. Note that also the order of the Monster has a large power of 2 as a factor.

  6. Monster appears as an automorphism group of a Kac-Moody algebra. Is this true more generally for finite groups or Galois groups? In TGD Kac-Moody algebras could emerge dynamically for preferred extremals for which a sub-algebra of the super-symplectic algebra and its commutator with the full algebra give rise to vanishing classical Noether charges (and annihilate the quantum states). The remnant of the super-symplectic symmetry would be a Kac-Moody algebra acting on the induced spinor fields and string world sheets.

    Could these geometrically realized Galois groups appear as automorphism groups of these Kac-Moody algebras? The action of the Galois group on the spine of the monadic manifold must induce an automorphism of the dynamical Kac-Moody algebra, so this seems to be the case. If true, this would allow one to say something highly non-trivial about the relationship between the Galois group and the corresponding dynamical Kac-Moody algebra. A real mathematician would however be needed to say it;-).

  7. If this intuition is correct, the subgroups of Galois groups should play a key role in the rules for phase transitions changing the value of n= heff/h. I have proposed that n can be reduced in a phase transition only to a factor of n. This would correspond to the breaking of the Galois symmetry group to its subgroup: spontaneous symmetry breaking number theoretically!
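
The order of the Monster quoted in item 4 is easy to verify from its standard prime factorization, which also exhibits the large power of 2 mentioned in item 5:

    from functools import reduce

    factors = {2: 46, 3: 20, 5: 9, 7: 6, 11: 2, 13: 3, 17: 1, 19: 1,
               23: 1, 29: 1, 31: 1, 41: 1, 47: 1, 59: 1, 71: 1}
    n = reduce(lambda a, pk: a * pk[0]**pk[1], factors.items(), 1)
    print(f"{n:.1e}")        # ~8.1e53, the estimate quoted above
    print(n % 2**46 == 0)    # True: a large power of 2 divides n
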
For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.




Sunday, January 15, 2017

Anomalies of water as evidence for dark matter in TGD sense

The motivation for this brief comment came from a popular article telling that a new phase of water has been discovered in the temperature range 50-60 °C (see this). Also Gerald Pollack (see this) has introduced what he calls the fourth phase of water. For instance, in this phase water consists of hexagonal layers with effective H1.5O stoichiometry, and the phase has a high negative charge. This phase plays a key role in TGD based quantum biology. These two fourth phases of water could relate to each other if there exists a deeper mechanism explaining both of them and the various anomalies of water.

Martin Chaplin (see this) has an extensive web page about the various properties of water. The physics of water is full of anomalous features and therefore the page is a treasure trove for anyone ready to give up the reductionistic dogma. The site discusses the structure, thermodynamics, and chemistry of water. Even academically dangerous topics such as water memory and homeopathy are discussed.

One learns from this site that the physics of water involves numerous anomalies (see this). The structural, dynamic and thermodynamic anomalies form nested regions in the density-temperature plane. For liquid water at the atmospheric pressure of 1 bar the anomalies appear in the temperature interval 0-100 °C.

Hydrogen bonding creating cohesion between water molecules distinguishes water from other substances. Hydrogen bonds induce the clustering of water molecules in liquid water. Hydrogen bonding is also highly relevant for the phase diagram of H2O coding for the various thermodynamical properties of water (see this). In biochemistry hydrogen bonding is involved with hydration. Bio-molecules - say amino-acids - are classified into hydrophobic, hydrophilic, and amphiphilic ones, and this characterization determines to a high extent the behavior of the molecule in a liquid water environment. Protein folding represents one example of this.

The anomalies are often thought to reduce to hydrogen bonding. Whether this is the case is not obvious to me, and this is why I find water such a fascinating substance.

TGD indeed suggests that water decomposes into ordinary water and dark water consisting of phases with effective Planck constant heff=n× h residing at magnetic flux tubes. Hydrogen bonds would be associated with short and rigid flux tubes, but for larger values of n the flux tubes would be longer by a factor n and have string tension behaving as 1/n, so that they would be softer and could be loopy. The portion of water molecules connected by flux tubes carrying dark matter could be identified as dark water and the rest would be ordinary water. This model allows one to understand various anomalies. The anomalies are largest at the physiological temperature 37 °C, which conforms with the vision about the role of dark matter and dark water in living matter, since the fraction of dark water would be highest at this temperature. The anomalies discussed are density anomalies, anomalies of specific heat and compressibility, and the Mpemba effect. I discussed these anomalies already a decade ago, but the recent view about dark matter allows much more detailed modelling.

For details see the chapter Dark Nuclear Physics and Condensed Matter of "Hyper-finite factors, p-adic length scale hypothesis, and dark matter hierarchy" or the article The anomalies of water as evidence for the existence of dark matter in TGD sense.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Thursday, January 05, 2017

What does Negentropy Maximization Principle really say?

There is something in NMP that I still do not understand: every time I begin to explain what NMP is I have this unpleasant gut feeling. I have the habit of making a fresh start every time rather than pretending that everything is crystal clear. I have indeed considered very many variants of NMP. In the following I will consider two variants of NMP. The first variant reduces to pure number theory in the adelic framework inspired by the number theoretic vision. It is certainly the simplest one since it says nothing explicit about negentropy. The second variant says essentially the same as the "strong form of NMP", when the reduction occurs to an eigen-space of the density matrix.

I will not consider zero energy ontology (ZEO) related aspects and the aspects related to the hierarchy of subsystems and selves since I dare regard these as "engineering" aspects.

What should NMP say?

  1. NMP takes in some sense the role of God, and the basic question is whether we live in the best possible world or not. Theologians ask why God allows sin. I ask whether NMP demands an increase of negentropy always or whether it allows also a reduction of negentropy. Why? Could NMP lead to an increase of negentropy only in the statistical sense - evolution? Could it only give the potential for gaining a larger negentropy?

    These questions have turned out to be highly non-trivial. My personal experience is that we do not live in the best possible world, and this experience plus simplicity motivates the proposal to be discussed.

  2. Is NMP a separate principle or could NMP be reduced to mere number theory? For the latter option state function reduction would occur to an eigenstate/eigenspace of the density matrix only if the corresponding eigenvalue and eigenstate/eigenspace are expressible using numbers in the extension of rationals defining the adele considered. A phase transition to an extension of the extension containing these numbers would be required to make the reduction possible. A step in number theoretic evolution would occur. Also entanglement of the measured state pairs with those of a measuring system in an extension containing the extension of the extension would make the reduction possible. Negentropy would be reduced, but the higher-D extension would provide the potential for more negentropic entanglement. I will consider this option in the following.

  3. If one has a higher-D eigenspace of the density matrix, the p-adic negentropy is largest for the entire subspace, and the sum of the real and p-adic negentropies vanishes for all of them. For negentropy identified as the total p-adic negentropy the strong form of NMP would select the entire sub-space, and NMP would indeed say something explicit about negentropy.

The notion of entanglement negentropy

  1. Number theoretic universality demands that the density matrix and entanglement coefficients are numbers in an algebraic extension of rationals extended by adding a root of e. The induced p-adic extensions are finite-D and one obtains the adele assigned to the extension of rationals. Real physics is replaced by adelic physics.

  2. The same entanglement coefficients in an extension of rationals can be seen as numbers in both the real and the various p-adic sectors. In the real sector one can define real entropy and in the various p-adic sectors p-adic negentropies (real valued).

  3. Question: should one define the total entanglement negentropy as

    1. the sum of the p-adic negentropies or

    2. the difference of the sum of the p-adic negentropies and the real entropy? For rational entanglement probabilities the real entropy equals the sum of the p-adic negentropies and the total negentropy would vanish (a numerical check follows this list). For extensions this negentropy would be positive under natural additional conditions, as shown earlier.

    Both options can be considered.
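
For rational probabilities the vanishing of the total negentropy of option 2 is just the product formula of number theory, and one can check it numerically. A minimal sketch in Python (the probabilities are a hypothetical example):

    from fractions import Fraction
    from math import log
    from sympy import primefactors

    def v_p(q, p):
        # p-adic valuation of a nonzero rational q.
        v, a, b = 0, q.numerator, q.denominator
        while a % p == 0: a //= p; v += 1
        while b % p == 0: b //= p; v -= 1
        return v

    P = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]
    real_entropy = -sum(float(q) * log(q) for q in P)

    # p-adic negentropy N_p = sum_i p_i log|p_i|_p with |x|_p = p^(-v_p(x));
    # only primes dividing some numerator or denominator contribute.
    primes = {p for q in P for p in primefactors(q.numerator * q.denominator)}
    padic = sum(float(q) * (-v_p(q, p)) * log(p) for p in primes for q in P)

    assert abs(real_entropy - padic) < 1e-12   # they agree: total negentropy vanishes
    print(real_entropy, padic)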

State function reduction as universal measurement interaction between any two systems


  1. The basic vision is that state function reductions occur all the time for all kinds of matter and involve a measurement of the density matrix ρ characterizing the entanglement of the system with its environment, leading to a sub-space for which the states have the same eigenvalue of the density matrix. What this measurement really is is not at all clear.

  2. The measurement of the density matrix means diagonalization of the density matrix and a selection of an eigenstate or eigenspace. Diagonalization is possible without going outside the extension only if the entanglement probabilities and the coefficients of the states belong to the original extension defining the adele. This need not be the case!

    More precisely, the eigenvalues of the density matrix, as roots of an N:th order polynomial with coefficients in the extension, in general belong to an N-D extension of the extension. The same applies to the coefficients of the eigenstates in the original basis. Consider as an example the eigenvalues and eigenstates of a rational valued N×N entanglement matrix: they are roots of a polynomial of degree N and in general algebraic numbers (see the sketch after this list).

    Question: Is state function reduction number theoretically forbidden in the generic case? Could entanglement be stable purely number theoretically? Could NMP reduce to just this number theoretic principle saying nothing explicit about negentropy? Could a phase transition increasing the dimension of the extension but keeping the entanglement coefficients unaffected make the reduction possible? Could entanglement with an external system in a higher-D extension - an intelligent observer - make the reduction possible?

  3. There is a further delicacy involved. The eigen-space of the density matrix can be N-dimensional if the density matrix has an N-fold degenerate eigenvalue with all N entanglement probabilities identical. For a unitary entanglement matrix the density matrix is indeed the N×N unit matrix. This kind of NE is stable also algebraically if the coefficients of the eigenstates do not belong to the extension. If they do belong to it, the question is whether NMP allows a reduction to a subspace of the eigenspace or whether only the entire subspace is allowed.

    The total negentropy identified as the sum of the real and p-adic negentropies would vanish for any eigenspace and would not distinguish between sub-spaces. Identification of negentropy as the p-adic negentropy would distinguish between sub-spaces, and NMP in its strong form would not allow a reduction to sub-spaces. Number theoretic NMP would thus also say something about negentropy.

    I have also considered the possibility of weak NMP. Any subspace could be selected and negentropy would be reduced. The worst thing to do in this case would be the selection of a 1-D subspace: entanglement would be totally lost and the system would be totally isolated from the rest of the world. I have proposed that this possibility corresponds to the fact that we do not seem to live in the best possible world.
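
The generic situation described in item 2 is easy to exhibit: already a 2×2 density matrix with rational entries has eigenvalues in a quadratic extension of rationals, so that the reduction would be number theoretically forbidden until the extension is enlarged. A minimal sketch in sympy (the matrix is a hypothetical example with trace 1):

    from sympy import Matrix, Rational

    rho = Matrix([[Rational(3, 5), Rational(1, 5)],
                  [Rational(1, 5), Rational(2, 5)]])
    # Eigenvalues 1/2 +- sqrt(5)/10 lie in Q(sqrt(5)), not in Q.
    print(rho.eigenvals())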

NMP as a purely number theoretic constraint?

Let us consider the possibility that NMP reduces to the number theoretic condition tending to stabilize generic entanglement.

  1. The density matrix characterizing entanglement with the environment is a universal observable. Reduction can occur to an eigenspace of the density matrix. For rational entanglement probabilities the total negentropy would vanish, so that NMP formulated in terms of negentropy cannot say anything about the situation. This suggests that NMP quite generally does not directly refer to negentropy.

  2. The condition that eigenstates and eigenvalues are in the extension of rationals defining the adelic physics poses a restriction: the reduction could occur only if these numbers are in the original extension. Also rational entanglement would be stable in the generic case, and a phase transition to a higher algebraic extension would be required for state function reduction to occur. Standard quantum measurement theory would be obtained when the coefficients of the eigenstates and the entanglement probabilities are in the original extension.

  3. If this is not the case, a phase transition to an extension of the extension containing the needed N-D extension could save the situation. This would be a step in number theoretic evolution. Reduction would lead to a reduction of negentropy but would give the potential for gaining a larger entanglement negentropy. Evolution would proceed through catastrophes giving the potential for more negentropic entanglement!

    Alternatively, the state pairs of the system + complement could be entangled with an observer in an extension of rationals containing the needed N-D extension of the extension, and a state function reduction possible for the observer would induce the reduction in the original system. This would mean fusion with a self at a higher level of the evolutionary hierarchy - a kind of enlightenment. This would give an active role to the intelligent observer (intelligence characterized by the dimension of the extension of rationals). The intelligent observer would reduce the negentropy, and thus NMP would not hold true universally.

    Since a higher-D extension allows a higher negentropy and NE is stable in the generic case, one might hope that NMP holds true statistically (for rationals the total negentropy as the sum of real and total p-adic negentropies vanishes).

    The Universe would evolve rather than being a paradise: the number theoretic NMP would allow a temporary reduction of negentropy but provide the potential for a larger negentropy, and an increase of negentropy in the statistical sense is highly suggestive. To me this option looks like the simplest and most realistic one.

  4. If negentropy is identified as the total p-adic negentropy rather than the sum of the real and p-adic negentropies, the strong form of NMP says something explicit about negentropy: the reduction would take place to the entire subspace having the largest p-adic negentropy.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Answer to a question about general aspects of TGD

In FB I received a question about general aspects of TGD. It was impossible to answer the question with a few lines, and I decided to write a blog posting. I am sorry for typos in the hastily written text. A more detailed article Can one apply Occam’s razor as a general purpose debunking argument to TGD? tries to emphasize the simplicity of the basic principles of TGD and of the resulting theory.


A. In what aspects does TGD extend other theories of physics?

I will replace "extends" with "modifies" since TGD also simplifies in many respects. I shall restrict the considerations to the ontological level, which in my view is the really important level.

  1. The space-time level is where TGD started from. Space-time as an abstract 4-geometry is replaced with space-time as a 4-surface in M4× CP2. In GRT space-time is a small deformation of Minkowski space.

    In TGD both the Relativity Principle (RP) of Special Relativity (SRT) and General Coordinate Invariance (GCI) and Equivalence Principle (EP) of General Relativity hold true. In GRT RP is given up, and this leads to the loss of conservation laws since Noether's theorem cannot be applied anymore: this is what led to the idea about space-time as a surface in H. Strong form of holography (SH) is a further principle reducing to the strong form of GCI (SGCI).

  2. TGD as a physical theory extends to a theory of consciousness and cognition. The observer as something external to the Universe becomes part of the physical system - the notion of self - and quantum measurement theory, which is the black sheep of quantum theory, extends to a theory of consciousness and also of cognition relying on p-adic physics as a correlate for cognition. Also quantum biology becomes part of fundamental physics, and consciousness and life are seen as basic elements of physical existence rather than something limited to the brain.

    One important aspect is a new view about time: experienced time and geometric time are not one and the same thing anymore, although they are closely related. ZEO explains how the experienced flow of time and its direction emerge. The prediction is that both arrows of time are possible and that this plays a central role in living matter.

  3. p-Adic physics is a new element and an excellent candidate for a correlate of cognition. For instance, imagination could be understood in terms of the non-determinism of the p-adic partial differential equations for the p-adic variants of space-time surfaces. p-Adic physics and the fusion of real and various p-adic physics to adelic physics provide a fusion of the physics of matter with that of cognition in the TGD inspired theory of cognition. This means a dramatic extension of ordinary physics. Number Theoretical Universality states that in a certain sense the various p-adic physics and real physics can be seen as extensions of physics based on algebraic extensions of rationals (and also those generated by roots of e inducing finite-D extensions of p-adics).

  4. Zero energy ontology (ZEO), in which so called causal diamonds (CDs, analogs of Penrose diagrams) play a central role, can be seen as being forced by a very simple condition: the volume action forced by the twistorial lift of TGD must be finite. A CD would represent the perceptive field defined by a finite volume of the imbedding space H=M4× CP2.

    ZEO implies that conservation laws, formulated only in the scale of a given CD, do not anymore select just a single solution of field equations as in classical theory. Theories are strictly speaking impossible to test in the old classical ontology. In ZEO testing is possible by a sequence of state function reductions giving information about zero energy states.

    In principle a transition between any two zero energy states - analogous to events specified by their initial and final states - is possible, but Negentropy Maximization Principle (NMP) as the basic variational principle of state function reduction and of consciousness restricts the possibilities by forcing the generation of negentropy: the notion of negentropy requires p-adic physics.

    Zero energy states are quantum superpositions of classical time evolutions for 3-surfaces, and classical physics becomes an exact part of quantum physics: in QFTs this is only the outcome of the stationary phase approximation. Path integral is replaced with a well-defined functional integral - not over all possible space-time surfaces but over pairs of 3-surfaces at the ends of space-time surfaces at the opposite boundaries of CD.

    ZEO leads to a theory of consciousness as quantum measurement theory in which the observer ceases to be an outsider to the physical world. One also gets rid of the basic problem caused by the conflict between the non-determinism of state function reduction and the determinism of the unitary evolution. This is obviously an extension of ordinary physics.

  5. The hierarchy of Planck constants represents also an extension of quantum mechanics at the QFT limit. At the fundamental level one actually has the standard value of h, but at the QFT limit one has effective Planck constant heff = n× h, n=1,2,...: this generalizes quantum theory. This scaling of h has a simple topological interpretation: the space-time surface becomes an n-fold covering of itself and the action becomes an n-multiple of the original, which can be interpreted as heff=n×h.

    The most important applications are to biology, where quantum coherence could be understood in terms of a large value of heff/h. The large n phases resemble the large N limit of gauge theories with gauge couplings behaving as α ∝ 1/N, used as a kind of mathematical trick. Also gravitation is involved: heff is associated with the flux tubes mediating various interactions (being analogous to wormholes in the ER-EPR correspondence). In particular, one can speak about hgr, which Nottale introduced originally, and heff= hgr plays a key role in quantum biology according to TGD (a back-of-the-envelope estimate follows below).
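
An order of magnitude estimate shows why hgr matters. Nottale's formula is hgr = GMm/v0 with v0 a velocity parameter; the numbers in the Python sketch below are illustrative only.

    G, hbar = 6.674e-11, 1.055e-34          # SI units
    M_sun, m_proton = 1.989e30, 1.673e-27   # kg
    v0 = 1.45e5                             # m/s, illustrative value of order 10^-3 c
    # hgr/hbar for a proton in the gravitational field of the Sun
    print(G * M_sun * m_proton / (v0 * hbar))   # ~1.5e22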

B. In what sense is TGD a simplification/extension of existing theories?

  1. Classical level: Space-time as a 4-surface of H means a huge reduction in degrees of freedom. There are only 4 field like variables - suitably chosen 4 coordinates of H=M4× CP2. All classical gauge fields and the gravitational field are fixed by the surface dynamics. There are no primary gauge fields or gravitational fields nor any other fields in the TGD Universe; they appear only at the QFT limit.

    The GRT limit would mean that many-sheeted space-time is replaced by a single slightly curved region of M4. A test particle - a small particle like 3-surface - touching the sheets simultaneously experiences the sum of the gravitational and gauge forces. It is natural to assume that this superposition corresponds at the QFT limit to the sum of the deviations of the induced metrics of the space-time sheets from the flat metric and to the sum of the induced gauge potentials. These would define the fields in the standard model + GRT. At the fundamental level effects rather than fields would superpose. This is absolutely essential for the possibility of reducing the huge number of field like degrees of freedom. One can obviously speak of the emergence of various fields.

    A further simplification is that only preferred extremals are allowed, for which the data coding for them are reduced by SH to 2-D string world sheets and partonic 2-surfaces. TGD is almost like a string model, but space-time surfaces are necessary for understanding the fact that experiments must be analyzed using classical 4-D physics. Things are extremely simple at the level of a single space-time sheet.

    Complexity emerges from many-sheetedness. From these simple basic building bricks - minimal surface extremals of Kähler action (extremality with respect to both Kähler action and volume term is strongly suggested by the number theoretical vision, together with analogs of Super Virasoro conditions on the initial data) - one can engineer space-time surfaces with arbitrarily complex topology - in all length scales. An extension of the existing space-time concept emerges: extremely simple locally, extremely complex globally, with topological information added to the Maxwellian notion of fields (topological field quantization allowing one to talk about the field identity of a system - its field body/magnetic body).

    Another new element is the possibility of space-time regions with Euclidian signature of the induced metric. These regions correspond to the 4-D "lines" of general scattering diagrams. Scattering diagrams have an interpretation in terms of space-time geometry and topology.

  2. The construction of quantum TGD using canonical quantization or path integral formalism failed completely for Kähler action due to its huge vacuum degeneracy. Even in the presence of the volume term, perturbation theory fails completely due to extreme non-linearity. This led to the notion of the world of classical worlds (WCW) - roughly the space of 3-surfaces, essentially pairs of 3-surfaces at the boundaries of a given CD connected by preferred extremals of the action realizing SH and SGCI.

    The key principle is the geometrization of the entire quantum theory, not only of the classical fields geometrized by the space-time as surface vision. This requires a geometrization of hermitian conjugation and a geometric representation of the imaginary unit. Kähler geometry for WCW makes this possible and is fixed once the Kähler function defining the Kähler metric is known. Kähler action for a preferred extremal defining the space-time surface as an analog of a Bohr orbit was the first guess, but the twistor lift forced one to add a volume term having an interpretation in terms of the cosmological constant.

    Already the geometrization of loop spaces demonstrated that the geometry - if it exists - must have maximal symmetries (isometries). There are excellent reasons to expect that this is true also for D=3. Physics would be unique from its mathematical existence!


  3. WCW has also spinor structure. Spinors correspond to fermionic Fock states using oscillator operators assignable to the induced spinor fields - free spinor fields. WCW gamma matrices are linear combinations of these oscillator operators, and Fermi statistics reduces to spinor geometry.


  4. There is no quantization in the TGD framework at the level of WCW. The construction of quantum states and the S-matrix reduces to group theory by the huge symmetries of WCW. Therefore the zero energy states of the Universe (or CD) correspond formally to classical WCW spinor fields satisfying the WCW Dirac equation, analogous to Super Virasoro conditions and defining representations for the Yangian generalization of the isometries of WCW (the so called super-symplectic group). In ZEO states are analogous to pairs of initial and final states, and the entanglement coefficients between the positive and negative energy parts of zero energy states - expected to be fixed by Yangian symmetry - define the scattering matrix and have a purely group theoretic interpretation. If this is true, the entire dynamics would reduce to group theory in ZEO.

C. What is the hypothetical applicability of the extension - in energies, sizes, masses etc?

TGD is a unified theory and is meant to apply in all scales. Usually unifications rely on reductionistic philosophy and try to reduce physics to the Planck scale. Also superstring models tried this and failed: what happens at long length scales was completely unpredictable (the landscape catastrophe).

Many-sheeted space-time however forces one to adopt a fractal view. The Universe would be analogous to a Mandelbrot fractal down to the CP2 scale. This predicts scaled variants of say hadron physics and electroweak physics. p-Adic length scale hypothesis and the hierarchy of phases of matter with heff=n×h interpreted as dark matter give a quantitative realization of this view.

  1. p-Adic physics shows itself also at the level of real physics. One ends up with the vision that particle mass squared has a thermal origin: the p-adic variant of particle mass squared is given as a thermal mass squared by p-adic thermodynamics, mappable to real mass squared by what I call canonical identification. p-Adic length scale hypothesis states that the preferred p-adic primes characterizing elementary particles correspond to primes near powers of 2: p ≈ 2^k. The p-adic length scale is proportional to p^(1/2).

    This hypothesis is testable and it turns out that one can predict particle masses rather accurately. This is highly non-trivial since the sensitivity to the integer k is exponential. So called Mersenne primes turn out to be especially favoured (see the sketch after this list). This part of the theory was originally inspired by the regularities of the particle mass spectrum. I have developed arguments for why the crucial p-adic length scale hypothesis - actually its generalization - should hold true. A possible interpretation is that particles provide cognitive representations of themselves by p-adic thermodynamics.

  2. p-Adic length scale hypothesis leads also to the idea that particles could appear as differently p-adically scaled up variants. For instance, ordinary hadrons, to which one can assign the Mersenne prime M107 = 2^107 - 1, could have fractally scaled variants. M89 and MG,107 (Gaussian Mersenne) would be two examples, and there are indications at LHC for these scaled up variants of hadron physics. These fractal copies of hadron physics and also of electroweak physics would correspond to an extension of the standard model.

  3. Dark matter hierarchy predicts zoomed up copies of various particles. The simplest assumption is that masses are not changed in the zooming up. One can however consider that the binding energy scale scales non-trivially. The dark phases would emerge at quantum criticality and give rise to the associated long range correlations (quantum lengths are typically scaled up by heff/h=n).
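
The favoured Mersenne primes mentioned in item 2 can be listed with a few lines of Python; the last line evaluates the p-adic length scale ratio Lp ∝ p^(1/2) ≈ 2^(k/2) between M89 and M107 physics:

    from sympy import isprime

    # Mersenne primes 2^k - 1 up to k = 127; 89, 107 and 127 appear in the list.
    ks = [k for k in range(2, 128) if isprime(k) and isprime(2**k - 1)]
    print(ks)   # [2, 3, 5, 7, 13, 17, 19, 31, 61, 89, 107, 127]
    print(2 ** ((107 - 89) // 2))   # = 512: the scale ratio between M89 and M107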

D. What is the leading correction/contribution to physical effects due to TGD onto particles, interactions, gravitation, cosmology?

  1. Concerning particles I already mentioned the key predictions.

    1. The existence of scaled variants of various particles and entire branches of physics. The fundamental quantum numbers are just standard model quantum numbers coded by CP2 geometry.


    2. Particle families have a topological description, meaning that space-time topology would be an essential element of particle physics. The genus of the partonic 2-surface (the number of handles attached to a sphere) is g=0,1,2,... and would give rise to family replication. The g≤2 partonic 2-surfaces always have a global Z2 conformal symmetry, and this suggests that they give rise to elementary particles identifiable as bound states of g handles. For g>2 this symmetry is absent in the generic case, which suggests that they can be regarded as many-handle states with a mass continuum rather than elementary particles. 2-D anyonic systems could represent an example of this.

    3. A hierarchy of dynamical symmetries as remnants of the super-symplectic symmetry however suggests itself. The super-symplectic algebra possesses an infinite hierarchy of isomorphic sub-algebras with conformal weights being n-multiples of those for the full algebra (a fractal structure possessed also by ordinary conformal algebras). The hypothesis is that the sub-algebra specified by n and its commutator with the full algebra annihilate physical states and that the corresponding classical Noether charges vanish. This would imply that the super-symplectic algebra reduces to a finite-D Kac-Moody algebra acting as dynamical symmetries. A connection with the ADE hierarchy of Kac-Moody algebras suggests itself. This would predict new physics; condensed matter physics comes to mind.

    4. Number theoretic vision suggests that the Galois groups of the algebraic extensions of rationals act as dynamical symmetry groups. They would act on the algebraic discretizations of 3-surfaces and space-time surfaces necessary to realize number theoretical universality. This would be completely new physics.

  2. Interactions would be mediated at the QFT limit by standard model gauge fields and gravitons. The QFT limit however loses all information about many-sheetedness, and there would be anomalies reflecting this information loss. In many-sheeted space-time light can propagate along several paths, and the time taken to travel along a light-like geodesic from A to B depends on the space-time sheet since the sheet is curved and warped. Neutrinos and gamma rays from SN1987A arriving at different times would represent a possible example of this. It is quite possible that the outer boundaries of even macroscopic objects correspond to boundaries between Euclidian and Minkowskian regions of the space-time sheet of the object.

    The failure of QFTs to describe bound states - say the hydrogen atom - could be a second example: many-sheetedness and the identification of a bound state as a single connected surface formed by proton and electron would be essential; this is taken into account in the wave mechanical description but not in the QFT description.

  3. Concerning gravitation the basic outcome is that by the number theoretical vision all preferred extremals are extremals of both Kähler action and volume term. This is true for all known extremals (what happens if one introduces the analog of Kähler form in M4 is an open question).

    Minimal surfaces carrying no Kähler field would be the basic model for a gravitating system. The minimal surface equations are a non-linear generalization of the d'Alembert equation, with gravitational self-coupling through the induced metric. In the static case one has the analog of the Laplace equation of Newtonian gravity (a toy illustration follows this list). One obtains the analog of gravitational radiation as "massless extremals" and also the analog of a spherically symmetric stationary metric.

    Blackholes would be modified. Besides the Schwarzschild horizon, which would differ from its GRT version, there would be a horizon where the signature changes. This would give rise to a layer structure at the surface of the blackhole.

  4. Concerning cosmology the hypothesis has been that RW cosmologies at the QFT limit can be modelled as vacuum extremals of Kähler action. This is admittedly an ad hoc assumption inspired by the idea that one has an infinitely long p-adic length scale, so that the cosmological constant - behaving like 1/p as a function of the p-adic length scale assignable to the volume term in the action - vanishes and leaves only Kähler action. This would predict that the cosmology with critical mass density is specified by a single parameter - its duration - as is also the over-critical cosmology. Only sub-critical cosmologies have infinite duration.

    One can look at the situation also at the fundamental level. The addition of the volume term implies that the only RW cosmology realizable as a minimal surface is the future light-cone of M4. This empty cosmology predicts a non-trivial but slightly too small redshift, just due to the fact that linear Minkowski time is replaced with the light-cone proper time, which is constant on the hyperboloids of M4+. Locally these space-time surfaces are however deformed by the addition of topologically condensed 3-surfaces representing matter. This gives rise to an additional gravitational redshift and the net cosmological redshift. This also explains why astrophysical objects do not participate in the cosmic expansion but only comove. They would have finite size and almost Minkowski metric.

    The gravitational redshift would be basically a kinematical effect. The energy and momentum of photons arriving from the source would be conserved, but the tangent space of the observer would be Lorentz-boosted with respect to that of the source, and this would cause the redshift.

    The very early cosmology could be seen as a gas of arbitrarily long cosmic strings in H (or M4) with 2-D M4 projection. The horizon would be infinite, and TGD strongly suggests that large values of heff make long range quantum correlations possible. The phase transition leading to the generation of space-time sheets with 4-D M4 projection would generate the many-sheeted space-time giving rise to GRT space-time at the QFT limit. This phase transition would be the counterpart of the inflationary period, and radiation would be generated in the decay of the cosmic string energy to particles.
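
As a concrete illustration of the static case mentioned in item 3, the Laplace equation can be solved by simple relaxation. A toy sketch in Python (grid size and boundary values are hypothetical):

    import numpy as np

    N = 50
    phi = np.zeros((N, N))
    phi[0, :] = 1.0                  # fixed potential on one boundary
    for _ in range(2000):            # Jacobi iteration towards the harmonic solution
        phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1]
                                  + phi[1:-1, 2:] + phi[1:-1, :-2])
    print(phi[N // 2, N // 2])       # interior value after relaxation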

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.