Monday, April 13, 2015

Manifest unitarity and information loss in gravitational collapse

There was a guest posting in the blog of Lubos by Prof. Dejan Stojkovic from the University at Buffalo. The title of the post was Manifest unitarity and information loss in gravitational collapse. It explained the contents of the article Radiation from a collapsing object is manifestly unitary by Stojkovic and Saini.

The posting

The posting describes calculations carried out for a collapsing spherical mass shell, whose radius approaches its own Schwarzschild radius. The metric outside the shell, with radius larger than rS, is assumed to be the Schwarzschild metric. In the interior of the shell the metric would be the Minkowski metric. The system considered is a second quantized massless scalar field. One can calculate the Hamiltonian of the radiation field in terms of eigenmodes of the kinetic and potential parts, and by canonical quantization the Schrödinger equation for the eigenmodes reduces to that for a harmonic oscillator with time dependent frequency. Solutions can be developed in terms of solutions for the time-independent harmonic oscillator. The average value of the photon number turns out to approach that associated with a thermal distribution, irrespective of the initial values, at the limit when the radius of the shell approaches its blackhole radius. The temperature is the Hawking temperature. This is of course a highly interesting result and should reflect the fact that the Minkowski vacuum looks, from the point of view of an accelerated system, to be in thermal equilibrium. Manifest unitarity is just what one expects.
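As a sanity check on the orders of magnitude involved, the Hawking temperature T_H = hbar c³/(8πGMk_B) and the Planckian occupation number that the shell's radiation approaches can be evaluated with standard constants (a textbook formula, not something specific to the paper discussed):

```python
import math

# Standard SI constants (CODATA values)
hbar, c, G, kB = 1.054571817e-34, 2.99792458e8, 6.67430e-11, 1.380649e-23
M_sun = 1.989e30  # kg

def hawking_temperature(M):
    """Hawking temperature (in kelvin) of a blackhole of mass M (in kg)."""
    return hbar * c**3 / (8 * math.pi * G * M * kB)

def thermal_occupation(omega, T):
    """Planckian photon number <n> = 1/(exp(hbar*omega/(kB*T)) - 1)."""
    return 1.0 / math.expm1(hbar * omega / (kB * T))

T = hawking_temperature(M_sun)
print(T)  # ~6.2e-8 K: tiny for stellar masses, far below the CMB temperature
```

The smallness of T for astrophysical masses is why the thermal character of the radiation is invisible observationally and only the conceptual problem of unitarity remains.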

The authors assign a density matrix to the state in the harmonic oscillator basis. Since the state is pure, the density matrix is just a projector to the quantum state, since the components of the density matrix are products of the coefficients characterizing the state in the oscillator basis (there are a couple of typos in the formulas, which the reader certainly notices). In Hawking's original argument the non-diagonal cross terms are neglected and one obtains a non-pure density matrix. The approach of the authors is of course correct since they consider only the situation before the formation of the horizon. Hawking considers the situation after the formation of the horizon and assumes some unspecified process taking the non-diagonal components of the density matrix to zero. This decoherence hypothesis is one of the strange figments of insane theoretical imagination that plague present-day theoretical physics.

The authors mention as a criterion for the purity of the state the condition that the square of the density matrix has trace equal to one. This states that the density matrix is an N-dimensional projector. The criterion alone does not however guarantee the purity of the state for N > 1. This is clear from the fact that the entropy is in this case non-vanishing and equal to log(N). I notice this because negentropic entanglement in the TGD framework corresponds to the situation in which the entanglement matrix is proportional to a unit matrix (that is, a projector). For this kind of states the number theoretic counterpart of Shannon entropy makes sense and gives a negative entropy, meaning that the entanglement carries information. Note that unitary 2-body entanglement gives rise to negentropic entanglement.
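The distinction can be made concrete with a few lines of linear algebra: for a normalized density matrix proportional to an N-dimensional projector, Tr(ρ²) equals 1/N rather than 1, and the von Neumann entropy equals log(N) (a generic quantum-information fact, not specific to TGD):

```python
import numpy as np

N, dim = 4, 8
P = np.zeros((dim, dim))
P[:N, :N] = np.eye(N)          # N-dimensional projector: P @ P == P
rho = P / N                    # normalized density matrix, Tr(rho) = 1

purity = np.trace(rho @ rho).real   # = 1/N, equals 1 only for N = 1
p = np.linalg.eigvalsh(rho)
entropy = -sum(x * np.log(x) for x in p if x > 1e-12)  # von Neumann entropy

print(purity, entropy)  # 0.25, log(4) ~ 1.386
```

So only a 1-dimensional projector corresponds to a pure state; for N > 1 the Shannon/von Neumann entropy log(N) is non-vanishing, which is the point made above.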

The authors inform that Hawking used Bogoliubov transformations between the initial Minkowski vacuum and the final Schwarzschild vacuum at the end of the collapse, which looks like a thermal distribution at Hawking temperature from the Minkowski space point of view. I think that here comes an essential physical point. The question is about the relationship between two observers - one might call them the observer falling into the blackhole and the observer far away approximating space-time with Minkowski space. If the latter observer traces out the degrees of freedom associated with the region below the horizon, the outcome is a genuine density matrix and information loss. This point is not discussed in the article, and the authors inform that their next project is to look at the situation after the spherical shell has reached the Schwarzschild radius and the horizon is born. One might say that all that is done concerns the system before the formation of the blackhole (if it is formed at all!).

Several poorly defined notions arise when one tries to interpret the results of the calculation.


  1. What do we mean by observer? What do we mean by information? For instance, the authors define information as the difference between maximum entropy and real entropy. Is this definition just an ad hoc manner to get some well-defined number christened as information? Can we really reduce the notion of information to thermodynamics? Shouldn't we be very careful in distinguishing between thermodynamical entropy and entanglement entropy? A sub-system possessing entanglement entropy with its complement can be purified by seeing it as a part of the entire system. This entropy relates to a pair of systems. Thermal entropy can be naturally assigned to an average representative of an ensemble and is a single particle observable.

  2. A second list of questions relates to quantum gravitation. Is blackhole really a relevant notion or just a singular outcome of a theory exceeding its limits? Does something deserving to be called blackhole collapse really occur? Is quantum theory in its recent form enough to describe what happens in this process or its analog? Do we really understand the quantal description of gravitational binding?

What TGD can say about blackholes?

The usual objection of the string theory hegemony is that there are no competing scenarios, so that superstring theory is the only "known" interesting approach to quantum gravitation (knowing in the academic sense is not at all the same thing as knowing in the naive layman sense and involves a lot of sociological factors transforming actual knowing to sociological unknowing: in some situations these sociological factors can make a scientist practically blind, deaf, and - as it looks - brainless!). I dare however claim that TGD represents an approach which leads to a new vision challenging a long list of cherished notions assigned with blackholes.

To my view blackhole science crystallizes a huge amount of conceptual sloppiness. People can calculate but are not so good at conceptualizing. Therefore one must start the conceptual cleaning from fundamental notions such as information, the notions of time (experienced and geometric), observer, etc. In an attempt to develop TGD from a bundle of ideas to a real theory I have been forced to carry out this kind of distillation, and the following tries to summarize the outcome.

  1. TGD provides a fundamental description for the notions of observer and information. The observer is replaced with "self", identified in ZEO as a sequence of quantum jumps occurring at the same boundary of the causal diamond (CD) and leaving it and the part of the zero energy state at it fixed, whereas the second boundary of the CD is delocalized: one has a superposition of CDs for which the average distance between the tips of the CDs increases. This gives rise to the experienced flow of time and its correlation with the flow of geometric time. The average size of the CDs simply increases, and this means that the experienced geometric time increases. The self "dies" as the first state function reduction to the opposite boundary takes place, and a new self assignable to the opposite boundary is born.

  2. Negentropy Maximization Principle favors the generation of entanglement negentropy. For states with a projection operator as density matrix the number theoretic negentropy is possible for primes dividing the dimension N of the projection and is maximal for the largest power of a prime factor of N. The second law is replaced with its opposite, but for negentropy, which is a two-particle observable rather than a single particle observable like thermodynamical entropy. The second law follows at the ensemble level from the non-determinism of the state function reduction alone.

The notions related to blackhole are also in need of profound reconsideration.

  1. The blackhole disappears in the TGD framework as a fundamental object and is replaced by a space-time region having Euclidian signature of the induced metric, identifiable as a wormhole contact, and defining a line of a generalized Feynman diagram (here "Feynman" could be replaced with "twistor" or "Yangian" or something even more appropriate). The blackhole horizon is replaced by the 3-D light-like region defining the orbit of the wormhole throat, having degenerate metric in the 4-D sense with signature (0,-1,-1,-1). The orbits of wormhole throats are carriers of various quantum numbers, and the sizes of their M4 projections are of order CP2 size in elementary particle scales. This is why I refer to these regions also as light-like parton orbits. The wormhole contacts involved connect two space-time sheets with Minkowskian signature, and stability requires that the wormhole contacts carry monopole magnetic flux. This demands at least two wormhole contacts to get closed flux lines. Elementary particles are this kind of pairs, but also multiples are possible, and valence quarks in baryons could be one example.

  2. The connection with the GRT picture could emerge as follows. The radial component of the Reissner-Nordström metric associated with electric charge can be deformed slightly at the horizon to transform the horizon to a light-like surface. In the deep interior CP2 would provide a gravitational instanton solution to the Maxwell-Einstein system with cosmological constant, thus having Euclidian metric. This is the nearest to the TGD description that one can get within the GRT framework, obtained from TGD at asymptotic regions by replacing the many-sheeted space-time with a slightly deformed region of Minkowski space and summing the gravitational fields of the sheets to get the gravitational field of the M4 region.

    All physical systems have space-time sheets with Euclidian signature analogous to blackholes. The analog of the blackhole horizon provides a very general definition of "elementary particle".

  3. The strong form of general coordinate invariance is a central piece of TGD and implies the strong form of holography, stating that partonic 2-surfaces and their 4-D tangent space data should be enough to code for quantum physics. The magnetic flux tubes and the fermionic strings assignable to them are however essential. The localization of the induced spinor fields to string world sheets follows from the well-definedness of em charge and also from number theoretical arguments, as well as from the generalization of twistorialization from D=4 to D=8.

    One also ends up with the analog of AdS/CFT duality applying to the generalization of conformal invariance in the TGD framework. This duality states that one can describe the physics in terms of Kähler action and related bosonic data or in terms of Kähler-Dirac action and related data. In particular, Kähler action is expressible as string world sheet area in the effective metric defined by the Kähler-Dirac gamma matrices. Furthermore, gravitational binding is describable by strings connecting partonic 2-surfaces. The hierarchy of Planck constants is absolutely essential for the description of gravitationally bound states in terms of gravitational quantum coherence in macroscopic scales. The proportionality of the string area in the effective metric to 1/heff², heff = n×h = hgr = GMm/v0, is absolutely essential for achieving this.

    If the stringy action were the ordinary area of the string world sheet as in string models, only gravitational bound states with size of order Planck length would be possible. Hence TGD forces one to say that superstring models are on a completely wrong track concerning the quantum description of gravitation. Even the standard quantum theory lacks something fundamental required by this goal. This something fundamental relates directly to the mathematics of extended superconformal invariance: these algebras allow an infinite number of fractal inclusion hierarchies in which the algebras are isomorphic with each other. This makes it possible to realize infinite hierarchies of quantum criticalities. As heff increases, some degrees of freedom are transformed from critical gauge degrees of freedom to genuine dynamical degrees of freedom, but the system is still critical, albeit in a longer scale.


  4. A naive model for the TGD analog of a blackhole is a macroscopic wormhole contact surrounded by elementary particle wormhole contacts, whose throats are connected to the throats of the large wormhole contact by flux tubes and strings. The macroscopic wormhole contact would carry magnetic charge equal to the sum of those associated with the elementary particle wormhole throats.

  5. What about blackhole collapse and blackhole evaporation if blackholes are replaced with wormhole contacts with Euclidian signature of metric? Do they have any counterparts in TGD? Maybe! Any phase transition increasing heff=hgr would occur spontaneously as a transition to lower criticality and could be interpreted as an analog of blackhole evaporation. The gravitationally bound object would just increase in size. I have proposed that this phase transition has happened for Earth (Cambrian explosion) and increased its radius by a factor of 2. This would explain the strange finding that the continents seem to fit nicely together if the radius of Earth is one half of its current value. These phase transitions would be the quantum counterpart of smooth classical cosmic expansion.

    The phase transition reducing heff would not occur spontaneously, and in living systems metabolic energy would be needed to drive it. Indeed, if heff=hgr=GMm/v0 increases as M and v0 change, also the gravitational Compton length Lgr=hgr/m=GM/v0 defining the size scale of the gravitationally bound object increases, so that the spontaneous increase of hgr means an increase of size.

    Does TGD predict any process resembling blackhole collapse? In Zero Energy Ontology (ZEO) state function reductions occurring at the same boundary of the causal diamond (CD) define the notion of self possessing an arrow of time. The first state function reduction at the opposite boundary is eventually forced by Negentropy Maximization Principle (NMP) and induces a reversal of the geometric time. The expansion of an object with a reversed arrow of geometric time with respect to the observer looks like a collapse. This is indeed what the geometry of the causal diamond suggests.

  6. The role of strings (and the magnetic flux tubes with which they are associated) in the description of gravitational binding (and possibly also other kinds of binding) is crucial in the TGD framework. They are present in arbitrarily long length scales, since the value of the gravitational Planck constant heff = hgr = GMm/v0, where the parameter v0 (v0/c < 1) has dimensions of velocity, can have huge values as compared with the ordinary Planck constant. This implies macroscopic quantum gravitational coherence, and the fountain effect of superfluidity could be seen as an example of this.

    The idea that the presence of flux tubes and strings serves as a correlate for quantum entanglement present in all scales is highly suggestive. This entanglement could be negentropic and, by NMP, could be transferred but not destroyed. The information would be coded in the relationship between the two gravitationally bound systems, and instead of entropy one would have enormous negentropy resources. Whether this information can be made conscious is a fascinating problem. Could one generalize interaction free quantum measurement so that it would give information about this entanglement? Or could the transfer of this information make it conscious?

    Also the superstring camp has become aware of the possibility of geometric and topological correlates of entanglement. The GRT based proposal relies on wormhole connections. The much older TGD based proposal, applied systematically in quantum biology and in the TGD inspired theory of consciousness, identifies magnetic flux tubes and the associated fermionic string world sheets as correlates of negentropic entanglement.
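The orders of magnitude appearing in heff = hgr = GMm/v0 above can be put in numbers. The sketch below uses the Earth's mass and a proton, and assumes for illustration the Nottale-type value v0/c ≈ 4.6×10⁻⁴ (an assumed value, not fixed by the text); note that the gravitational Compton length Lgr = hgr/m, written here in SI units as GM/(v0c), is independent of the particle mass m:

```python
G, c, hbar = 6.67430e-11, 2.99792458e8, 1.054571817e-34
M_earth = 5.972e24          # kg
m_p = 1.67262192e-27        # proton mass, kg
v0 = 4.6e-4 * c             # assumed illustrative value of the velocity parameter

h_gr = G * M_earth * m_p / v0   # gravitational Planck constant (J*s)
n = h_gr / hbar                 # heff/h: number of "dark" degrees, huge
L_gr = G * M_earth / (v0 * c)   # gravitational Compton length (m), m drops out

print(n)     # ~5e16: enormous compared to the ordinary Planck constant
print(L_gr)  # ~10 m: a macroscopic quantum gravitational length scale
```

The huge ratio heff/h is what makes macroscopic quantum gravitational coherence possible in this picture, and the m-independence of Lgr means all particles bound to the same mass M share the same coherence scale.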

Tuesday, March 31, 2015

Links to the latest progress in TGD

During the last years the understanding of the mathematical aspects of TGD and of its connection with the experimental world has developed rapidly. The material is scattered over 17 books about TGD and its applications, and therefore it seems appropriate to give an overall view about the developments as links (mostly) to blog postings containing links to my homepage. In the article The latest progress in TGD I list blog links and also some homepage links to Quantum TGD, its applications to physics, to biology and to consciousness theory, with the intention to give an overall view about the development of the ideas (I did not receive the final form of TGD from heaven and have been forced to work hard for almost four decades!).

Saturday, March 28, 2015

About Huygens Principle and TGD

Stephen asked an interesting question about the relationship of Huygens principle to TGD. The answer to the question became too long to serve as a comment, so I decided to add it as a blog posting.

1. Huygens Principle

Huygens principle can be assigned most naturally with classical linear wave equations with a source term. It applies also in perturbation theory involving small non-linearities.

One can solve the d'Alembert equation Box Φ = J with a source term J by inverting the d'Alembertian operator to get a bilocal function G(x,y), which one calls the Green function.

The Green function is a bilocal function G(x,y): it is the solution generated by an infinitely strong point source J localized at a single space-time point y (delta function is the technical term). This description allows one to think that every space-time point acts as a source for a spherical wave described by the Green function. The Green function is Lorentz invariant and satisfies causality: one selects the boundary conditions so that the signal is within the future light cone.
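The causality property just described can be checked in a toy setting: a finite-difference solution of the 1+1-D wave equation with a point source switched on at t = 0. With Courant number exactly one, the leapfrog scheme propagates the discrete signal at exactly the light velocity, so the field must vanish identically outside |x| ≤ ct (an illustrative numerical sketch, not part of the original argument):

```python
import numpy as np

c, dx = 1.0, 0.1
dt = dx / c                   # Courant number = 1: the 1D scheme is exact
nx, nt = 401, 150
x = (np.arange(nx) - nx // 2) * dx

phi = np.zeros(nx)
phi_old = np.zeros(nx)
J = np.zeros(nx)
J[nx // 2] = 1.0 / dx         # discrete delta-function source at x = 0

for _ in range(nt):
    phi_new = np.zeros(nx)
    # leapfrog update of Box(phi) = J in the interior of the grid
    phi_new[1:-1] = (phi[2:] + phi[:-2] - phi_old[1:-1]
                     + dt**2 * J[1:-1])
    phi_old, phi = phi, phi_new

t = nt * dt
outside = np.abs(x) > c * t + dx / 2   # points outside the future light cone
print(np.max(np.abs(phi[outside])))    # 0: no signal outside the cone
```

The field is non-zero only inside the cone |x| ≤ ct, which is exactly the retarded-boundary-condition statement made above for the Green function.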

There are many kinds of Green functions, and also the Feynman propagator satisfies the same equation. Now however causality in the naive sense is not exact for fermions. The distance between points x and y can also be space-like, but the breaking of causality is small. Feynman propagators form the basis of the QFT description, but now the situation is changing after what Nima et al have done to theoretical physics;-). Twistors are the tool also in TGD, but generalised to the 8-D case, and this generalisation - one of the big steps of progress in TGD - shows that M4×CP2 is twistorially completely unique.

2. What about Huygens principle and Green functions at the level of TGD space-time?

In TGD the classical field equations are extremely non-linear. Hence perturbation theory based on the Green function around a solution defined by canonically imbedded Minkowski space M4 in M4×CP2 fails. Even worse: the Green function would vanish identically, because Kähler action is non-vanishing only in fourth order for the perturbations of canonically imbedded M4! This total breakdown of perturbation theory forced me to forget the standard ways to quantise TGD, and I ended up with the world of classical worlds: the geometrization of the space of 3-surfaces. Later zero energy ontology emerged, and 3-surfaces were replaced by pairs of 3-surfaces at opposite boundaries of the causal diamond CD defining the portion of the imbedding space which can be perceived by a conscious entity in a given scale. The scale hierarchy is explicitly present.

Preferred extremals in space-time regions with Minkowskian signature of the induced metric decompose to topological light-rays, which behave like quanta of a massless radiation field. Massless extremals, for instance, are space-time tubes carrying a superposition of waves proceeding in the same light-like direction. Restricted superposition replaces superposition for a single space-time sheet, whereas unlimited superposition holds only for the effects caused by the space-time sheets on a test particle touching them simultaneously.

The shape of the radiation pulse is preserved, which means soliton like behaviour: the form of the pulse is preserved, the velocity of propagation is maximal, and the pulse is precisely targeted. The classical wave equation is "already quantized". This has very strong implications for communications and control in living matter. The GRT approximation of many-sheetedness of course masks all this beauty, as it masked also dark matter, and we see only some anomalies such as several light velocities for signals from SN1987A.

In geometric optics rays are a key notion. In TGD they correspond to light-like orbits of partonic 2-surfaces. The light-like orbit of a partonic 2-surface is a deformed analog of the light-cone boundary: the signature of the induced metric changes at it from Minkowskian to Euclidian. A partonic 2-surface need not expand like the sphere does for the ordinary light-cone. Strong gravitational effects make the signature of the induced 3-metric (0,-1,-1) at the orbit of the partonic 2-surface. There is a strong analogy with the Schwarzschild horizon but also differences: for the Schwarzschild blackhole the interior has Minkowskian signature.

3. What about the fermionic variant of Huygens principle?

In the fermionic sector spinors are localised at string world sheets and obey the Kähler-Dirac equation, which by conformal invariance is just what spinors obey in superstring models. Holomorphy in a hypercomplex coordinate gives the solutions in a universal form, which depends on the conformal equivalence class of the effective metric defined by the anti-commutators of the Kähler-Dirac gamma matrices at the string world sheet. Strings are associated with magnetic flux tubes carrying monopole flux, and it would seem that the cosmic web of these flux tubes defines the wiring along which fermions propagate.

The behavior of spinors at the 1-D light-like boundaries of string world sheets carrying fermion number has been a long-lasting headache. Should one introduce a Dirac type action at these lines? The twistor approach and Feynman diagrammatics suggest that the fundamental fermionic propagator should emerge from this action.

It finally turned out that one must assign a 1-D massless Dirac action in the induced metric and also its 1-D super counterpart, the line length, which however vanishes for the solutions. The solutions of the Dirac equation have 8-D light-like momentum assignable to the 1-D curves, which are 8-D light-like geodesics of M4×CP2. The 4-momentum of the fermion line is time-like or light-like, so that the propagation is inside the future light-cone rather than only along the future light-cone as in Huygens principle.

The propagation of fundamental fermions and of the elementary particles obtained as their composites is inside the future light-cone, not only along the light-cone boundary with light velocity. This reflects the presence of CP2 degrees of freedom directly and leads to massivation.

To sum up, a quantized form of Huygens principle - formulated statistically for fermionic lines at partonic 2-surfaces, for partonic 2-surfaces themselves, or for massless quanta realized as space-time regions - could hold true. The transition from TGD to the GRT limit by approximating many-sheeted space-time with a region of M4 should give Huygens principle. Especially interesting is the 8-D generalisation of Huygens principle, implying that the boundary of the 4-D future light-cone is replaced by its interior. The 8-D notion of twistor should be relevant here.

Sunday, March 22, 2015

Cell memory and magnetic body

In Ulla's "Thinking Allowed" there was a very interesting link to a popular article telling about a claimed discovery of a mechanism of cell memory. In the case considered, memory means the ability of mother and daughter cells to remember what was or what should be their identity as highly differentiated cells. In the division this information seems to be completely forgotten in the sense of standard biochemistry. How is it regained: this is the problem!

Transcription factor proteins bound to DNA guarantee that the cell expresses itself according to its differentiation. I have never asked myself what the mechanism of differentiation is (I should have a proper emoticon to describe this unpleasant feeling)! Transcription factors are it: they guarantee that the correct genes are expressed.

As the cell replicates, the transcription factors disappear temporarily but are restored in the mother and daughter cells later. How this happens looks like a complete mystery in ordinary biochemistry, in which one has a soup of all kinds of stupid molecules moving randomly around and making random collisions with similar idiots;-).

This problem is much more general and plagues all of biochemistry. How do just the correct randomly moving biomolecules of the dense molecular soup find each other - say, DNA and mRNA in the transcription process, and DNA and its conjugate in replication?

The TGD based answer is that the reacting molecules are connected or get connected by rather long magnetic flux tubes, which actually appear as pairs (this is not relevant for the argument). Then the magnetic flux tubes contract and force the reacting molecules close to each other. The contraction of the dark magnetic flux tube is induced by a reduction of the Planck constant heff=n×h: this does not occur spontaneously, since one ends up at a higher criticality. This conclusion follows by accepting the association of a hierarchy of criticalities with the hierarchy of Planck constants and a fractal hierarchy of symmetry breakings for what I call the supersymplectic algebra, possessing a natural conformal structure and a hierarchy of isomorphic sub-algebras for which the conformal weights of the original algebra are multiplied by the integer n characterizing the sub-algebra. Metabolic energy would be needed at this stage.

As a matter of fact, the general rule would be that anything requiring a reduction of Planck constant demands metabolic energy. Life can be seen as an endless attempt to get back to higher criticality against spontaneous drifting to lower criticality. Buddhists understood this a long time ago and talked about Karma's law: the purpose of life is to keep heff low and fight with all means to avoid spiritual awakening;-). In biological death we have again the opportunity to get rid of this cycle and get enlightened. Personally I do not dare to be optimistic;-).

In the case of cell replication also the transcription factors replicate just as the DNA does, but stay farther away from DNA during the replication: the value of heff has spontaneously increased during the replication period, as happens as a conscious entity "dies";-). When the time is ripe, their heff is reduced and they return to their proper place near DNA, and the fight with Karma's law continues again. Note that this also shows that death is not a real thing at the molecular level. The same should be true at the higher levels of the fractal self hierarchy.

Second quantisation of Kähler-Dirac action

Second quantization of Kähler-Dirac action is crucial for the construction of the Kähler metric of world of classical worlds as anticommutators of gamma matrices identified as super-symplectic Noether charges. To get a unique result, the anticommutation relations must be fixed uniquely. This has turned out to be far from trivial.

The canonical manner to second quantize fermions identifies the spinorial canonical momentum densities Πbar = ∂LKD/∂Ψ = ΨbarΓt and their conjugates. The vanishing of the Kähler-Dirac gamma matrix Γt at points where the induced Kähler form J vanishes can cause problems, since the anti-commutation relations are not internally consistent anymore. This led me to give up canonical quantization and to consider various alternatives consistent with the possibility that J vanishes. They were admittedly somewhat ad hoc. Correct (anti-)commutation relations for the various fermionic Noether currents seem however to fix the anti-commutation relations to the standard ones. It seems that it is better to be conservative: the canonical method is heavily tested and turns out to work quite nicely.

Consider first the 4-D situation without the localization to 2-D string world sheets. The canonical anti-commutation relations would state {Πbar, Ψ} = δ3(x,y) at the space-like 3-surfaces at either boundary of CD. At points where J, and thus the K-D gamma matrix Γt, vanishes, the canonical momentum density vanishes identically and the equation seems to be inconsistent.

If fermions are localized at string world sheets assumed to always carry a non-vanishing J at their boundaries at the ends of the space-time surface, the situation changes, since Γt is non-vanishing. The localization to string world sheets, which are not vacua, saves the situation. The remaining problem is that the limit at which the string approaches a vacuum could be very singular and discontinuous. In the case of elementary particles strings are associated with flux tubes carrying monopole fluxes, so that the problem disappears.

It is better to formulate the anti-commutation relations for the modes of the induced spinor field. By starting from

{Πbar (x),Ψ (y)}=δ1(x,y)

and contracting with Ψ(x) and Π(y) and integrating, one obtains, using the orthonormality of the modes of Ψ, the result

{bm, bbarn} = γ0 δm,n

holding for the modes with non-vanishing norm. At the limit J→ 0 there are no modes with non-vanishing norm so that one avoids the conflict between the two sides of the equation.

Quantum deformation introducing braid statistics is of considerable interest. Quantum deformations are an essentially 2-D phenomenon, and the condition that such a deformation indeed occurs gives further strong support for the localization of spinors at string world sheets. If the existence of anyonic phases is taken completely seriously, it supports the existence of the hierarchy of Planck constants and the TGD view about dark matter. Note that localization also at partonic 2-surfaces cannot be excluded yet.

I have wondered whether quantum deformation could relate to the hierarchy of Planck constants in the sense that n=heff/h corresponds to the value of deformation parameter q=exp(i2π/n). The quantum deformed anti-commutation relations

b b† + q-1 b†b = q-N

are obtained by posing the constraints that the eigenvalues of b†b and bb† are the q-integers Nq and (1-N)q. Here N=0,1 is the number of fermions in the mode (see this). The modification to the recent case is obvious.
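For the simplest case of a single fermionic mode the deformed relation can be verified directly with 2×2 matrices, taking q = exp(i2π/n) as suggested above (b is the annihilation operator, b† its conjugate, and N = diag(0,1) the number operator):

```python
import numpy as np

n = 5
q = np.exp(2j * np.pi / n)     # deformation parameter q = exp(i*2*pi/n)

b = np.array([[0, 1], [0, 0]], dtype=complex)   # annihilation operator
bd = b.conj().T                                 # creation operator b-dagger
N = np.diag([0.0, 1.0])                         # number operator: N = 0, 1

lhs = b @ bd + q**-1 * bd @ b
rhs = np.diag(q ** (-np.diag(N)))               # q^(-N) as a diagonal matrix

print(np.allclose(lhs, rhs))  # True: b b† + q^-1 b† b = q^-N holds
```

Both sides equal diag(1, q⁻¹), and q is an n:th root of unity, consistent with the proposed identification n = heff/h.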

What TGD is and what it is not

People, in particular those in the academy, tend to see TGD from their own perspective, often defined by a heavy specialization to a rather narrow discipline. This is of course understandable, but often leads to rather comic misunderstandings and considerable intellectual violence, and my heart is crying when I see how brilliant ideas are bleeding in the heavy grasp of big academic hands. The following is a humble attempt to express concisely what TGD is not and also what new TGD can give to physics - just to avoid more violence.

  1. TGD is not just General Relativity made concrete by using imbeddings: the 4-surface property is absolutely essential for unifying standard model physics with gravitation. The many-sheeted space-time of TGD gives rise to the GRT space-time, a slightly curved Minkowski space, only at the macroscopic limit. TGD is not a Kaluza-Klein theory, although color gauge potentials are analogous to the gauge potentials in these theories. TGD is not a particular string model, although string world sheets emerge in TGD very naturally as loci for spinor modes: their 2-dimensionality makes, among other things, possible a quantum deformation of quantization known to be physically realized in condensed matter, and conjectured in the TGD framework to be crucial for understanding the notion of finite measurement resolution. TGD space-time is 4-D, and its dimension is due to the completely unique conformal properties of 3-D light-like surfaces, implying an enormous extension of the ordinary conformal symmetries. TGD is not obtained by performing a Poincare gauging of space-time to introduce gravitation.

  2. In the TGD framework the counterparts of also the ordinary gauge symmetries are assigned to a super-symplectic algebra, which is a generalization of Kac-Moody algebras rather than a gauge algebra, and possesses a fractal hierarchy of symmetry breakings defining a hierarchy of criticalities. TGD is not one more quantum field theory like structure based on the path integral formalism: the path integral is replaced with a functional integral over 3-surfaces, and the notion of classical space-time becomes an exact part of the theory. Quantum theory becomes formally a purely classical theory of WCW spinor fields: only state function reduction is something genuinely quantal.

  3. TGD is in some sense an extremely conservative geometrization of entire quantum physics: no additional structures such as torsion or gauge fields as independent dynamical degrees of freedom are introduced: Kähler geometry and the associated spinor structure are enough. Twistor space emerges as a technical tool and its Kähler structure is possible only for H=M4× CP2. What is genuinely new is the infinite-dimensional character of the Kähler geometry making it highly unique, and its generalization to p-adic number fields to describe correlates of cognition. Also the hierarchies of Planck constants heff=n× h and p-adic length scales and Zero Energy Ontology represent something genuinely new.

Friday, March 20, 2015

Could the Universe be doing Yangian quantum arithmetics?

One of the old TGD inspired really crazy ideas about scattering amplitudes is that the Universe is doing some sort of arithmetics so that scattering amplitudes are representations for computational sequences of minimum length. The idea is so crazy that I had even given up its original form, which led to an attempt to assimilate the basic ideas about bi-algebras, quantum groups, Yangians and related exotic things. The work with the twistor Grassmannian approach inspired me to reconsider the original idea seriously, now with the idea that the super-symplectic Yangian could define the arithmetics. I try to describe the background, the motivation, and the ensuing reckless speculations in the following.

Do scattering amplitudes represent quantal algebraic manipulations?

  1. It seems that tensor product ⊗ and direct sum ⊕ - very much analogous to product and sum but defined between Hilbert spaces rather than numbers - are naturally associated with the basic vertices of TGD. I have written about this a chapter which is highly speculative both mathematically and physically.

    1. In the ⊗ vertex a 3-surface splits into two 3-surfaces, meaning that the two "incoming" 4-surfaces meet at a single common 3-surface and continue as the outgoing 4-surface: 3 lines of a Feynman diagram meeting at their ends. This has a lower-dimensional shadow realized for partonic 2-surfaces. This topological 3-particle vertex would be a higher-D variant of the 3-vertex of Feynman diagrams.

    2. The second vertex is the trouser vertex for strings generalized so that it applies to 3-surfaces. It does not represent particle decay as in string models but the branching of the particle wave function so that the particle can be said to propagate along two different paths simultaneously. In the double slit experiment this would occur for the photon space-time sheets.
  2. The idea is that the Universe is doing arithmetics of some kind in the sense that the particle 3-vertex in the above topological sense represents either multiplication or its time-reversal, co-multiplication.
The product, call it •, can be something very general, say an algebraic operation assignable to some algebraic structure. The algebraic structure could be almost anything: a random list of structures popping into mind consists of group, Lie algebra, super-conformal algebra, quantum algebra, Yangian, and so on. The algebraic operation • can be group multiplication, Lie bracket, its generalization to super-algebra level, etc. Tensor product and thus linear (Hilbert) spaces are always involved, and in the product operation the tensor product ⊗ is replaced with •.
  1. The product Ak⊗ Al→ C= Ak• Al is analogous to a particle reaction in which particles Ak and Al fuse to the particle C. One can say that ⊗ between the reactants is transformed to • in the particle reaction: a kind of bound state is formed.

  2. There are very many pairs Ak, Al giving the same product C, just as a given integer can be decomposed in many ways into a product of two integers if it is not prime. This of course suggests that elementary particles are primes of the algebra, if this notion can be defined for it! Using some basis for the algebra one has C=Ak• Al= fklmAm, where fklm are the structure constants of the algebra and satisfy constraints. For instance, associativity A(BC)=(AB)C is a constraint making the life of the algebraist more tolerable and is almost routinely assumed.

    For instance, in the number theoretic approach to TGD associativity is proposed to serve as a fundamental law of physics and allows one to identify space-time surfaces as 4-surfaces with associative (quaternionic) tangent space or normal space at each point of the octonionic imbedding space M4× CP2. Lie algebras are not associative but the Jacobi identities following from the associativity of the Lie group product replace associativity.

  3. The co-product can be said to be the time reversal of the algebraic operation •. It can be defined as Ak→ ∑lm fklmAl⊗ Am: one obtains a quantum superposition of final states which can fuse back to Ak (Al⊗ Am→ C=Al• Am is possible). One can say that • is replaced with ⊗: the bound state decays to a superposition of all pairs which can form the bound state via the product vertex.
There are motivations for representing scattering amplitudes as sequences of algebraic operations performed for the incoming set of particles and leading to an outgoing set of particles, with particles identified as algebraic objects acting on the vacuum state. The outcome would be analogous to Feynman diagrams but only the diagram with minimal length, to which a preferred extremal can be assigned, is needed. Larger diagrams must be equivalent with it.

The question is whether it could indeed be possible to characterize particle reactions as computations involving the transformation of tensor products to products in vertices and of co-products to tensor products in co-vertices (time reversals of the vertices). A couple of examples gives some idea about what is involved.

  1. The simplest operations would preserve particle number and just permute the particles: the permutation generalizes to a braiding, and the scattering matrix would basically be the unitary braiding matrix utilized in topological quantum computation.

  2. A more complex situation occurs when the number of particles is preserved but the quantum numbers of the final state are not the same as those of the initial state, so that the particles must interact. This requires both product and co-product vertices. For instance, Ak⊗ Al→ fklmAm followed by Am→ fmrsAr⊗ As gives Ak⊗ Al→ fklmfmrsAr⊗ As representing 2-particle scattering. State function reduction in the final state can select any pair Ar⊗ As. This reaction is characterized by the ordinary tree diagram in which two lines fuse to a single line and de-fuse back to two lines. Note also that there is a non-deterministic element involved: a given final state can be achieved from a given initial state after a large enough number of trials. The analogy with problem solving and mathematical theorem proving is obvious. If the interpretation is correct, the Universe would be a problem solver and theorem prover!

  3. More complex reactions affect also the particle number. The 3-vertex and its co-vertex are the simplest examples and generate more complex particle-number-changing vertices. For instance, in the twistor Grassmann approach one can construct all diagrams using two 3-vertices. This encourages the restriction to 3-vertices (recall that fermions have only 2-vertices).

  4. Intuitively it is clear that the final collection of algebraic objects can be reached in a large - maybe infinite - number of ways. It seems also clear that there is a shortest manner to end up to the final state from a given initial state. Of course, it can happen that there is no way to achieve it! For instance, if • corresponds to group multiplication, the co-vertex can lead only to a pair of particles for which the product of the final state group elements equals the initial state group element.

  5. Quantum theorists of course worry about unitarity. How can one avoid the situation in which the product gives zero when the outcome is an element of a linear space? Somehow the product should be such that this can be avoided. For instance, if the product is the Lie-algebra commutator, the Cartan algebra would give zero as the outcome.
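As a toy illustration of points 2-5 above, one can realize the product and co-product numerically for the simplest possible choice of algebra, su(2), whose structure constants fklm are given by the Levi-Civita symbol. The choice of algebra is of course only illustrative, not part of TGD proper:

```python
import numpy as np

# Structure constants f_klm of su(2): the totally antisymmetric Levi-Civita symbol.
f = np.zeros((3, 3, 3))
for (k, l, m) in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    f[k, l, m] = 1.0
    f[k, m, l] = -1.0

def product(k, l):
    """Vertex A_k (x) A_l -> sum_m f_klm A_m: amplitude vector over m."""
    return f[k, l, :]

def coproduct(m):
    """Co-vertex A_m -> sum_{r,s} f_mrs A_r (x) A_s: amplitude matrix over (r, s)."""
    return f[m, :, :]

def two_particle_scattering(k, l):
    """Tree diagram of point 2: product followed by co-product,
    A_k (x) A_l -> sum_{m,r,s} f_klm f_mrs A_r (x) A_s."""
    return np.einsum('m,mrs->rs', product(k, l), f)

amp = two_particle_scattering(0, 1)
print(amp)            # non-zero amplitude only for the pairs (A_0, A_1) and (A_1, A_0)
print(product(0, 0))  # commutator product of a generator with itself vanishes
```

The last line exhibits the worry of point 5 concretely: with the commutator as product, the vertex amplitude vanishes identically along Cartan directions.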

Generalized Feynman diagram as shortest possible algebraic manipulation connecting initial and final algebraic objects

There is a strong motivation for the interpretation of generalized Feynman diagrams as shortest possible algebraic operations connecting initial and final states. The reason is that in TGD one does not have a path integral over all possible space-time surfaces connecting the 3-surfaces at the ends of CD. Rather, one has in the optimal situation a space-time surface, unique apart from conformal gauge degeneracy, connecting the 3-surfaces at the ends of CD (they can have disjoint components).

The path integral is replaced with an integral over 3-surfaces. There is therefore only a single minimal generalized Feynman diagram (or twistor diagram, or whatever is the appropriate term). It would be nice if this diagram had an interpretation as the shortest possible computation leading from the initial state to the final state specified by the 3-surfaces and basically fermionic states at them. This would of course simplify the theory enormously, and the connection to the twistor Grassmann approach is very suggestive. A further motivation comes from the observation that the state basis created by the fermionic Clifford algebra has an interpretation in terms of Boolean quantum logic and that in ZEO the fermionic states would have an interpretation as analogs of Boolean statements A→ B.

To see whether and how this idea could be realized in the TGD framework, let us try to find the counterparts of the basic operations ⊗ and • and to identify the algebra involved. Consider first the basic geometric objects.

  1. Tensor product could correspond geometrically to two disjoint 3-surfaces representing the two particles. Partonic 2-surfaces associated with a given 3-surface represent a second possibility. The splitting of a partonic 2-surface into two could be the geometric counterpart for the co-product.

  2. Partonic 2-surfaces are however connected to each other and possibly even to themselves by strings. It seems that a partonic 2-surface cannot be the basic unit. Indeed, elementary particles are identified as pairs of wormhole throats (partonic 2-surfaces) with magnetic monopole flux flowing from throat to throat along the first space-time sheet, then through a wormhole contact to the second sheet, back along the second sheet to the lower throat of the first contact, and then back to the first throat. This unit seems to be the natural basic object to consider. The flux tubes at both sheets are accompanied by fermionic strings. Whether also wormhole throats contain strings so that one would have a single closed string rather than two open ones, is an open question.

  3. The connecting strings give rise to the formation of gravitationally bound states, and the hierarchy of Planck constants is crucially involved. For an elementary particle there are just two wormhole contacts, each involving two wormhole throats connected by the contact. The wormhole throats are connected by one or more strings, which define the space-like boundaries of the corresponding string world sheets at the boundaries of CD. These strings are responsible for the formation of bound states, even macroscopic gravitational bound states.
Super-symplectic Yangian would be a reasonable guess for the algebra involved.
  1. The 2-local generators of the Yangian would be of the form TA1= fABCTB⊗ TC, where fABC are the structure constants of the super-symplectic algebra. n-local generators would be obtained by iterating this rule. Note that the generator TA1 creates an entangled state of TB and TC with fABC as the entanglement coefficients. TAn is an entangled state of TBn-1 and TC with the same coefficients. A kind of replication of TAn-1 is clearly involved, and the fundamental replication is that of TA. Note that one can start from any irreducible representation with well-defined symplectic quantum numbers and form a similar hierarchy by using TA and the representation as a starting point.

    That the hierarchy TAn and the hierarchies of irreducible representations would define a hierarchy of states associated with the partonic 2-surface is a highly non-trivial and powerful hypothesis about the formation of many-fermion bound states inside partonic 2-surfaces.

  2. The charges TA correspond to fermionic and bosonic super-symplectic generators. The geometric counterpart for the replication at the lowest level could correspond to a fermionic/bosonic string carrying a super-symplectic generator splitting into a fermionic/bosonic string and a string carrying a bosonic symplectic generator TA. This splitting of a string brings in mind the basic gauge boson-gauge boson or gauge boson-fermion vertex.

    The vision about the emission of a virtual particle suggests that the entire wormhole contact pair replicates. The second wormhole throat would naturally carry the string corresponding to TA assignable to a gauge boson. TA should involve pairs of fermionic creation and annihilation operators as well as pairs of fermionic and anti-fermionic creation operators (and annihilation operators) as in quantum field theory.

  3. Bosonic emergence suggests that bosonic generators are constructed from fermion pairs with fermion and anti-fermion at opposite wormhole throats: this would allow one to avoid the problems with the singular character of a purely local fermion current. The fermionic and anti-fermionic strings would reside at opposite space-time sheets and the whole structure would correspond to a closed magnetic tube carrying monopole flux. Fermions would correspond to superpositions of states in which the string is located at either half of the closed flux tube.

  4. The basic arithmetic operation in a co-vertex would be the co-multiplication transforming TAn to TAn+1 = fABCTBn ⊗ TC. In a vertex the transformation of TAn+1 to TAn would take place. The interpretation would be as emission/absorption of a gauge boson. One must include also the emission of a fermion, and this means the replacement of TA with the corresponding fermionic generator FA, so that the fermion number of the second part of the state is reduced by one unit. Particle reactions would be more than mere braidings and re-groupings of fermions and anti-fermions inside partonic 2-surfaces, which can split.

  5. Inside the light-like orbits of the partonic 2-surfaces there is also a braiding affecting the M-matrix. The arithmetics involved would therefore be essentially that of measuring and "co-measuring" symplectic charges.

    Generalized Feynman diagrams (preferred extremals) connecting given 3-surfaces and many-fermion states (bosons are counted as fermion-anti-fermion states) would have a minimum number of vertices and co-vertices. The splitting of string lines implies the creation of pairs of fermion lines. Whether re-groupings are part of the story is not quite clear. In any case, without the replication of 3-surfaces it would not be possible to understand processes like e-e scattering by photon exchange in the proposed picture.
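The iterative rule TAn → TAn+1 = fABC TBn ⊗ TC of point 4 is easy to make concrete in a toy setting. The sketch below again uses the su(2) structure constants as a stand-in for the super-symplectic algebra (an illustrative assumption, not a claim about TGD) and represents the family of generators TAn as an array with one index for the label A followed by n+1 tensor slots:

```python
import numpy as np

# Structure constants of su(2) standing in for the super-symplectic algebra.
f = np.zeros((3, 3, 3))
for (a, b, c) in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    f[a, b, c], f[a, c, b] = 1.0, -1.0

def comultiply(T):
    """One co-multiplication step: T^A_{n+1} = f_ABC T^B_n (x) T^C.
    T has shape (dim, dim, ..., dim): label A followed by n+1 tensor slots."""
    return np.einsum('abc,b...->a...c', f, T)

# Level 0: T^A is the basis vector e_A, so the family is the identity matrix.
T0 = np.eye(3)
T1 = comultiply(T0)   # 2-local generators: T^A_1 = f_ABC T^B (x) T^C
T2 = comultiply(T1)   # 3-local generators, and so on up the hierarchy

# T^A_1 is a genuinely entangled state of T^B and T^C: the coefficient
# matrix f[0] has rank 2, so it admits no product-state decomposition.
print(np.linalg.matrix_rank(T1[0]))
```

The vertex of point 4 would simply run `comultiply` in reverse, contracting one tensor slot away with the structure constants.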

This was not the whole story yet

The proposed amplitude represents only the value of the WCW spinor field for a single pair of 3-surfaces at the opposite boundaries of a given CD. Hence the Yangian construction does not tell the whole story.

  1. The Yangian algebra would give only the vertices of the scattering amplitudes. On the basis of previous considerations, one expects that each fermion line carries a propagator defined by 8-momentum. The structure would resemble that of a super-symmetric YM theory. Fermionic propagators should emerge from summing over intermediate fermion states in various vertices, and one would have integrations over virtual momenta, which are carried out as residue integrations in the twistor Grassmann approach. The 8-D counterpart of twistorialization would apply.

  2. The super-symplectic Yangian would give the scattering amplitudes for a single space-time surface, and the purely group theoretical form of these amplitudes gives hopes about the independence of the scattering amplitude of the pair of 3-surfaces at the ends of CD near the maximum of the Kähler function. This is perhaps too much to hope for, except approximately, but if true, the integration over WCW would give only the exponent of Kähler action, since the metric determinant and the poorly defined Gaussian determinant would cancel each other by the basic properties of the Kähler metric. The exponent would give a non-analytic dependence on αK.

    The Yangian supercharges are proportional to 1/αK since the covariant Kähler-Dirac gamma matrices are proportional to the canonical momentum currents of Kähler action and thus to 1/αK. Perturbation theory in powers of αK= gK2/4π hbareff is possible after factorizing out the exponent of the vacuum functional at the maximum of the Kähler function and the factors 1/αK multiplying the super-symplectic charges.

    An additional complication is that the characteristics of the preferred extremals contributing significantly to the scattering amplitudes are expected to depend on the value of αK through quantum interference effects. Kähler action is proportional to 1/αK. The analog of AdS/CFT correspondence states the expressibility of the Kähler function in terms of the string area in the effective metric defined by the anti-commutators of the K-D gamma matrices. Interference effects eliminate string lengths for which the area action has a value considerably larger than one, so that the string length and thus also the minimal size of the CD containing it scales as heff. Quantum interference effects therefore give an additional dependence of the Yangian super-charges on heff, leading to a perturbative expansion in powers of αK although the basic expression for the scattering amplitude would not suggest this.

See the chapter Classical part of the twistor story or the article Classical part of the twistor story.

Wednesday, March 18, 2015

Hierarchies of conformal symmetry breakings, Planck constants, and inclusions of hyperfinite factors of type II1

The basic almost-prediction of TGD is a fractal hierarchy of breakings of symplectic symmetry acting as a gauge symmetry.

It is good to first summarize briefly the basic facts about the symplectic algebra assigned with δ M4+/-× CP2.

  1. The symplectic algebra has the structure of a Virasoro algebra with respect to the light-like radial coordinate rM of the light-cone boundary taking the role of the complex coordinate of ordinary conformal symmetry. The Hamiltonians generating symplectic symmetries can be chosen to be proportional to functions fn(rM). What the natural choice for fn(rM) is, is not quite clear. Ordinary conformal invariance would suggest fn(rM)=rMn. A more adventurous possibility is that the algebra is generated by Hamiltonians with fn(rM)= rM-s, where s is a root of Riemann Zeta so that one has either s=1/2+iy (roots on the critical line) or s=-2n, n>0 (roots on the negative real axis).

  2. The set of conformal weights would be a linear space spanned by combinations of all roots with integer coefficients: s= n + iy, y=∑ niyi, n> -n0, where -n0≤ 0 is the negative ground state conformal weight. Mass squared is proportional to the total conformal weight and must be real, demanding y=∑ yi=0 for physical states: I call this conformal confinement, analogous to color confinement. One could even consider introducing the analog of binding energy as "binding conformal weight".

    Mass squared must also be non-negative (no tachyons), giving n0≥ 0. The generating conformal weights however have the negative real part -1/2 and are thus tachyonic. Rather remarkably, p-adic mass calculations force one to assume a negative half-integer valued ground state conformal weight. This, plus the fact that the zeros of Riemann Zeta have indeed been assigned with critical systems, forces one to take the Riemannian variant of the conformal weight spectrum seriously. The algebra allows also now an infinite hierarchy of conformal sub-algebras with weights coming as n-multiples of the conformal weights of the entire algebra.

  3. The outcome would be an infinite number of hierarchies of symplectic conformal symmetry breakings. Only the generators of the sub-algebra of the symplectic algebra with radial conformal weight proportional to n would act as gauge symmetries at a given level of the hierarchy. In the hierarchy ni divides ni+1. In the symmetry breaking ni→ ni+1 the conformal charges, which vanished earlier, would become non-vanishing. Gauge degrees of freedom would transform to physical degrees of freedom.

  4. What about the conformal Kac-Moody algebras associated with the spinor modes? It seems that in this case one can assume that the conformal gauge symmetry is exact, just as in string models.

The natural interpretation of the conformal hierarchies ni→ ni+1 would be in terms of increasing measurement resolution.

  1. Conformal degrees of freedom below the measurement resolution would be gauge degrees of freedom and correspond to generators with conformal weight proportional to ni. The conformal hierarchies and the associated hierarchies of Planck constants and n-fold coverings of the space-time surface connecting the 3-surfaces at the ends of the causal diamond would give a concrete realization of the inclusion hierarchies for hyper-finite factors of type II1.

    ni could correspond to the integer labelling Jones inclusions, associated with the quantum group phase factor Un=exp(i2π/n), n≥ 3, and the index of inclusion given by |M:N| = 4cos2(π/n), defining the fractal dimension assignable to the degrees of freedom above the measurement resolution. The sub-algebra with weights coming as n-multiples of the basic conformal weights would act as gauge symmetries, realizing the idea that these degrees of freedom are below the measurement resolution.

  2. If heff =n× h defines the conformal gauge sub-algebra, the improvement of the resolution would scale up the Compton scales and would quite concretely correspond to a zoom analogous to that done for the Mandelbrot fractal to make new details visible. From the point of view of cognition the improving resolution would fit nicely with the recent view about heff/h as a kind of intelligence quotient.

    This interpretation might make sense for the symplectic algebra of δ M4+/- × CP2, for which the light-like radial coordinate rM of the light-cone boundary takes the role of the complex coordinate. The reason is that the symplectic algebra acts as isometries.

  3. If Kähler action has vanishing total variation under the deformations defined by the broken conformal symmetries, the corresponding conformal charges are conserved. The components of the WCW Kähler metric expressible in terms of second derivatives of the Kähler function can however be non-vanishing and have also components which correspond to WCW coordinates associated with different partonic 2-surfaces. This conforms with the idea that the conformal algebras extend to Yangian algebras generalizing the Yangian symmetry of N =4 supersymmetric gauge theories. For the deformations defined by symplectic transformations acting as gauge symmetries the second variation vanishes and there is no contribution to the WCW Kähler metric.

  4. One can interpret the situation also in terms of consciousness theory. The larger the value of heff, the lower the criticality, and the more sensitive the measurement instrument, since new degrees of freedom become physical and the resolution improves. In the p-adic context large n means a better resolution in angle degrees of freedom by introducing the phase exp(i2π/n) to the algebraic extension, and a better cognitive resolution. Also the emergence of negentropic entanglement characterized by an n× n unitary matrix with density matrix proportional to the unit matrix means higher level conceptualization with more abstract concepts.
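The Jones index series mentioned in point 1 can be tabulated directly; in its standard form the index is |M:N| = 4cos²(π/n) for n ≥ 3, a discrete series of fractal dimensions increasing from 1 toward the continuum value 4:

```python
import math

def jones_index(n):
    """Jones index |M:N| = 4*cos^2(pi/n) of the inclusion labelled by n >= 3."""
    return 4.0 * math.cos(math.pi / n) ** 2

indices = [jones_index(n) for n in range(3, 9)]
print([round(v, 4) for v in indices])   # [1.0, 2.0, 2.618, 3.0, 3.247, 3.4142]
```

Note that the value for n=5 is the square of the golden ratio, and that the series never reaches 4: the fractal dimension assignable to the degrees of freedom above the measurement resolution stays below that of the full quantum algebra.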

The extension of the super-conformal algebra to a larger Yangian algebra is highly suggestive and gives an additional aspect to the notion of measurement resolution.
  1. The Yangian would be generated from the algebra of super-conformal charges assigned with the point pairs belonging to two partonic 2-surfaces as stringy Noether charges assignable to strings connecting them. For the super-conformal algebra associated with a pair of partonic 2-surfaces there is only a single string connecting them. This measurement resolution is almost the poorest possible (no strings at all would mean no measurement resolution at all!).

  2. The situation improves if one has a collection of strings connecting a set of points of a partonic 2-surface to other partonic 2-surface(s). This requires a generalization of the super-conformal algebra in order to get the appropriate mathematics. Tensor powers of single-string super-conformal charge spaces are obviously involved, and the extended super-conformal generators must be multi-local and carry multi-stringy information about the physics.

  3. The generalization at the first step is simple and based on the idea that the co-product is the "time inverse" of the product, assigning to a single generator a sum of tensor products of generators giving rise to the generator via the commutator. The outcome would be expressible using the structure constants of the super-conformal algebra, schematically Q1A= fABCQB⊗ QC. Here QB and QC are super-conformal charges associated with separate strings so that 2-local generators are obtained. One can iterate this construction and get a hierarchy of n-local generators involving products of n stringy super-conformal charges. The larger the value of n, the better the resolution, and the more information is coded to the fermionic state about the partonic 2-surface and the 3-surface. This affects the space-time surface and hence the WCW metric but not the 3-surface, so that the interpretation in terms of improved measurement resolution makes sense. This super-symplectic Yangian would be behind the quantum groups and Jones inclusions in the TGD Universe.

  4. n gives also the number of space-time sheets in the singular covering. One possible interpretation is in terms of a measurement resolution for counting the number of space-time sheets. Our recent quantum physics would only see a single space-time sheet representing visible matter, and dark matter would become visible only for n>1.


It is not an accident that quantum phases are assignable to Yangian algebras, to quantum groups, and to inclusions of HFFs. The new deep notions added to this existing complex of high level mathematical concepts are the hierarchy of Planck constants, the dark matter hierarchy, the hierarchy of criticalities, and negentropic entanglement representing physical notions. All these aspects represent new physics.

Tuesday, March 17, 2015

Is the view about evolution as approach away from criticality consistent with biology?

The naive idea would be that living systems are thermodynamically critical so that life would be an inherently unstable phenomenon. One can find support for this view: for instance, living matter as we know it functions in a rather narrow temperature range. In this picture the problem is how the emergence of life is possible at all.

TGD suggests a different view. Evolution corresponds to the transformation of gauge degrees of freedom to dynamical ones and leads away from quantum criticality rather than towards it. Which view is correct?

The argument below supports the view that evolution indeed involves a spontaneous drift away from maximal quantum criticality. One cannot however avoid the feeling that a paradox is present.

  1. Maybe the crux of the paradox is that quantum criticality relies on NMP whereas thermodynamical criticality relies on the second law, which follows from NMP at the ensemble level, at least for ordinary entanglement (as opposed to negentropic one). Quantum criticality is geometric criticality of preferred extremals, whereas thermodynamical criticality is criticality against the first state function reduction at the opposite boundary of CD inducing decoherence and the "death" of the self defined by the sequence of state function reductions at a fixed boundary of CD. NMP would be behind both criticalities: it would stabilize the self and eventually force the first quantum jump killing the self.

  2. Perhaps the point is that living systems are able to stay around both thermodynamical and quantum criticalities. This would make them flexible and sensitive. And indeed, the first quantum jump has an interpretation as a correlate for volitional action at some level of the self hierarchy. Consciousness involves passive and active aspects: periods of repeated state function reductions and acts of volition. The basic applications of the hierarchy of Planck constants to biology indeed involve heff changing phase transitions in both directions: for instance, molecules are able to find each other by an heff reducing phase transition of the connecting magnetic flux tubes bringing them near to each other.

The attempt to understand cosmological evolution in terms of the hierarchy of Planck constants demonstrates that the view of evolution as a spontaneous drift away from maximal quantum criticality is feasible.
  1. In primordial cosmology one has a gas of cosmic strings X2× Y2⊂ M4× CP2. If they behave deterministically, as it seems, their symplectic symmetries are fully dynamical and cannot act as gauge symmetries. This would suggest that they are not quantum critical, and that cosmic evolution leading to the thickening of the cosmic strings would be towards criticality, contrary to the general idea.

    Here one must however be extremely cautious: are cosmic strings really maximally non-critical? The CP2 projection of a cosmic string can be any holomorphic 2-surface in CP2 and there could be criticality against transitions changing a geodesic sphere to a more general holomorphic 2-surface. There is also a criticality against transitions making the M4 projection 4-dimensional. The hierarchy of Planck constants could be assignable to the resulting magnetic flux tubes.

    In TGD inspired biology magnetic flux tubes are indeed carriers of large heff phases. That cosmic strings are actually critical is also supported by the fact that it does not make sense to assign an infinite value of heff, and therefore a vanishing value of αK, to cosmic strings, since Kähler action would become infinite. The assignment of large heff to cosmic strings does not seem a good idea since there are no gravitationally bound states yet, only a gas of cosmic strings in M4× CP2.

    Cosmic strings allow conformal invariance. Does this conformal invariance act as gauge symmetries or as dynamical symmetries? Quantization of ordinary strings would suggest the interpretation of super-conformal symmetries as gauge symmetries. It however seems that the conformal invariance of standard strings corresponds to that associated with the modes of the induced spinor field, and this would indeed be a full gauge invariance. What matters are however the symplectic conformal symmetries - something new and crucial for the TGD view. The non-generic character of the 2-D M4 projection suggests that a sub-algebra of the symplectic conformal symmetries increasing the thickness of the M4 projection of the string acts as gauge symmetries (the Hamiltonians would be products of S2 and CP2 Hamiltonians). The most plausible conclusion is that cosmic strings recede from criticality as their thickness increases.

  2. Cosmic strings are not the only objects involved. Space-time sheets are generated during inflationary period and cosmic strings topologically condense at them creating wormhole contacts and begin to expand to magnetic flux tubes with M4 projection of increasing size. Ordinary matter is generated in the decay of the magnetic energy of cosmic strings replacing the vacuum energy of inflaton fields in inflationary scenarios.

    M4 and CP2 type vacuum extremals are certainly maximally critical by their non-determinism and symplectic conformal gauge invariance is maximal for them. During later stages gauge degrees of freedom would transform to dynamical ones. The space-time sheets and wormhole contacts would also drift gradually away from criticality so that also their evolution conforms with the general TGD view.

    Cosmic evolution would thus reduce criticality and would be spontaneous (NMP). The analogy would be provided by the evolution of cell from a maximally critical germ cell to a completely differentiated outcome.

  3. There is however a paradox lurking here. A thickening cosmic string should gradually approach M4 type vacuum extremals as the density of matter is reduced in the expansion. Could the drift away from criticality transform to an approach towards it? The geometry of CD involves the analogs of both Big Bang and Big Crunch. Could it be that the eventual turning of expansion to contraction allows one to circumvent the paradox? Is the crux of the matter the fact that thickened cosmic strings already have a large value of heff, meaning that they are n-sheeted objects unlike the M4 type vacuum extremals?

    Could NMP force the first state function reduction to the opposite boundary of CD when the expansion inside CD turns to contraction at the space-time level, the contraction being experienced as expansion since the arrow of time changes? Note that at the imbedding space level the size of CD increases all the time. Could the ageing and death of living systems be understood by using this analogy? Could the quantum jump to the opposite boundary of CD be seen as a kind of reincarnation allowing the increase of heff and conscious evolution to continue as NMP demands? The first quantum jump would also generate entropy, and thermodynamical criticality could be criticality against its occurrence. This interpretation of thermodynamical criticality would mean that living systems by definition live at the borderline of life and death!