https://matpitka.blogspot.com/2015/08/

Monday, August 31, 2015

Evidence of ancient life discovered in mantle rocks deep below the seafloor

A physicalist who has learned his lessons sees life, evolution, the generation of the genetic code, etc. as random thermal fluctuations. Empirical facts suggest that the situation is just the opposite. The emergence of life seems to be unavoidable, but water seems to be a prerequisite for it. Now researchers have found evidence for ancient life in deep mantle rocks from about 125 million years ago. The emergence of life in the mantle is believed to involve the interaction of rocks with hydrothermal water originating from seawater and circulating in the mantle (see the illustration).

A serious objection against the successful Urey-Miller experiments as a guideline to how prebiotic life emerged is that the atmosphere at that time was not reducing (reducing means that there are atoms able to donate electrons; oxygen does just the reverse).

This objection could serve as a motivation for assuming that prebiotic life evolved in the mantle. For a detailed vision about underground prebiology see the article More Precise TGD Based View about Quantum Biology and Prebiotic Evolution. This model predicts that the Cambrian Explosion, lasting for 20-25 million years, was associated with a sudden expansion of the Earth's radius by a factor of 2 about 542 million years ago.

The Expanding Earth hypothesis would be reduced in the TGD framework to the replacement of continuous cosmic expansion of astrophysical objects with a sequence of short expanding periods followed by long non-expanding periods, in accordance with the finding that astrophysical objects do not participate in cosmic expansion (see this). This sudden expansion would have led to a burst of underground oceans to the surface of Earth and generated the oceans as we know them now. This prediction is consistent with the assumption that the hydrothermal water explaining the above described finding originated from seawater.

A killer prediction is that underground life would have developed photosynthesis, and the lifeforms would have been rather highly evolved as they burst onto the surface of Earth. How this was possible is one of the questions for which an answer is proposed in the article More Precise TGD Based View about Quantum Biology and Prebiotic Evolution. The new physics predicted by TGD - in particular the hierarchy of Planck constants identified in terms of dark matter - is an essential element of the model.

For a summary of earlier postings see Links to the latest progress in TGD.

Ontology-Epistemology duality?

A theorist suffering from a tendency to philosophize sooner or later encounters the strange self-referential question "Does my theory about the Universe also represent itself as an aspect of the Universe?". For a dualist, theory and reality are of course two different things. But are they? Could one make sense of the statement that the theory is the reality which it describes? Or more or less equivalently: epistemology is self-dual to ontology. This would be very nice: theory would not be something outside reality. This must also relate closely to the question about the observer in physics: in current quantum measurement theory the observer is an outsider affecting reality by quantum measuring it but is not described by the quantum theory.

TGD inspired theory of consciousness generalizes quantum measurement theory to a theory of consciousness and makes the physicist part of the physical system. Indeed, the new view about state function reduction inspired by Zero Energy Ontology (ZEO) allows one to identify "self" as a quantum physical entity and makes several testable killer predictions. In ZEO zero energy states are pairs of positive and negative energy states at the opposite light-like boundaries of a causal diamond (CD). Self is identified as a sequence of state function reductions at a given boundary of the CD - the members of the state pairs at this passive boundary are not changed, and the passive boundary itself remains unaffected: obviously this corresponds to the Zeno effect.

What is new is that the members of the state pairs at the opposite - active - boundary are changed, and the active boundary recedes from the passive boundary reduction by reduction, so that the size of the CD increases. This gives rise to the experienced arrow of geometric time, identified as the proper time distance between the tips of the CD. The first reduction to the opposite boundary is forced by the Negentropy Maximization Principle to eventually occur, and means the "death" of the self and its re-incarnation at the opposite boundary. A time reversal of the original self is generated, and geometric time begins to flow in the opposite direction.

This suggests that one could indeed see also theory as something not outside the physical world, understood in a sufficiently general sense.

What do I mean by theory? Can I imbed it in my tripartistic ontology with three forms of existence: space-time surfaces as counterparts of classical states; quantum states as mathematical objects in ZEO; quantum jumps as building bricks of conscious existence giving rise to moments of consciousness and integrating into selves? This trinity is analogous to the shape-matter-mind trinity. Let us call this holy trinity just A-B-C to reduce the amount of typing.

I want the equation Theory = Reality. There would be no separate reality behind the theory. What would I mean by this statement?

The first attempt to give content to this equation is as the equation Theory = quantum state as a mathematical object. Theory would be something restricted to the compartment B in A-B-C. A quantum state as a quantum superposition of space-time surfaces (implied by holography, which in turn is implied by General Coordinate Invariance) would be a theory about reality, but there would be no distinct "physical reality" behind it. As far as conscious experience is concerned, this is enough, since conscious experience is in the quantum jump between these mathematical objects.

One can however develop objections.

  1. A quantum state in ZEO is the counterpart of only one possible quantal time evolution. The theory is therefore very restricted and not enough in a quantum Universe in which quantum jumps re-create this reality again and again. A real theory must be able to describe the counterparts of all possible time evolutions: the collection of these evolutions should define a kind of unified theory. The space of WCW spinor fields would be the next trial for a theory, and quantum jumps between different evolutions - points of WCW by holography - make it possible to gather conscious information about this landscape.

  2. Theories involve also self-referentiality: statements about statements. The Boolean algebra of a set is the basic example: it is the exponent set of the set and corresponds to binary-valued maps from the set (see the sketch after this list). Second quantization is what gives rise to a mathematical structure analogous to statements about statements. Many-fermion Fock states have the structure of a quantum Boolean algebra.

    But this is not enough. Theorists make also statements about statements. In particular, very strong statements about the theories of other theorists. This can generate an entire hierarchy of highly emotional statements about statements about... known as scientific debate. This suggests that one should allow iterated second quantization, emerging from the notion of infinite primes obtained by a repeated second quantization of an arithmetic quantum field theory with supersymmetry, starting from boson and fermion states labeled by finite primes.

    In a given quantization one obtains the analogs of both Fock states and even bound states purely number theoretically, and one can repeat this procedure again and again. This hierarchical process corresponds, at the level of Boolean algebra, to the formation of statements about statements about... It can also be seen as a hierarchy of logics whose order is labeled by a non-negative integer n. Theories about theories about... This hierarchy would have many-sheeted space-time as a space-time correlate, with the hierarchy of quantizations assigned to the hierarchy of sheets.

  3. But can "theory" really reside only inside compartment B? Theory should contain also the mental images of theoreticians and documentations of these. The documentations are represented in terms of classical space-time as a huge generalization of written language - this forces one to include compartment A. Also the subselves defining the mental images of theoreticians, and thus the entire self hierarchy, must be there: therefore also compartment C is needed.

    Our equation would become Epistemology = Ontology in the tripartistic sense. The theory about what can be known would be equivalent to the theory about what can exist. This duality is a self-duality rather than a duality: the latter identification bothered me originally, since the next step would be to construct a theory for the theory, and the reader can guess the rest. Notice that this identification is not the physicalist's view saying that consciousness is an epiphenomenon, since the ontology is monistic.
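
To make the exponent-set picture of point 2 concrete, here is a minimal Python sketch of the standard mathematical correspondence (an illustration only, not TGD formalism): binary-valued maps on a set are in one-to-one correspondence with its subsets, and the same bit strings label many-fermion Fock basis states.

```python
# Boolean algebra of a set S as its exponent set: maps f: S -> {0,1}.
# The same bit strings, read as occupation numbers, label Fock basis states.
from itertools import product

S = ['mode1', 'mode2', 'mode3']   # e.g. three fermionic modes

for f in product([0, 1], repeat=len(S)):
    subset = {s for s, bit in zip(S, f) if bit}          # statement: which elements are "true"
    fock = '|' + ' '.join(str(bit) for bit in f) + '>'   # the corresponding many-fermion state
    print(fock, '<->', subset if subset else 'empty set')

# Boolean operations correspond to bitwise operations on the maps:
# union <-> OR, intersection <-> AND, complement <-> NOT.
```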

Consider now possible objections.
  1. Theories are never complete: they have all kinds of failures. How can reality=theory be incomplete? How can it have failures?
    This is possible: the incompleteness is in the conscious experience about the theory, not in the theory itself. For some reason theorists have a strong tendency to mistakenly call this experience the theory. In the tripartistic view about theory, the incompleteness of the theory would be located in sector C. Conscious experience contains a limited amount of information due to the presence of finite measurement resolution and cognitive resolution.

    Finite resolution is necessary in order to avoid drowning in a sea of irrelevant information. Finite resolution leads to an ordering of bits (more generally, pinary digits) by the 2-adic (p-adic) norm (see the sketch at the end of this list). The realization of the finite measurement resolution is in terms of quantum criticality involving the hierarchy of Planck constants and a hierarchy of inclusions of hyper-finite factors, to which one can assign a hierarchy of dynamical symmetry groups represented physically. Therefore finite measurement resolution is actually something very useful - many beautiful things follow just by being sloppy (but only with the non-significant bits!).

    What Ontology = Epistemology implies is that quantum states themselves provide a representation for the finite measurement resolution. Resolution is not something characterizing only the measurement but also the target of the measurement. This is a very radical change of viewpoint. It is realized quite concretely in the representation of quantum states in terms of partonic 2-surfaces and strings connecting them. The larger the number of partonic 2-surfaces and of strings connecting them, the better the measurement resolution.

  2. There is also an objection relating to self-referentiality. Quantum states provide a representation/theory about themselves; they are their own mirror images. Doesn't this lead to a kind of self-referential loop and infinite regress? If a self is conscious about being conscious about something, one ends up with a similar infinite regress.

    The resolution of the problem is that the self becomes conscious about the contents of consciousness for the previous moment of consciousness. The infinite regress is replaced with endless evolution. Zero energy states become more and more complex as the information about previous moments of consciousness is represented quantum mechanically and classically. By NMP the universe is generating negentropic entanglement giving rise to a kind of Akashic records, and negentropy resources are increasing. Biological evolution and the evolution of sciences are not just random thermodynamical fluctuations but are coded into the basic laws of quantum physics and consciousness.
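
To see concretely what the 2-adic ordering of bits in the first objection means, here is a small Python sketch (a toy illustration of the standard 2-adic norm, not of TGD machinery): digits at high powers of 2 have small 2-adic norm, so a pinary cutoff drops exactly the "non-significant" digits.

```python
# Pinary cutoff: in the 2-adic norm |2^k|_2 = 2^(-k), so digits at high powers
# of 2 are the non-significant ones, and finite resolution O(2^N) drops them.
def pinary_digits(n):
    """Base-2 digits of a non-negative integer, lowest power first."""
    digits = []
    while n:
        digits.append(n % 2)
        n //= 2
    return digits

x = 91                                   # = 1 + 2 + 8 + 16 + 64
for k, bit in enumerate(pinary_digits(x)):
    if bit:
        print(f"digit at 2^{k}: 2-adic norm 2^(-{k}) = {2.0 ** (-k)}")

N = 4                                    # keep only digits 2^k with k < N
print(f"{x} with pinary cutoff O(2^{N}) -> {x % 2 ** N}")
```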

For a summary of earlier postings see Links to the latest progress in TGD.

Sunday, August 30, 2015

Sharpening of Hawking's argument

I have already written about Hawking's latest argument for solving the information paradox associated with blackholes (see this and this).

There is now a popular article explaining the intuitive picture behind Hawking's proposal. The blackhole horizon would involve a tangential flow of light, and particles of the infalling matter would induce supertranslations on the pattern of this light, thus coding information about their properties into it. This light would then be radiated away as an analog of Hawking radiation and carry this information out.

The objection would be that in GRT the horizon is in no way special - it is just a coordinate singularity. The curvature tensor does not diverge, and the Einstein tensor and Ricci scalar vanish. This argument has been used in the firewall debates to claim that nothing special should occur as the horizon is traversed. Why would light rotate around it? There is no reason for this!

The answer in TGD would be obvious: for the TGD analog of a blackhole the horizon is replaced with a light-like 3-surface at which the induced metric becomes Euclidian. The horizon becomes analogous to a light front carrying not only photons but all kinds of elementary particles. Particles do not fall inside this surface but remain at it!

The objection now is that the photons of a light front should propagate in the direction normal to it, not parallel to it. The point is however that this light-like 3-surface is the surface at which the induced 4-metric becomes degenerate: hence massless particles can live on it.

For a summary of earlier postings see Links to the latest progress in TGD.

Wednesday, August 26, 2015

TGD view about black holes and Hawking radiation: part II

In this second part of the posting I discuss the TGD view about blackholes and Hawking radiation. There are several new elements involved, but concerning blackholes the most relevant new element is the assignment of Euclidian space-time regions to lines of generalized Feynman diagrams, implying that also blackhole interiors correspond to this kind of regions. Negentropy Maximization Principle is also an important element and predicts that the number theoretically defined blackhole negentropy can only increase. The real surprise was that the temperature of the variant of Hawking radiation at the flux tubes of the proton-Sun system is room temperature! Could the TGD variant of Hawking radiation be a key player in quantum biology?

The basic ideas of TGD relevant for blackhole concept

My own basic strategy is not to assume anything not necessitated by experiment or not implied by general theoretical assumptions - these of course represent the subjective element. The basic assumptions/predictions of TGD relevant for the recent discussion are the following.

  1. Space-times are 4-surfaces in H = M^4 × CP_2, and ordinary space-time is replaced with many-sheeted space-time. This solves what I call the energy problem of GRT by lifting gravitationally broken Poincare invariance to an exact symmetry at the level of the imbedding space H.

    The GRT type description is an approximation obtained by lumping together the space-time sheets into a single region of M^4, with the various fields appearing as sums of the induced fields at the space-time surfaces, geometrized in terms of the geometry of H.

    A space-time surface has both Minkowskian and Euclidian regions. Euclidian regions are identified in terms of what I call generalized Feynman/twistor diagrams. The 3-D boundaries between Euclidian and Minkowskian regions have degenerate induced 4-metric, and I call them light-like orbits of partonic 2-surfaces or light-like wormhole throats analogous to blackhole horizons and actually replacing them. The interiors of blackholes are replaced with the Euclidian regions, and every physical system is characterized by this kind of region.

    Euclidian regions are identified as slightly deformed pieces of CP_2 connecting two Minkowskian space-time regions. The partonic 2-surfaces defining their boundaries are connected to each other by magnetic flux tubes carrying monopole flux.

    Wormhole contacts connect two Minkowskian space-time sheets already at the elementary particle level, and appear in pairs by the conservation of the monopole flux. The flux tube can be visualized as a highly flattened square traversing along and between the space-time sheets involved. Flux tubes are accompanied by fermionic strings carrying fermion number. Fermionic strings give rise to string world sheets carrying vanishing induced charged weak fields (otherwise em charge would not be well-defined for the spinor modes). String theory in the space-time surface becomes part of TGD. Fermions at the ends of the strings can get entangled, and the entanglement can carry information.

  2. Strong form of General Coordinate Invariance (GCI) states that light-like orbits of partonic 2-surfaces on one hand and space-like 3-surfaces at the ends of causal diamonds on the other hand provide equivalent descriptions of physics. The outcome is that partonic 2-surfaces and string world sheets at the ends of CD can be regarded as basic dynamical objects.

    Strong form of holography states the correspondence between the quantum description based on these 2-surfaces and the 4-D classical space-time description, and generalizes the AdS/CFT correspondence. Conformal invariance is extended to the huge super-symplectic symmetry algebra acting as isometries of WCW and having conformal structure. This explains why 10-D space-time can be replaced with ordinary space-time and 4-D Minkowski space with partonic 2-surfaces and string world sheets. This holography looks very much like the one we are accustomed to!

  3. Quantum criticality of the TGD Universe fixes the value(s) of the only coupling strength of TGD (Kähler coupling strength) as an analog of critical temperature. Quantum criticality is realized in terms of an infinite hierarchy of sub-algebras of the super-symplectic algebra acting as isometries of WCW, the "world of classical worlds" consisting of 3-surfaces, or by holography, of the preferred extremals associated with them.

    A given sub-algebra is isomorphic to the entire algebra, and its conformal weights are n-multiples (n ≥ 1) of those for the entire algebra. This algebra acts as conformal gauge transformations, whereas the generators with conformal weights m < n act as dynamical symmetries defining an infinite hierarchy of simply laced Lie groups with rank n-1 acting as dynamical symmetry groups defined by McKay correspondence, so that the number of degrees of freedom becomes finite. This relates very closely to the inclusions of hyper-finite factors - WCW spinors provide a canonical representation for them.

    This hierarchy corresponds to a hierarchy of effective Planck constants h_eff = n×h defining an infinite number of phases identified as dark matter. For these phases Compton length and time are scaled up by n, so that they give rise to macroscopic quantum phases. Super-conductivity is one example of this kind of phase - the charge carriers could be dark variants of ordinary electrons. Dark matter appears at quantum criticality, and this serves as an experimental manner to produce dark matter. In living matter dark matter identified in this manner would play a central role. Magnetic bodies carrying dark matter at their flux tubes would control ordinary matter and carry information.

  4. I started the work with the hierarchy of Planck constants from the proposal of Nottale stating that it makes sense to talk about a gravitational Planck constant h_gr = GMm/v_0, v_0/c ≤ 1 (the interpretation of the symbols should be obvious). Nottale found that the orbits of the inner and outer planets could be modelled reasonably well by applying Bohr quantization to planetary orbits, with the values of the velocity parameter for the two groups differing by a factor 1/5 (see the numerical sketch at the end of this list). In the TGD framework h_gr would be associated with magnetic flux tubes mediating the gravitational interaction between the Sun with mass M and a planet or any object, say an elementary particle, with mass m. The matter at the flux tubes would be dark, as would also the gravitons involved. The Compton length of a particle would be given by GM/v_0 and would not depend on the mass of the particle at all.

    The identification h_gr = h_eff is an additional hypothesis motivated by quantum biology, in particular the identification of biophotons as decay products of dark photons satisfying this condition. As a matter of fact, one can talk also about h_em assignable to electromagnetic interactions: its values are much lower. The hypothesis is that when the perturbative expansion for a two-particle system does not converge anymore, a phase transition increasing the value of the Planck constant occurs and guarantees that the coupling strength, proportional to 1/h_eff, decreases. This is one possible interpretation for quantum criticality. TGD provides a detailed geometric interpretation for the space-time correlates of quantum criticality.

    Macroscopic gravitational bound states are not possible in TGD without the assumption that the effective string tension associated with fermionic strings, dictated by the strong form of holography, is proportional to 1/h_eff^2. Without this the bound states would have a size scale of order Planck length, since for longer systems the string energy would be huge. h_eff = h_gr makes astroscopic quantum coherence unavoidable. Ordinary matter is condensed around dark matter. The counterparts of blackholes would be systems consisting of only dark matter.

  5. Zero energy ontology (ZEO) is a central element of TGD. There are many motivations for it. For instance, Poincare invariance in the standard sense cannot make sense, since in standard cosmology energy is not conserved. The interpretation is that the various conserved quantum numbers are length scale dependent notions.

    Physical states are zero energy states with positive and negative energy parts assigned to the ends of space-time surfaces at the light-like boundaries of causal diamonds (CDs). A CD is defined as the Cartesian product of CP_2 with the intersection of future and past directed light-cones of M^4. CDs form a fractal length scale hierarchy. A CD defines the region about which a single conscious entity can have conscious information, a kind of 4-D perceptive field. There is a hierarchy of WCWs associated with CDs. Consciously experienced physics is always in the scale of a given CD.

    Zero energy states, identified as formally purely classical WCW spinor fields, replace positive energy states and are analogous to pairs of initial and final states; the crossing symmetry of quantum field theories gives the mathematical motivation for their introduction.

  6. Quantum measurement theory can be seen as a theory of consciousness in ZEO. The conscious observer, or self as a conscious entity, becomes part of physics. ZEO gives up the assumption about a unique universe of classical physics and restricts it to the perceptive field defined by the CD.

    In each quantum jump a re-creation of the Universe occurs. Subjectively experienced time corresponds to state function reductions at the fixed, passive boundary of the CD, leaving it and the state at it invariant. The state at the opposite, active boundary changes, and also its position changes, so that the CD increases state function reduction by state function reduction while nothing happens to the passive boundary. This gives rise to the experienced flow of geometric time, since the distance between the tips of the CD increases and the size of the space-time surfaces in the quantum superposition increases. This sequence of state function reductions is the counterpart of the unitary time evolution in ordinary quantum theory.

    Self "dies" as the first state function reduction to the opposite boundary of CD meaning re-incarnation of self at it and a reversal of the arrow of geometric time occurs: CD size increases now in opposite time direction as the opposite boundary of CD recedes to the geometric past reduction by reduction.

    Negentropy Maximization Principle (NMP) defines the variational principle of state function reduction. The density matrix of the subsystem is the universal observable, and the state function reduction leads to its eigenspaces. Eigenspaces, not only eigenstates as usual.

    Number theoretic entropy makes sense for algebraic extensions of rationals and can be negative, unlike ordinary entanglement entropy. NMP can therefore lead to a generation of negentropic entanglement (NE) if the entanglement corresponds to a unitary entanglement matrix, so that the density matrix of the final state is a higher-D unit matrix. Another possibility is that the entanglement matrix is algebraic but its diagonalization in the algebraic extension of rationals used is not possible. This is expected to reduce the rate for the reduction, since a phase transition increasing the size of the extension is needed.

    The weak form of NMP does not demand that the negentropy gain is maximal: this allows the conscious entity responsible for the reduction to decide whether to increase maximally the NE resources of the Universe or not. It can also allow a larger NE increase than otherwise. This freedom brings in the quantum correlates of ethics, morality, and good and evil. The p-adic length scale hypothesis and the existence of preferred p-adic primes follow from the weak form of NMP, and one ends up naturally with adelic physics.
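
As an illustration of point 4, here is a minimal Python sketch of Nottale's Bohr quantization for the inner planets, assuming his velocity parameter v_0 ≈ 144.7 km/s and the usual assignment n = 3, 4, 5, 6 to Mercury, Venus, Earth and Mars (a numerical check only, not part of the TGD formalism):

```python
# Bohr quantization of planetary orbits a la Nottale: v_n = v_0/n gives
# orbital radii r_n = n^2 * GM/v_0^2, independent of the planetary mass m,
# just as the h_gr Compton length GM/v_0 is independent of m.
GM_sun = 1.327e20      # m^3/s^2
v0 = 144.7e3           # m/s, Nottale's velocity parameter for inner planets
AU = 1.496e11          # m

r1 = GM_sun / v0**2    # basic Bohr radius, ~0.042 AU
for name, n, actual in [("Mercury", 3, 0.39), ("Venus", 4, 0.72),
                        ("Earth", 5, 1.00), ("Mars", 6, 1.52)]:
    print(f"{name}: n={n}  predicted {n**2 * r1 / AU:.2f} AU  actual {actual:.2f} AU")
```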

The analogs of blackholes in TGD

Could blackholes have any analog in TGD? What about Hawking radiation? The following speculations are inspired by the above general vision.

  1. Ordinary blackhole solutions are not appropriate in TGD. The interior space-time sheet of any physical object is replaced with a Euclidian space-time region - also that of a blackhole, by a perturbation argument based on the observation that if one requires that the radial component of the blackhole metric remains finite, the horizon becomes a light-like 3-surface analogous to the light-like orbit of a partonic 2-surface, and the metric in the interior becomes Euclidian.

  2. The analog of a blackhole can be seen as a limiting case of an ordinary astrophysical object, which already has blackhole like properties due to the presence of h_eff = n×h dark matter particles, which cannot appear in the same vertices with visible matter. The ideal analog of a blackhole consists of dark matter only, and is assumed to satisfy the condition h_gr = h_eff already discussed. It corresponds to a region with a radius equal to the Compton length of an arbitrary particle, R = GM/v_0 = r_S/(2v_0), where r_S is the Schwarzschild radius. A macroscopic quantum phase is in question, since the Compton radius of a particle does not depend on its mass. The blackhole limit would correspond to v_0/c → 1 and dark matter dominance. This would give R = r_S/2. The naive expectation would be R = r_S (maybe a factor of two is missing somewhere: blame me!).

  3. NMP implies that information cannot be lost in the formation of a blackhole like state but tends to increase. Matter becomes totally dark, and the NE with the partonic 2-surfaces of the external world is preserved or increases. The ingoing matter does not fall to a mass point but resides at the partonic 2-surface, which can have an arbitrarily large area. It can also have wormholes connecting different regions of a spherical surface and in this manner increase its genus. NMP, negentropy, and negentropic entanglement between h_eff = n×h dark matter systems would become the basic notions instead of the second law and entropy.


  4. There is now a popular article explaining the intuitive picture behind Hawking's proposal. The blackhole horizon would involve a tangential flow of light, and particles of the infalling matter would induce supertranslations on the pattern of this light, thus coding information about their properties into it. This light would then be radiated away as an analog of Hawking radiation and carry this information out.

    The objection would be that in GRT the horizon is in no way special - it is just a coordinate singularity. The curvature tensor does not diverge, and the Einstein tensor and Ricci scalar vanish. This argument has been used in the firewall debates to claim that nothing special should occur as the horizon is traversed. So: why would light rotate around it? There is no reason for this! The answer in TGD would be obvious: for the TGD analog of a blackhole the horizon is replaced with a light-like 3-surface at which the induced metric becomes Euclidian. The horizon becomes analogous to a light front carrying not only photons but all kinds of elementary particles. Particles do not fall inside this surface but remain at it!

  5. The replacement of the second law with NMP leads one to ask whether a generalization of blackhole thermodynamics makes sense in the TGD Universe. Since blackhole thermodynamics characterizes Hawking radiation, the generalization could make sense at least if there exists an analog of Hawking radiation. Note that also a geometric variant of the second law makes sense.

    Could the analog of Hawking radiation be generated in the first state function reduction to the opposite boundary, and perhaps be assigned with the sudden increase of the radius of the partonic 2-surface defining the horizon? Could this burst release the energy compensating the generation of gravitational binding energy? This burst would however have a totally different interpretation: even gamma ray bursts from quasars could be considered as candidates for it, and the temperature would be totally different from the extremely low general relativistic Hawking temperature of order

    T_GR = hbar/(8π GM) ,

    which corresponds to an energy assignable to a wavelength equal to 4π times the Schwarzschild radius. For the Sun, with Schwarzschild radius r_S = 2GM = 3 km, one has T_GR = 3.2×10^(-11) eV (see the numerical check below).
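
These numbers are easy to check with a few lines of Python, assuming standard values of the constants. Note that the quoted value matches the photon energy hc/λ at the wavelength λ = 4π r_S; the direct conversion of T_GR into an energy gives a value 2π times smaller:

```python
import math

# Hawking temperature of the Sun and the photon energy at wavelength 4*pi*r_S.
G, c = 6.674e-11, 2.998e8              # SI units
hbar = 1.0546e-34                      # J s
h = 2 * math.pi * hbar
eV = 1.602e-19                         # J
M_sun = 1.989e30                       # kg

r_S = 2 * G * M_sun / c**2             # Schwarzschild radius, ~3 km
kT = hbar * c**3 / (8 * math.pi * G * M_sun) / eV   # k_B*T_GR in eV
E_photon = h * c / (4 * math.pi * r_S) / eV         # photon energy at 4*pi*r_S

print(f"r_S = {r_S / 1e3:.2f} km")               # ~2.95 km
print(f"k_B*T_GR = {kT:.1e} eV")                 # ~5.3e-12 eV
print(f"hc/(4 pi r_S) = {E_photon:.1e} eV")      # ~3.3e-11 eV, the value quoted above
```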

One can of course have fun with formulas to see whether the generalization of blackhole thermodynamics assuming the replacement h → h_gr could make sense physically. Also the replacement r_S → R, where R is the actual radius of the star, will be made.
  1. The blackhole temperature can be formally identified in terms of the surface gravity

    T = (h_gr/hbar) × GM/(2π R^2) = (h_gr/hbar) × (r_S^2/R^2) × T_GR , giving T/m = (1/4π v_0) × (r_S^2/R^2) .

    For the Sun, with radius R = 6.96×10^5 km, one has T/m = 3.2×10^(-11), giving T = 3×10^(-2) eV for the proton. This is 9 orders of magnitude higher than the ordinary Hawking temperature. Amazingly, this temperature equals room temperature! Is this a mere accident? If one takes seriously TGD inspired quantum biology, in which quantum gravity plays a key role (see this), this does not seem to be the case. Note that for the electron the temperature would correspond to an energy 3/2×10^(-5) eV, which corresponds to a 4.5 GHz frequency for the ordinary Planck constant (see the numerical check after this list).

    It must however be made clear that the value of v_0 for dark matter could differ from that deduced by assuming that the entire gravitational mass is dark. For M → M_D = kM and v_0 → k^(1/2) v_0 the orbital radii remain unchanged, but the velocity of a dark matter object at the orbit scales to k^(1/2) v_0. This kind of scaling is suggested by the fact that the value of h_gr seems to be too large when compared with the identification of biophotons as decay products of dark photons with h_eff = h_gr (some arguments suggest the value k ≈ 2×10^(-4)).

    Note that below the radius R = r_S/(2(π v_0)^(1/2)) the thermal energy exceeds the rest mass of the particle. For neutron stars this limit might be reached.


  2. Blackhole entropy

    S_GR = A/(4 hbar G) = 4π GM^2/hbar = 4π (M/M_Pl)^2

    would be replaced with the negentropy for dark matter, making sense also for systems containing both dark and ordinary matter. The negentropy N(m) associated with a flux tube of a given type would be a fraction h/h_gr of the total area of the horizon using the Planck area as a unit:

    N(m) = (h/h_gr) × A/(4 hbar G) = (h/h_gr) × (R^2/r_S^2) × S_GR = v_0 × (M/m) × (R^2/r_S^2) .

    The dependence on m makes sense, since a given flux tube type characterized by the mass m determining the corresponding value of h_gr has its own negentropy, and the total negentropy is the sum over the particle species. The negentropy of the Sun is numerically much smaller than the corresponding blackhole entropy.

  3. The horizon area is proportional to (GM/v_0)^2 ∝ h_eff^2 and should increase in discrete jumps by integer scalings, being proportional to n^2.
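
The arithmetic behind the room-temperature claim in point 1 is easy to verify. A minimal Python check, taking the quoted ratio T/m = 3.2×10^(-11) at face value:

```python
# Dark Hawking temperature for proton and electron from the quoted T/m ratio.
T_over_m = 3.2e-11          # the ratio quoted in the text for the Sun

m_proton = 938.27e6         # proton rest energy, eV
m_electron = 0.511e6        # electron rest energy, eV
h_planck = 4.1357e-15       # Planck constant, eV*s
kT_room = 0.0259            # k_B*T at 300 K, eV

T_p = T_over_m * m_proton   # ~3.0e-2 eV, indeed of the order of room temperature
T_e = T_over_m * m_electron # ~1.6e-5 eV
f_e = T_e / h_planck        # ~4e9 Hz, the few-GHz scale quoted above

print(f"T(proton)   = {T_p:.2e} eV  (room temperature ~ {kT_room} eV)")
print(f"T(electron) = {T_e:.2e} eV  ~ {f_e / 1e9:.1f} GHz")
```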

How does the analog of a blackhole evolve in time? The evolution consists of sequences of repeated state function reductions at the passive boundary of the CD, followed by the first reduction to the opposite boundary of the CD, followed by a similar sequence. These sequences are the analogs of unitary time evolutions. This defines the analog of a blackhole state as a repeatedly re-incarnating conscious entity having a CD whose size increases gradually. During a given sequence of state function reductions the passive boundary has constant size. About the active boundary one cannot say this, since it corresponds to a superposition of quantum states.

The reduction sequences consist of life cycles at a fixed boundary, and the size of the blackhole like state, as of any state, is expected to increase in discrete steps if it participates in cosmic expansion in an average sense. This requires that the mass of the blackhole like object gradually increases. The interpretation is that ordinary matter gradually transforms to dark matter and increases the dark mass M = R/G.

Cosmic expansion is not observed for the sizes of individual astrophysical objects, which only co-move. The solution of the paradox is that they suddenly increase their size in state function reductions. This hypothesis allows one to realize the Expanding Earth hypothesis in the TGD framework (see this). Number theoretically preferred scalings of the blackhole radius come as powers of 2, and this would be the scaling associated with the Expanding Earth hypothesis.

See the chapter "Criticality and dark matter" or the article TGD view about black holes and Hawking radiation.

For a summary of earlier postings see Links to the latest progress in TGD.

TGD view about blackholes and Hawking radiation: part I

The most recent revelation of Hawking was at the Hawking radiation conference held at the KTH Royal Institute of Technology in Stockholm. The title of the posting of Bee telling about what might have been revealed is "Hawking proposes new idea for how information might escape from black holes". Also Lubos has a - rather aggressive - blog post about the talk. A collaboration of Hawking, Andrew Strominger and Malcolm Perry is behind the claim, and the work should be published within a few months.

The first part of the posting gives a critical discussion of the existing approach to blackholes and Hawking radiation. The intention is to demonstrate that a pseudo problem following from the failure of General Relativity below the blackhole horizon is in question.

In the second part of the posting I will discuss the TGD view about blackholes and Hawking radiation. There are several new elements involved, but concerning blackholes the most relevant new element is the assignment of Euclidian space-time regions to lines of generalized Feynman diagrams, implying that also blackhole interiors correspond to this kind of regions. Negentropy Maximization Principle is also an important element and predicts that the number theoretically defined blackhole negentropy can only increase. The real surprise was that the temperature of the variant of Hawking radiation at the flux tubes of the proton-Sun system is room temperature! Could the TGD variant of Hawking radiation be a key player in quantum biology?

Is information lost or not in blackhole collapse?

The basic problem is that classically the collapse to a blackhole seems to destroy all information about the matter collapsing to it. The outcome is just an infinitely dense mass point. There is also a theorem of classical GRT stating that a blackhole has no hair: a blackhole is characterized by only a few conserved charges.

Hawking has predicted that a blackhole loses its mass by generating radiation, which looks thermal. As the blackhole radiates its mass away, all information about the material which entered the blackhole seems to be lost. If one believes in standard quantum theory and unitary evolution preserving the information, and also forgets the standard quantum theory's prediction that state function reductions destroy information, one has a problem. Does the information really disappear? Or is the GRT description incapable of coping with the situation? Could information find a new representation?

Superstring models and the AdS/CFT correspondence have inspired the proposal that a hologram results at the horizon and somehow catches the information by defining the hair of the blackhole. Since the radius of the horizon is proportional to the mass of the blackhole, one can however wonder what happens to this information as the radius shrinks to zero when all the mass is Hawking radiated out.

What Hawking suggests is that a new kind of symmetry known as super-translations - a notion originally introduced by Bondi and Metzner - could somehow save the situation. Andrew Strominger has recently discussed the notion. The information would be "stored to super-translations". Unfortunately this statement says nothing to me, nor did it say anything to Bee or the New Scientist reporter. The idea however seems to be that the information carried by the Hawking radiation emanating from the blackhole interior would be caught by the hologram defined by the blackhole horizon.

Super-translation symmetry acts on the surface of a sphere with infinite radius in asymptotically flat space-times looking like empty Minkowski space in very distant regions. The action would be translations along the sphere plus Poincare transformations.

What comes to mind in the TGD framework is the conformal transformations of the boundary of the 4-D light-cone, which act as scalings of the radius of the sphere and conformal transformations of the sphere. Translations however translate the tip of the light-cone, and Lorentz transformations transform the sphere to an ellipsoid, so that one should restrict to the rotation subgroup of the Lorentz group. Besides this, TGD allows a huge group of symplectic transformations of δCD × CP_2 acting as isometries of WCW and having the structure of a conformal algebra with generators labelled by conformal weights.

Sharpening of the argument of Hawking

There is now a popular article explaining the intuitive picture behind Hawking's proposal. The blackhole horizon would involve a tangential flow of light, and particles of the infalling matter would induce supertranslations on the pattern of this light, thus coding information about their properties into it. This light would then be radiated away as an analog of Hawking radiation and carry this information out.

The objection would be that in GRT the horizon is in no way special - it is just a coordinate singularity. The curvature tensor does not diverge, and the Einstein tensor and Ricci scalar vanish. This argument has been used in the firewall debates to claim that nothing special should occur as the horizon is traversed. Why would light rotate around it? I see no reason for this! The answer in the TGD framework would be obvious: for the TGD analog of a blackhole the horizon is replaced with a light-like 3-surface at which the induced metric becomes Euclidian. The horizon becomes analogous to a light front carrying not only photons but all kinds of elementary particles. Particles do not fall inside this surface but remain at it!

What are the problems?

My fate is to be an aggressive dissident listened to by no-one, and I find it natural to continue in the role of angry old man. Be cautious, I am arrogant, I can bite, and my bite is poisonous!

  1. With all due respect to the Big Guys, to me the problem looks like a pseudo problem caused basically by the breakdown of classical GRT. Irrespective of whether Hawking radiation is generated, the information about matter (apart from mass and some charges) is lost if the matter indeed collapses to a single infinitely dense point. This is of course very unrealistic, and the question should be: how should we proceed beyond GRT?

    Blackhole is simply too strong an idealization, and it is no wonder that Hawking's calculation using the blackhole metric as a background gives rise to blackbody radiation. One might hope that Hawking radiation is a genuine physical phenomenon, which might somehow carry the information by not being genuinely thermal radiation. Here a theory of quantum gravitation might help. But we do not have it!

  2. What do we know about blackholes? We know that there are objects which can be well described by the exterior Schwarzschild metric. Galactic centers are regarded as candidates for giant blackholes. Binary systems for which one member is invisible are candidates for stellar blackholes. One can however ask whether these candidates actually consist of dark matter rather than being blackholes. Unfortunately, we do not understand what dark matter is!

  3. Hawking radiation is extremely weak, and there is no experimental evidence pro or con. Its existence assumes the existence of a blackhole, which presumably represents the failure of classical GRT. Therefore we might be taking a lot of trouble, and inspiring heated debates, about something which does not exist at all! This includes blackholes, Hawking radiation, and various problems such as the firewall paradox.
There are also profound theoretical problems.
  1. Contrary to the intense media hype during the last three decades, we still do not have a generally accepted theory of quantum gravity. Superstring models and M-theory failed to predict anything at the fundamental level, and just postulate an effective quantum field theory limit, which assumes the analog of GRT at the level of the 10-D or 11-D target space to define the spontaneous compactification as a solution of this GRT type theory. Not much is gained.

    AdS/CFT correspondence is an attempt to do something in the absence of this kind of theory, but it involves 10- or 11-D blackholes and does not help much. Reality looks much simpler to an innocent non-academic outsider like me. Effective field theorizing allows intellectual laziness, and many problems of present-day physics will probably be seen in the future as being caused by this lazy approach avoiding attempts to build explicit bridges between physics at different scales. Something very similar has occurred in hadron physics and nuclear physics, and one has a kind of Augean stable to clean up before one can proceed.

  2. A mathematically well-defined notion of information is lacking. We can talk about thermodynamical entropy - a single-particle observable - and also about entanglement entropy - basically a 2-particle observable. We do not have a genuine notion of information, and the second law predicts that the best one can achieve is no information at all!

    Could it be that our view about information as a single-particle characteristic is wrong? Could information be associated with entanglement and be a 2-particle characteristic? Could information reside in the relationship of the object with the external world, in the communication line? Not inside the blackhole, not at the horizon, but in the entanglement of the blackhole with the external world.

  3. We do not have a theory of quantum measurement. The deterministic unitary time evolution of the Schrödinger equation and the non-deterministic state function reduction are in blatant conflict. The Copenhagen interpretation escapes the problem by saying that no objective reality/realities exist. An easy trick once again! A closely related Pandora's box is that experienced time and geometric time are very different, but we pretend that this is not the case.

    The only way out is to make the observer part of quantum physics: this requires nothing less than a quantum theory of consciousness. But the gurus of theoretical physics have shown no interest in consciousness. It is much easier and much more impressive to apply mechanical algorithms to produce complex formulas. If one takes consciousness seriously, one ends up with the question about the variational principle of consciousness. Yes, your guess was correct! Negentropy Maximization Principle! Conscious experience tends to maximize conscious information gain. But how is information represented?

In the second part I will discuss the TGD view about blackholes and Hawking radiation.

See the chapter "Criticality and dark matter" or the article TGD view about black holes and Hawking radiation.

For a summary of earlier postings see Links to the latest progress in TGD.

Tuesday, August 25, 2015

Field equations as conservation laws, Frobenius integrability conditions, and a connection with quaternion analyticity

The following represents a qualitative picture of the field equations of TGD, trying to emphasize the physical aspects. What is new is the discussion of the possibility that Frobenius integrability conditions are satisfied and correspond to quaternion analyticity.

  1. Kähler action is the Maxwell action for the induced Kähler form and metric, expressible in terms of imbedding space coordinates and their gradients. The field equations reduce to those for the imbedding space coordinates defining the primary dynamical variables. By GCI only four of them are independent dynamical variables analogous to classical fields.

  2. The solution of the field equations can be interpreted as a section in a fiber bundle. In TGD the fiber bundle is just the Cartesian product X^4 × CD × CP_2 of the space-time surface X^4 and the causal diamond CD × CP_2. The CD is the intersection of future and past directed light-cones having two light-like boundaries, which are cone-like pieces of the light-cone boundary δM^4_± × CP_2. The space-time surface serves as the base space and CD × CP_2 as the fiber. The bundle projection Π is the projection to the factor X^4. A section corresponds to the map x → h^k(x) giving the imbedding space coordinates as functions of the space-time coordinates. The bundle structure is now trivial and rather formal.

    By GCI one could also take 4 suitably chosen coordinates of CD × CP_2 as space-time coordinates, and identify CD × CP_2 as the fiber bundle. The choice of the base space depends on the character of the space-time surface. For instance CD, CP_2 or M^2 × S^2 (S^2 a geodesic sphere of CP_2) could define the base space. The bundle projection would be the projection from CD × CP_2 to the base space. Now the fiber bundle structure can be non-trivial and make sense only in some space-time region with the same base space.

  3. The field equations derived from Kähler action must be satisfied. Even more: one must have a preferred extremal of Kähler action. One poses boundary conditions at the 3-D ends of the space-time surfaces and at the light-like boundaries of CD × CP_2.

    One can fix the values of the conserved Noether charges at the ends of the CD (total charges are the same) and require that the Noether charges associated with a sub-algebra of the super-symplectic algebra - isomorphic to it and having conformal weights coming as n-multiples of those for the entire algebra - vanish. This would realize the effective 2-dimensionality required by SH (strong form of holography). One must pose boundary conditions also at the light-like partonic orbits. The so-called weak form of electric-magnetic duality is at least part of these boundary conditions.

    It seems that one must restrict the conformal weights of the entire algebra to be non-negative and those of the sub-algebra to be positive. The condition that also the commutators of sub-algebra generators with those of the entire algebra give rise to vanishing Noether charges implies that all algebra generators with conformal weight m ≥ n vanish, so the dynamical algebra becomes effectively finite-dimensional. This condition generalizes to the action of the super-symplectic algebra generators on physical states.

    The M^4 time coordinate cannot have a vanishing time derivative dm^0/dt, so that the four-momentum is non-vanishing for non-vacuum extremals. For CP_2 coordinates the time derivatives ds^k/dt can vanish, and for space-like Minkowski coordinates dm^i/dt can be assumed to be non-vanishing if the M^4 projection is 4-dimensional. For CP_2 coordinates ds^k/dt = 0 implies the vanishing of the electric parts of the induced gauge fields. The non-vacuum extremals with the largest conformal gauge symmetry (very small n) would correspond to cosmic string solutions, for which the induced gauge fields have only magnetic parts. As n increases, also electric parts are generated. The situation becomes increasingly dynamical as the conformal gauge symmetry is reduced and the dynamical conformal symmetry increases.

  4. The field equations involve, besides the imbedding space coordinates h^k, also their partial derivatives up to second order. The induced Kähler form and metric involve the first partial derivatives ∂_α h^k, and the second fundamental form appearing in the field equations involves the second order partial derivatives ∂_α ∂_β h^k.

    The field equations are hydrodynamical, in other words they represent conservation laws for the Noether currents associated with the isometries of M^4 × CP_2. By GCI there are only 4 independent dynamical variables, so that the conservation of m ≤ 4 isometry currents is enough if they are chosen to be independent. The dimension m of the tangent space spanned by the conserved currents can be smaller than 4. For vacuum extremals one has m = 0 and for massless extremals (MEs) m = 1! The conservation of these currents can also be interpreted as the existence of m ≤ 4 closed 3-forms defined by the duals of these currents.

  5. The hydrodynamical picture suggests that in some situations it might be possible to assign flow lines to the conserved currents even globally. They would define m ≤ 4 global coordinates for some subset of the conserved currents (4+8 for four-momentum and color quantum numbers). Without additional conditions the individual flow lines are well-defined but do not organize into a coherent hydrodynamic flow: they are more like the orbits of randomly moving gas particles. To achieve a global flow the flow lines must satisfy the condition dφ^A/dx^μ = k_AB J^B_μ, or dφ^A = k_AB J^B, so that one can speak of a 3-D family of flow lines parallel to k_AB J^B at each point - I have considered this kind of possibility in detail earlier, but the treatment was not as general as in the recent case.

    Frobenius integrability conditions follow from d^2φ^A = dk_AB ∧ J^B + k_AB dJ^B = 0 and imply that dJ^B is in the ideal of the exterior algebra generated by the J^A appearing in k_AB J^B. If the Frobenius conditions are satisfied, the field equations can define coordinates for which the coordinate lines are along the basis elements for a sub-space of the at most 4-D space defined by the conserved currents. Of course, the possibility that for preferred extremals there exist m ≤ 4 conserved currents satisfying the integrability conditions is only a conjecture.

    It is quite possible to have m < 4. For instance, for vacuum extremals the currents vanish identically. For MEs the various currents are parallel and light-like, so that only a single light-like coordinate can be defined globally from the flow lines. For cosmic strings (Cartesian products of minimal surfaces X^2 in M^4 and geodesic spheres S^2 in CP_2) 4 independent currents exist. This is expected to be true also for the deformations of cosmic strings defining magnetic flux tubes.

  6. Cauchy-Riemann conditions in the 2-D situation represent a special case of Frobenius conditions. Now the gradients of the real and imaginary parts of a complex function w = w(z) = u+iv define two conserved currents by the Laplace equations (see the symbolic check after this list). In TGD the isometry currents would be gradients apart from scalar function multipliers, and one would have a generalization of the C-R conditions. I have earlier considered the possibility that the generalization of the Cauchy-Riemann-Fueter conditions could define quaternion analyticity - having many non-equivalent variants - as a defining property of preferred extremals. The integrability conditions for the isometry currents would be the natural physical formulation of the CRF conditions. Different variants of the CRF conditions would correspond to a varying number of independent conserved isometry currents.

  7. This picture allows one to consider a generalization of the notion of a solution of the field equations to that of an integral manifold. If the number of independent isometry currents is smaller than 4 (possibly locally) and the integrability conditions hold true, lower-dimensional sub-manifolds of the space-time surface define integral manifolds as a kind of lower-dimensional effective solutions. Genuinely lower-dimensional solutions would of course have vanishing metric determinant g_4^(1/2) and vanishing Kähler action.

    String world sheets can be regarded as 2-D integral surfaces. Charged (possibly all) weak boson gauge fields vanish at them, since otherwise the electromagnetic charge for spinors would not be well-defined. These conditions force string world sheets to be 2-D in the generic case. In a special case a 4-D space-time region as a whole can satisfy these conditions. Well-definedness of the Kähler-Dirac equation demands that the isometry currents of Kähler action flow along these string world sheets, so that one has an integral manifold. The integrability conditions would allow 2 < m ≤ 4 integrable flows outside the string world sheets, and at the string world sheets one or two isometry currents would vanish, so that the flows would give rise to an independent 2-D sub-flow.

  8. The method of characteristics is used to solve hyperbolic partial differential equations by reducing them to ordinary differential equations. The (say 4-D) surface representing the solution in the field space has a foliation using 1-D characteristics. The method is especially simple for linear equations but can work also in the non-linear case. For instance, the expansion of wave front can be described in terms of characteristics representing light rays. It can happen that two characteristics intersect and a singularity results. This gives rise to physical phenomena like caustics and shock waves.

    In the TGD framework the flow lines for a given isometry current in the case of an integrable flow would be analogous to characteristics, and one could also have purely geometric counterparts of shock waves and caustics. The light-like orbits of partonic 2-surfaces, at which the signature of the induced metric changes from Minkowskian to Euclidian, might be seen as an example of the analog of a wave front in the induced geometry. These surfaces serve as carriers of fermion lines in generalized Feynman diagrams. Could one see the particle vertices, at which the 4-D space-time surfaces intersect along their ends, as analogs of intersections of characteristics - a kind of caustics? At these 3-surfaces the isometry currents should be continuous, although the space-time surface has an "edge".
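
The 2-D special case of point 6 is easy to verify symbolically. A small sympy sketch checking that for an analytic function w(z) = u + iv the Cauchy-Riemann conditions hold and u, v are harmonic, so that their gradients define two conserved (divergence-free) currents:

```python
# For analytic w(z) = u + i*v the gradients of u and v are conserved currents:
# Cauchy-Riemann conditions hold and both u and v satisfy the Laplace equation.
import sympy as sp

x, y = sp.symbols('x y', real=True)
w = (x + sp.I * y) ** 3            # any analytic function will do
u, v = sp.re(sp.expand(w)), sp.im(sp.expand(w))

# Cauchy-Riemann: u_x = v_y and u_y = -v_x
assert sp.simplify(sp.diff(u, x) - sp.diff(v, y)) == 0
assert sp.simplify(sp.diff(u, y) + sp.diff(v, x)) == 0

# Conservation: div(grad u) = div(grad v) = 0, i.e. u and v are harmonic.
for f in (u, v):
    assert sp.simplify(sp.diff(f, x, 2) + sp.diff(f, y, 2)) == 0

print("Cauchy-Riemann and Laplace conditions verified for w = z^3")
```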

For details see the chapter Recent View about Kähler Geometry and Spin Structure of "World of Classical Worlds" of "Quantum physics as infinite-dimensional geometry" or the article Could One Define Dynamical Homotopy Groups in WCW?.

For a summary of earlier postings see Links to the latest progress in TGD.

Saturday, August 22, 2015

Does color deconfinement really occur?

Bee had a nice blog posting related to the origin of hadron masses and the phase transition from color confinement to quark-gluon plasma, involving also the restoration of chiral symmetry in the sigma model description. In the ideal situation the outcome should be a blackbody spectrum with no correlations between the radiated particles.

This is however not the situation. Some kind of transition occurs and produces a phase which has a much lower viscosity than expected for a quark-gluon plasma. The transition also occurs in a much smoother manner than expected. And there are strong correlations between oppositely charged particles - charge separation occurs. The simplest characterization of these events would be in terms of decaying strings emitting particles of opposite charge from their ends. Conventional models do not predict anything like this.

Some background

The masses of current quarks are very small - something like 5-20 MeV for u and d. These masses explain only a minor fraction of the mass of the proton. The old-fashioned quark model assumed that quark masses are much bigger: the mass scale was roughly one third of the nucleon mass. These quarks were called constituent quarks, and - if they are real - one can wonder how they relate to current quarks.

The sigma model provides a phenomenological description of the massivation of hadrons in the confined phase. The model is highly analogous to the Higgs model. The fields are meson fields and baryon fields. Now the neutral pion and the sigma meson develop vacuum expectation values, and this implies breaking of chiral symmetry so that nucleons become massive. The existence of the sigma meson is still questionable.

In a transition to quark-gluon plasma one expects that mesons and protons disappear totally. The sigma model however suggests that the pion and the proton do not disappear but become massless. Hence the two descriptions might be inconsistent.

The authors of the article assume that the pion continues to exist as a massless particle in the transition to quark-gluon plasma. The presence of massless pions would yield a small effect at low energies, at which massless pions have a stronger interaction with the magnetic field than massive ones. The existence of a magnetic wave coherent over a rather large length scale is an additional assumption of the model: it corresponds to the assumption about large h_eff in the TGD framework, where color magnetic fields associated with M_89 meson flux tubes replace the magnetic wave.

In the TGD framework the sigma model description is at best a phenomenological description, as is also the Higgs mechanism. p-Adic thermodynamics replaces the Higgs mechanism, and the massivation of hadrons involves color magnetic flux tubes connecting valence quarks to color singlets. The flux tubes have a quark and an antiquark at their ends and are meson-like in this sense. Color magnetic energy contributes most of the mass of the hadron. A constituent quark would correspond to a valence quark identified as a current quark plus the associated flux tube, and its mass would be in a good approximation the mass of the color magnetic flux tube.

There is also an analogy with the sigma model provided by twistorialization in the TGD sense. One can assign to a hadron (actually to any particle) a light-like 8-momentum vector in the tangent space M^8 = M^4 × E^4 of M^4 × CP_2 defining the 8-momentum space. Masslessness in the 8-D sense implies that the ordinary mass squared corresponds to a constant E^4 mass, which translates to a localization to a 3-sphere in E^4. This localization is analogous to the symmetry breaking generating a constant value of the π^0 field proportional to its mass in the sigma model.
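As a sketch in formulas (my notation, assuming standard Minkowskian signature conventions): light-likeness of the 8-momentum states that

    p_8^2 = p_{M^4}^2 - p_{E^4}^2 = 0 \quad\Rightarrow\quad p_{M^4}^2 = p_{E^4}^2 \equiv m^2 ,

so that fixing the ordinary mass squared m^2 indeed selects a 3-sphere of radius m in E^4, in analogy with the selection of a point of the vacuum manifold by a constant field value in the sigma model.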

An attempt to understand the charge asymmetries in terms of the chiral magnetic wave and charge separation

One of the models trying to explain the charge asymmetries is in terms of what is called the chiral magnetic wave effect and the charge separation effect related to it. The experiment discussed by Bee attempts to test this model.

  1. The so-called chiral magnetic wave effect and charge separation effects are proposed as an explanation for the linear dependence of the asymmetry of the so-called elliptic flow on the charge asymmetry. Conventional models explain neither the charge separation nor this dependence. The chiral magnetic wave would be a coherent magnetic field generated by the colliding nuclei over a relatively long scale, even the length scale of the nuclei.

  2. Charged pions interact with this magnetic field. The interaction energy is roughly h×eB/E, where E is the energy of the pion. In the phase with broken chiral symmetry the pion mass is non-vanishing, and at low energies one has E = m in a good approximation. In the chirally symmetric phase the pion is massless and the magnetic interaction energy becomes large at low energies. This could serve as a signature distinguishing between the chirally symmetric and asymmetric phases.

  3. The experimenters try to detect this difference and report slight evidence for it. What is measured is the change of the charge asymmetry of the so-called elliptic flow for positively and negatively charged pions, interpreted in terms of a charge separation fluctuation caused by the presence of a strong magnetic field assumed to lead to a separation of chiral charges (left/right handedness). The average velocities of the pions are different, and the average velocity depends on the azimuthal angle in the collision plane: the second harmonic is in question (say sin(2φ)).

In the TGD framework the explanation of the unexpected behavior of the should-be quark-gluon plasma is in terms of M89 hadron physics.
  1. A phase transition indeed occurs, but it means a transition transforming the quarks of the ordinary M107 hadron physics to those of M89 hadron physics. They are not free quarks but confined to form M89 mesons. The M89 pion would have a mass of about 135 GeV. A naive scaling gives half of this mass, but it seems unfeasible that a pion-like state with this mass could have escaped attention - unless of course the unexpected behavior of the quark-gluon plasma demonstrates its existence! This should be easy for a professional to check (the arithmetic is sketched after this list). Thus the phase transition would yield a scaled-up hadron physics with a mass scale by a factor 512 higher than for the ordinary hadron physics.

  2. A stringy description applies to the decay of the flux tubes assignable to the M89 mesons to ordinary hadrons. This explains the charge separation effect and the deviation from the thermal spectrum.

  3. In the experiments discussed in the article the cm energy for the nucleon-nucleon system associated with the colliding nuclei varied in the range 27-200 GeV, so that the creation of even an on-mass-shell M89 pion in a single collision of this kind is possible at the highest energies. If several nucleons participate simultaneously, even many-pion states are possible at the upper end of the interval.

  4. These hadrons must have a large h_eff = n×h, since the collision time is roughly 5 femtoseconds, by a factor of about 500 (not far from 512!) longer than the time scale associated with their masses, if the M89 pion has the proposed mass of 135 GeV for the ordinary Planck constant - the scaling factor 2×512 instead of 512 being in principle allowed by the p-adic length scale hypothesis. There are some indications for a meson with this mass. The hierarchy of Planck constants allows at quantum criticality to zoom up the size of the much more massive M89 hadrons to nuclear size! The phase transition to dark M89 hadron physics could take place in the scale of the nucleus, producing several M89 pions decaying to ordinary hadrons.

  5. The large value of h_eff would mean quantum coherence in the scale of the nucleus, explaining why the value of the viscosity was much smaller than expected for a quark-gluon plasma. The phase transition was also much smoother than expected. Since nuclei are many-nucleon systems and the Compton wavelength of the M89 pion would be of the order of nucleus size, one expects that the phase transition can take place in a wide collision energy range. At lower energies several nucleon pairs could provide the energy needed to generate an M89 pion. At higher energies even a single nucleon pair could provide the energy. The number of M89 pions should therefore increase with the nucleon-nucleon collision energy and induce an increase of the charge asymmetry and of the strength of the charge asymmetry of the elliptic flow.

  6. Hydrodynamical behavior is essential in order to have low viscosity classically. Even more, the hydrodynamics had better be that of an ideal liquid. In the TGD framework the field equations have a hydrodynamic character as conservation laws for the currents associated with the various isometries of the imbedding space. The isometry currents define flow lines. Without further conditions the flow lines do not however integrate to a coherent flow: one has something analogous to a gas phase rather than a liquid, so that the mixing induced by the flow cannot be described by a smooth map.

    To achieve this, a given isometry flow must make sense globally - that is, it must define the coordinate lines of a globally defined coordinate ("time" along the flow lines). In this case one can assign to the flow a continuous phase factor as an order parameter varying along the flow lines. Superconductivity is an example of this. The so-called Frobenius conditions (for the one-form J dual to the isometry current, the integrability condition J ∧ dJ = 0) guarantee this; at least the preferred extremals could have this complete integrability property, making TGD an integrable theory (see the appendix of the article at my homepage). In the recent case, the dark flux tubes with the size scale of a nucleus would carry an ideal hydrodynamical flow with very low viscosity.
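The mass estimates of item 1 boil down to simple arithmetic. A minimal sketch in Python (illustration only; the masses and the scaling law are the ones quoted above):

    # p-adic length scale hypothesis: mass scales are proportional to 2^(-k/2),
    # so the step from M107 (ordinary hadrons) to M89 scales masses up by
    # 2^((107-89)/2) = 2^9 = 512.
    scale = 2 ** ((107 - 89) // 2)        # = 512
    m_pi = 0.135                          # ordinary pion mass in GeV

    naive = scale * m_pi                  # ~69 GeV: the "naive scaling" of item 1
    doubled = 2 * naive                   # ~138 GeV: the extra factor 2 in principle
                                          # allowed by the p-adic length scale hypothesis
    print(f"scaling factor {scale}: naive {naive:.1f} GeV, with factor 2: {doubled:.1f} GeV")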

See the chapter New Particle Physics Predicted by TGD: Part I or the article Does color deconfinement really occur?.

For a summary of earlier postings see Links to the latest progress in TGD.

Wednesday, August 19, 2015

Could one define dynamical homotopy groups in WCW?

I learned that Agostino Prastaro has done highly interesting work with partial differential equations, also those assignable to geometric variational principles such as the Kähler action in TGD. I do not understand the mathematical details, but the key idea is a simple and elegant generalization of Thom's cobordism theory, and it is difficult to avoid the idea that the application of Prastaro's idea might provide insights about the preferred extremals, whose identification is now on a rather firm basis.

One could also consider a definition of what one might call dynamical homotopy groups as genuine characteristics of WCW topology. The first prediction is that the values of the conserved classical Noether charges correspond to disjoint components of WCW. Could the natural topology in the parameter space of the Noether charges (zero modes of the WCW metric) be p-adic and realize adelic physics at the level of WCW? An analogous conjecture was made on the basis of the spin glass analogy a long time ago. A second surprise is that only the 6 lowest dynamical homotopy/homology groups of WCW would be non-trivial. The Kähler structure of WCW suggests that only Π_0, Π_2, and Π_4 are non-trivial.

The interpretation of the analog of Π_1 would be in terms of deformations of generalized Feynman diagrams, with an elementary cobordism snipping away a loop serving as a move leaving the scattering amplitude invariant. This conforms with the number-theoretic vision about the scattering amplitude as a representation for a sequence of algebraic operations, which can always be reduced to a tree diagram. TGD would indeed be a topological QFT: only the dynamical topology would matter.

For details see the chapter Recent View about Kähler Geometry and Spin Structure of "World of Classical Worlds" of "Quantum physics as infinite-dimensional geometry" or the article Could One Define Dynamical Homotopy Groups in WCW?.

For a summary of earlier postings see Links to the latest progress in TGD.

Tuesday, August 18, 2015

Hydrogen sulfide superconducts at -70 degrees Celsius!

The newest news is that hydrogen sulfide - the compound responsible for the smell of rotten eggs - conducts electricity with zero resistance at a record high temperature of 203 Kelvin (-70 degrees C), reports a paper published in Nature. This superconductor however suffers from a serious existential crisis: it behaves very much like an old-fashioned superconductor, for which superconductivity is believed to be caused by lattice vibrations, and is therefore not allowed to exist in the world of standard physics! To be or not to be!

The TGD Universe however allows all flowers to bloom: the interpretation is that the mechanism involves a large enough value of h_eff = n×h, implying that the critical temperature scales up. Perhaps it is not a total accident that hydrogen sulfide H2S - chemically analogous to water - results from the bacterial breakdown of organic matter, which according to TGD is a high temperature superconductor at room temperature and mostly water, which is absolutely essential for the properties of living matter in the TGD Universe.

See the earlier posting about pairs of magnetic flux tubes carrying the dark electrons of Cooper pairs as an explanation of high Tc (and maybe also low Tc) superconductivity.

For a summary of earlier postings see Links to the latest progress in TGD.

About negentropic entanglement as an analog of an error correction code

In classical computation, the simplest manner to control errors is to take several copies of the bit sequences. In the quantum case the no-cloning theorem prevents this. Error correcting codes (see https://en.wikipedia.org/wiki/Quantum_error_correction) code n information qubits into the entanglement of N > n physical qubits. Additional constraints represent the subspace of n qubits as a lower-dimensional subspace of the N qubits. This redundant representation is analogous to the use of parity bits. The failure of a constraint to be satisfied tells that an error is present and also reveals the character of the error. This makes possible the automatic correction of the error if it is simple enough - such as a change of the phase of a spin state or a spin flip.
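As a concrete classical reference point, here is a minimal sketch of the repetition-code strategy mentioned above (plain Python, purely illustrative); the quantum codes replace this copying, forbidden by no-cloning, with entanglement:

    from collections import Counter

    def encode(bit):
        # classical repetition code: three redundant copies of one bit
        return [bit] * 3

    def correct(codeword):
        # majority vote recovers the bit if at most one copy was flipped
        return Counter(codeword).most_common(1)[0][0]

    word = encode(1)
    word[2] ^= 1            # a single bit-flip error
    print(correct(word))    # -> 1: the error is corrected automatically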

Negentropic entanglement (NE) obviously gives rise to a strong reduction in the number of states of the tensor product. Consider a system consisting of two entangled systems consisting of N1 and N2 spins. Without any constraints the number of states in the state basis is 2^(N1) × 2^(N2) and one has N1 + N2 qubits. The elements of the entanglement matrix can be written as E_(A,B), A ≡ ⊗_(i=1..N1) (m_i, s_i), B ≡ ⊗_(k=1..N2) (m_k, s_k), in order to make the tensor product structure manifest. For simplicity one can consider the situation N1 = N2 = N.

The un-normalized general entanglement matrix is parametrized by 2 × 2^(2N) independent real numbers, with each spin contributing two degrees of freedom. A unitary entanglement matrix is characterized by 2^(2N) real numbers. One might perhaps say that one has 2^(2N) real bits instead of almost 2^(2N+1) real qubits. If the time evolution according to ZEO respects the negentropic character of entanglement, the sources of errors are reduced dramatically.
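The counting can be spelled out with a minimal sketch (illustration only, reproducing the parameter counts of the paragraph above):

    # A generic complex 2^N x 2^N entanglement matrix has 2 * 2^(2N) real
    # parameters; a unitary one has 2^(2N), the dimension of the group U(2^N).
    for N in range(1, 5):
        d = 2 ** N
        print(f"N={N}: generic {2 * d * d} real parameters, unitary U({d}): {d * d}")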

The challenge is to understand what kind of errors NE eliminates and how the information bits are coded by it. NE is respected if the errors act as unitary automorphisms E → UEU^(-1) of the unitary entanglement matrix. One can consider two interpretations.

  1. The unitary automorphisms leave the information content unaffected only if they commute with E. In this case the unitary automorphisms acting non-trivially would give rise to genuine errors, and an error correction mechanism would be needed and would be coded into the quantum computer program.

  2. One can also consider the possibility that the unitary automorphisms do not affect the information content, so that the diagonal form of the entanglement matrix coded by N phases would carry the information. Clearly, the unitary automorphisms would act like gauge transformations. Nature would take care that no errors emerge. Of course, more dramatic things are in principle allowed by NMP: for instance, the unitary entanglement matrix could reduce to a tensor product of several unitary matrices. Negentropy could be transferred from the system and is indeed transferred as the computation halts.

    By number theoretic universality the diagonalized entanglement matrix would be parametrized by N roots of unity, each having n possible values, so that n^N different NEs would be obtained and the information storage capacity would be I = log(n)/log(2) × N bits; for n = 2^k one would have k×N bits (a small numerical sketch follows after this list). Powers of two for n are favored. Clearly, the option for which only the eigenvalues of E matter looks like the more attractive realization of entanglement matrices. If the overall phase of E does not matter, as one expects, the number of full bits is k×(N−1).

    In fact, Fermat polygons, for which the cosine and sine of the angle defining the polygon are expressible by iterating square roots besides the basic arithmetic operations for rationals (ruler-and-compass construction geometrically), correspond to integers which are products of a power of two and of different Fermat primes F_n = 2^(2^n) + 1.
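A minimal numerical sketch of the storage capacity estimate (the function name is mine; n and N are the quantities defined above):

    from math import log2

    def ne_capacity_bits(N, n):
        # N eigenvalue phases, each an n:th root of unity:
        # n^N alternatives, i.e. I = N*log2(n) bits; k*N bits for n = 2^k
        return N * log2(n)

    print(ne_capacity_bits(N=10, n=2**3))   # k=3: 30.0 bits
    print(ne_capacity_bits(N=10, n=3))      # not a power of two: ~15.8 bits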

This picture can be related to a much bigger picture.
  1. In the TGD framework number theoretical universality requires discretization in terms of an algebraic extension of rationals. This is not performed at the space-time level but for the parameters characterizing space-time surfaces at the level of WCW. The strong form of holography is also essential and allows one to consider partonic 2-surfaces and string world sheets as basic objects. Number theoretical universality (adelic physics) forces a discretization of phases, and the number theoretically allowed phases are roots of unity defined by some algebraic extension of rationals. Discretization can also be interpreted in terms of finite measurement resolution. Notice that the condition that roots of unity are in question realizes finite measurement resolution in the sense that errors have a minimum size and are thus detectable.

  2. The hierarchy of quantum criticalities corresponds to a fractal inclusion hierarchy of isomorphic sub-algebras of the super-symplectic algebra acting as conformal gauge symmetries. The generators in the complement of this algebra can act as dynamical symmetries affecting the physical states. An infinite hierarchy of gauge symmetry breakings is the outcome, and the weakening of measurement resolution would correspond to a reduction in the size of the broken gauge group. The hierarchy of quantum criticalities is accompanied by a hierarchy of measurement resolutions and a hierarchy of effective Planck constants h_eff = n×h.

  3. These hierarchies are argued to correspond to the hierarchy of inclusions for hyperfinite factors of type II_1 labelled by quantum phases and quantum groups. Inclusion defines a finite measurement resolution, since the included sub-algebra does not induce observable effects on the state. By the McKay correspondence the hierarchy of inclusions is accompanied by a hierarchy of simply laced Lie groups, which get bigger as one climbs up in the hierarchy. Their interpretation as genuine gauge groups does not make sense, since the sizes of the gauge groups should be reduced. An attractive possibility is that these groups are factor groups G/H such that the normal subgroup H (necessarily so) is the gauge group and indeed gets smaller, while G/H is the dynamical group identifiable as the simply laced group which gets bigger. This would require that both G and H are infinite-dimensional groups.

    An interesting question is how they relate to the super-symplectic group assignable to the "light-cone boundary" δM^4_± × CP_2. I have proposed this interpretation in the context of WCW geometry earlier.

  4. Here I have spoken only about dynamical symmetries defined by discrete subgroups of simply laced groups. I have earlier considered the possibility that discrete symmetries provide a description of finite resolution, which would be equivalent with the quantum group description.

Summarizing, these arguments boil down to the conjecture that discrete subgroups of these groups act as effective symmetry groups of entanglement matrices and realize finite quantum measurement resolution. A very deep connection between quantum information theory and these hierarchies would exist.

Gauge invariance has turned out to be a fundamental symmetry principle, and one can ask whether unitary entanglement matrices - assuming that only the eigenvalues matter - could give rise to a simulation of discrete gauge theories. Could the reduction of the information to that provided by the diagonal form be interpreted as an analog of gauge invariance?

  1. The hierarchy of inclusions of hyperfinite factors of type II_1 strongly suggests a hierarchy of effective gauge invariances characterizing measurement resolution, realized in terms of a hierarchy of normal subgroups and dynamical symmetries realized as coset groups G/H. Could these effective gauge symmetries allow one to realize unitary entanglement matrices invariant under these symmetries? (A small numerical illustration follows after this list.)

  2. A natural parametrization for single qubit errors is as rotations of the qubit. If the error acts as a rotation on all qubits, the rotational invariance of the entanglement matrix defining the analog of an S-matrix is enough to eliminate the effect on information processing.

    Quaternionic unitary transformations act on qubits as unitary rotations. Could one assume that complex numbers as the coefficient field of quantum mechanics are effectively replaced with quaternions? If so, the multiplication of states by a unit quaternion would leave the physics and the information content invariant, just like multiplication by a complex phase leaves them invariant in standard quantum theory.

    One could consider the possibility that quaternions act as a discretized version of local gauge invariance affecting the information qubits and thus reducing further their number and thereby also the errors. This requires the introduction of the analog of a gauge potential and the coding of quantum information in terms of SU(2) gauge invariants. In the discrete situation the gauge potential would be replaced with non-integrable phase factors along the links of a lattice, as in lattice gauge theory. In the TGD framework the links would correspond to the fermionic strings connecting partonic 2-surfaces carrying the fundamental fermions at the string ends as point-like particles. Fermionic entanglement is indeed between the ends of these strings.

  3. Since entanglement is multilocal and quantum groups accompany the inclusions, one cannot avoid the question whether the Yangian symmetry crucial for the formulation of quantum TGD could be involved.
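A minimal numerical sketch of the gauge-like invariance discussed above (numpy, purely illustrative): the eigenvalue spectrum - the proposed information content - of a unitary entanglement matrix E is unchanged under the automorphism E → UEU^(-1).

    import numpy as np

    rng = np.random.default_rng(0)

    def random_unitary(d):
        # QR decomposition of a random complex matrix, with column phases fixed
        q, r = np.linalg.qr(rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d)))
        return q * (np.diag(r) / np.abs(np.diag(r)))

    d = 4                          # two qubits
    E = random_unitary(d)          # unitary entanglement matrix
    U = random_unitary(d)          # "error" acting as a unitary automorphism
    before = np.sort_complex(np.linalg.eigvals(E))
    after = np.sort_complex(np.linalg.eigvals(U @ E @ U.conj().T))
    print(np.allclose(before, after))   # True: the spectrum is invariant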
For details see the chapter Negentropy Maximization Principle or the article Quantum Measurement and Quantum Computation in TGD Universe.

For a summary of earlier postings see Links to the latest progress in TGD.

Sunday, August 16, 2015

Sleeping Beauty Problem

Lubos wrote polemically about the Sleeping Beauty Problem. The procedure is as follows.

Sleeping Beauty is put to sleep and a coin is tossed. If the coin comes up heads, Beauty will be awakened and interviewed only on Monday. If the coin comes up tails, she will be awakened and interviewed on both Monday and Tuesday; on Monday she will be put back to sleep with an amnesia-inducing drug. In either case, she will be awakened on Wednesday without an interview and the experiment ends. Any time Sleeping Beauty is awakened and interviewed, she is asked: "What is your belief now for the proposition that the coin landed heads?" No other communication is allowed, so that Beauty does not know whether it is Monday or Tuesday.

The question is about the belief of Sleeping Beauty on the basis of the information she has, not about the actual probability that the coin landed heads. If one wants to debate, one should imagine oneself in the position of Sleeping Beauty. There are two basic debating camps: halfers and thirders.

  1. Halfers argue that the outcome of the coin toss cannot in any manner depend on future events, and one has P(Heads) = P(Tails) = 1/2 just from the fact that the coin is fair. To me this view is obvious. Lubos has also this view. I however vaguely remember that years ago, when first encountering this problem, I was ready to take the thirder view seriously.

  2. Thirders argue in the following manner using conditional probabilities. One has P(Tails|Monday) = P(Heads|Monday) (P(X|Y) denotes the probability for X assuming Y). From the basic formula for conditional probabilities, P(X|Y) = P(X and Y)/P(Y), and from P(Monday) = P(Tuesday) = 1/2 (this actually follows from P(Heads) = P(Tails) = 1/2 in the experiment considered!), one obtains P(Tails and Tuesday) = P(Tails and Monday).

    Furthermore, one also has P(Tails and Monday) = P(Heads and Monday) (again from P(Heads) = P(Tails) = 1/2!), giving
    P(Tails and Tuesday) = P(Tails and Monday) = P(Heads and Monday). Since these three events are mutually exclusive and one of them must occur at an awakening, each probability must equal 1/3 (see the simulation sketch after this list). Since "Heads" implies that the day is Monday, one has P(Heads and Monday) = P(Heads) = 1/3, in conflict with the P(Heads) = 1/2 used in the argument. To me this looks like a paradox telling that some implicit assumption about probabilities in relation to time is wrong.
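One can make the disagreement concrete by simulating the protocol: the two camps are computing different ratios. A minimal sketch (illustration only; it does not by itself settle which ratio "belief" should mean):

    import random

    trials = 100_000
    heads_tosses = awakenings = heads_awakenings = 0

    for _ in range(trials):
        heads = random.random() < 0.5    # fair coin
        if heads:
            heads_tosses += 1
            awakenings += 1              # awakened on Monday only
            heads_awakenings += 1
        else:
            awakenings += 2              # awakened on Monday and Tuesday

    print(heads_tosses / trials)             # ~0.5:  fraction of tosses (halfer quantity)
    print(heads_awakenings / awakenings)     # ~0.33: fraction of awakenings (thirder quantity)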

To my opinion the basic problem in the argument of the thirders is their assumption that events occurring at different times can form a set of independent events. Also the difference between experienced and geometric time is involved in an essential manner when one speaks about amnesia.

When one speaks about independent events and their probabilities in physics, the events must be causally independent and occur at the same moment of time. This is crucial in the application of probability theory in quantum theory and also in classical theory. If time did not matter, one should be able to replace the time-line with a space-like line - say the x-axis. The counterparts of Monday, Tuesday, and Wednesday can be located on the x-axis with a mutual distance of, say, one meter. One cannot however realize the experimental situation, since the notion of space-like amnesia does not make sense! Or crystallizing it: independent events must have space-like separation. The arrow of time is also essential: for the conditional probabilities P(X|Y) used above, X occurs before Y, and this breaks the standard arrow of time.

This clearly demonstrates that philosophy and mathematics cannot be separated from physics and that the notion of time should be a fundamental issue in philosophy, mathematics, and physics alike!


Friday, August 14, 2015

About quantum measurement and quantum computation in TGD Universe

During the years I have been thinking how quantum computation could be carried out in the TGD Universe (see this). There are considerable deviations from the standard view. Zero Energy Ontology (ZEO), the weak form of NMP dictating the dynamics of state function reduction, negentropic entanglement (NE), and the hierarchy of Planck constants define the basic differences between TGD based and standard quantum measurement theory. TGD suggests also the importance of topological quantum computation (TQC) like processes, with braids represented as magnetic flux tubes/strings along them.

The natural question that popped up in my mind was how NMP and ZEO could affect the existing view about TQC. The outcome was a more precise view about TQC. The basic observation is that the phase transition to the dark matter phase reduces dramatically the noise affecting qubits. This together with the robustness of braiding as a TQC program raises excellent hopes about TQC in the TGD Universe. The restriction to negentropic space-like entanglement (NE) defined by a unitary matrix is something new but does not seem to have any fatal consequences, as the study of Shor's algorithm shows.

NMP strongly suggests that when a pair of systems - the ends of a braid - suffers a state function reduction, the NE must somehow be transferred from the system. How? The model for quantum teleportation allows one to identify a possible mechanism for achieving this. This mechanism could be a fundamental mechanism of information transfer also in living matter, and phosphorylation could represent the transfer of NE according to this mechanism: the transfer of metabolic energy would be at a deeper level a transfer of negentropy. Quantum measurements could actually be seen as transfers of negentropy at a deeper level.

For details see the chapter Negentropy Maximization Principle or the article Quantum Measurement and Quantum Computation in TGD Universe.

For a summary of earlier postings see Links to the latest progress in TGD.

Thursday, August 13, 2015

Flux tube description seems to apply also to low Tc superconductivity



Discussions with Hans Geesink have inspired a sharpening of the TGD view about bio-superconductivity (bio-SC) and high Tc superconductivity (SC), and a relating of this picture to the standard descriptions in a more detailed manner. In fact, also standard low temperature superconductivity modelled using BCS theory could be based on the same universal mechanism involving pairs of magnetic flux tubes, possibly forming flattened square like closed flux tubes, with the members of the Cooper pairs residing at them.

A brief summary about strengths and weakness of BCS theory

First I try to summarize what I remember about BCS theory.

  1. BCS theory is successful for 3-D superconductors and explains a lot: the supracurrent, diamagnetism, and the thermodynamics of the superconducting state; it has correlated many experimental data in terms of a few basic parameters.

  2. BCS theory also has its failures.

    1. The dependence on crystal structure and chemistry is not well understood: it is not possible to predict which materials are superconducting and which are not.

    2. High-Tc SC is not understood. Antiferromagnetism is known to be important. A quite recent experiment demonstrates conductivity - maybe even superconductivity - in a topological insulator in the presence of a magnetic field (see this). This is a complete paradox and suggests in the TGD framework that the flux tubes of the external magnetic field serve as the wires (see the previous posting).

  3. The BCS model is based on crystalline long range order and a k-space description (Fermi sphere). BCS-difficult materials have short range structural order: amorphous alloys, SC metal particles down to 50 Angstroms (the thickness scale of the lipid layer of a cell membrane), transition metals, alloys, compounds. A real space description rather than a k-space description based on crystalline order seems more natural. Could it be that the description of the electrons of the Cooper pair is not correct? If so, k-space and the Fermi sphere would be an appropriate description only for the ordinary electrons needed to model the transition to superconductivity. Superconducting electrons could require a different description.

  4. Local chemical bonding/real molecular descriptions have been proposed. This is of course very natural in the standard physics framework, since the standard view about magnetic fields does not provide any ideas about Cooper pairing, and magnetic fields are only a nuisance rather than something making SC possible. In the TGD framework the situation is different.

TGD based view about SC

The TGD proposal for high Tc SC and bio-SC relies on many-sheeted space-time and the TGD based view about dark matter as a h_eff = n×h phase of ordinary matter emerging at quantum criticality (see this).

Pairs of dark magnetic flux tubes would be the wires carrying dark Cooper pairs, with the members of the pair residing at the two tubes. If the members of the flux tube pair carry opposite magnetic fields, the Cooper pairs have spin 0. The magnetic interaction energy with the flux tube is what determines the critical temperature. High Tc superconductivity, in particular the presence of two critical temperatures, can be understood. Also the role of antiferromagnetism can be understood.

The TGD model is clearly an x-space model: dark flux tubes are an x-space concept. Momentum space and the notion of the Fermi sphere are certainly useful in understanding the transformation of ordinary lattice electrons to dark electrons at the flux tubes, but the superconducting electron pairs at the flux tubes would have a different description.

Now come the heretic questions.

  1. Do the crystal structure and chemistry define the (only) fundamental parameters in SC? Could the notion of magnetic body - which of course can correlate with crystal structure and chemistry - be an equally important or even more important notion?

  2. Could also ordinary BCS SC be based on magnetic flux tubes? Is the value of h_eff = n×h only considerably smaller, so that low temperatures are required since the energy scale is the cyclotron energy scale E = h_eff × f_c = n×h×f_c, f_c = eB/m_e? High Tc SC would only have a larger h_eff, and bio-superconductivity a still larger h_eff! (A numerical sketch follows after this list.)

  3. Could it be that also in low Tc SC there are dark flux tube pairs carrying dark magnetic fields in opposite directions, with the Cooper pairs flowing along these pairs? The pairs could actually form closed loops: kind of flattened O's or flattened squares.
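A minimal numerical sketch of the cyclotron energy scale of item 2 (illustration only; the 2π in f_c = eB/(2π m_e) is my convention for the frequency, and k_B T_c ~ E is taken as a rough thermal stability condition):

    import math

    e, h, k_B, m_e = 1.602e-19, 6.626e-34, 1.381e-23, 9.109e-31  # SI units

    def critical_temperature(B_tesla, n):
        f_c = e * B_tesla / (2 * math.pi * m_e)   # cyclotron frequency, ~28 GHz/T
        return n * h * f_c / k_B                  # k_B*T_c ~ E = h_eff*f_c = n*h*f_c

    for n in (1, 100, 10_000):                    # hypothetical values of h_eff/h
        print(f"n={n}: T_c ~ {critical_temperature(1.0, n):.3g} K")

For B = 1 T and the ordinary Planck constant (n = 1) this gives T_c of order 1 K; a large n scales T_c up proportionally, in line with the idea that high Tc SC and bio-SC correspond to increasingly large h_eff.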

One must also be able to understand the Meissner effect. Why would dark SC prevent the penetration of the ordinary magnetic field into the superconductor?
  1. Could B_ext actually penetrate the SC at its own space-time sheet? Could an opposite field B_ind at its own space-time sheet effectively interfere it to zero? In TGD this would mean the generation of a space-time sheet with B_ind = -B_ext, so that a test particle experiences a vanishing B. This is obviously new: fields do not superpose, only the effects caused by them superpose.

    Could dark or ordinary flux tube pairs carrying B_ind be created such that the flux tube portion carrying B_ind in the interior cancels the effect of B_ext on the charge carriers? The return flux of the closed flux tube of B_ind would run outside the SC and amplify the detected field B_ext outside the SC - just as observed.

  2. What happens when B_ext penetrates the SC? h_eff → h must take place for the dark flux tubes, whose cross-sectional area - and perhaps also length - scales down by the factor h/h_eff, while the field strength increases by the factor h_eff/h. If also the flux tubes of B_ind are dark, they would shrink in the transition h_eff → h by the same factor and would remain inside the SC! B_ext would no longer be screened inside the superconductor and would be amplified outside it! The critical value of B_ext would mean criticality for this h_eff → h phase transition.

  3. Why and how does the phase transition destroying SC take place? Is it energetically impossible to build a too strong B_ind, so that the effective field B_eff = B_dark + B_ind + B_ext experienced by the electrons is reduced, the binding energy of the Cooper pair is reduced, and the pair becomes thermally unstable? This in turn would mean that the Cooper pairs generating the dark B_dark disappear and also B_dark disappears. SC disappears.

Addition: The newest news is that hydrogen sulfide - the compound responsible for the smell of rotten eggs - conducts electricity with zero resistance at a record high temperature of 203 Kelvin (-70 degrees C), reports a paper published in Nature. This superconductor however suffers from a serious existential crisis: it behaves very much like an old-fashioned superconductor, for which superconductivity is believed to be caused by lattice vibrations, and is therefore not allowed to exist in the world of standard physics! To be or not to be!

The TGD Universe however allows all flowers to bloom: the interpretation is that the mechanism involves a large enough value of h_eff = n×h, implying that the critical temperature scales up. Perhaps it is not a total accident that hydrogen sulfide H2S - chemically analogous to water - results from the bacterial breakdown of organic matter, which according to TGD is a high temperature superconductor at room temperature and mostly water, which is absolutely essential for the properties of living matter in the TGD Universe.

See the chapter Quantum model for bio-superconductivity: II.

For a summary of earlier postings see Links to the latest progress in TGD.