https://matpitka.blogspot.com/

Saturday, October 19, 2024

Zero energy ontology, holography = holomorphy vision and TGD view of qualia

Zero energy ontology (ZEO) and the holography = holomorphy vision, which provides an exact solution of the classical field equations, make it possible to solve some earlier problems of the TGD inspired theory of consciousness and to sharpen earlier interpretations. The holography = holomorphy vision generalizes 2-D conformal invariance to the 4-D situation and provides a universal solution of the field equations in terms of minimal surfaces defined as roots of pairs of generalized analytic functions of the generalized complex coordinates of H = M4×CP2 (one of the coordinates is a hypercomplex coordinate with light-like coordinate curves) (see this and this).
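A quick dimension count (my own worked note, not from the source) shows why the roots of a pair of analytic functions give 4-D space-time surfaces:

```latex
% H = M^4 \times CP_2 is 8-dimensional:
%   dim_R H = 4 + 4 = 8.
% Each root condition f_i = 0 (i = 1,2) is one complex, i.e. two real, equations,
% so the solution surface X^4 has
\dim_{\mathbb{R}} X^4 \;=\; \dim_{\mathbb{R}} H - 2\cdot 2 \;=\; 8 - 4 \;=\; 4,
% the dimension of a space-time surface.
```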

Consider first the implications of ZEO (see this and this).

  1. ZEO predicts that in "big" state function reductions (BSFRs), which are the counterparts of ordinary SFRs, the arrow of time changes. "Small" SFRs (SSFRs) are the counterparts of repeated measurements of the same observables, which in standard QM leave the system unaffected (Zeno effect). In SSFRs, however, the state of the system changes while the arrow of time is preserved. This has profound implications for the understanding of basic facts about consciousness.
  2. The sequence of SSFRs corresponds to a sequence of steps in the finite-dimensional space of causal diamonds CD = cd×CP2 (see this): each step is a delocalization (dispersion) followed by a localization, analogous to a position measurement, in the moduli parameterizing the CD. This sequence gives rise to subjective existence, the self.
  3. BSFR is interpreted as the death of the self, defined as a sequence of "small" SFRs (SSFRs), accompanied by reincarnation with an opposite arrow of geometric time; it corresponds to falling asleep or even to death. Death is therefore a completely universal phenomenon. The next BSFR means birth with the original arrow of time: it can be waking up the next morning or a reincarnation taking place considerably later, the lifetime being the first guess for the time scale. This follows from the fact that the causal diamond CD = cd×CP2 increases in size during the sequence of SSFRs.
  4. What forces ZEO is holography, which is slightly non-deterministic: already a 2-D minimal surface, realized as a soap film, is not fixed uniquely by the frame spanning it. This means that the 4-D space-time surface, located inside the CD and identifiable as an analog of a Bohr orbit determined by holography, must be taken as the basic object instead of a 3-surface. In SSFRs, the state at the passive light-like boundary of the CD is unaffected, just as in the Zeno effect, but the state at the active boundary changes. Due to the dispersion in the space of CDs, the size of the CD increases in the statistical sense, and the geometric time, identifiable as the distance between the tips of the CD, increases and correlates with the subjective time identifiable as the sequence of SSFRs.
  5. In standard quantum theory, the association of conscious experience with SFRs does not allow us to understand conscious memories, since the final state of a state function reduction does not contain any information about the earlier states and state function reductions. Zero energy ontology leads to a concrete view of how conscious memories can be realized in the TGD Universe (see this). The superposition of space-time surfaces between the fixed initial state and the changing final state of the SSFR contains classical information about the previous states and state function reductions and makes memory possible. The slight non-determinism of the classical time evolution implies loci of non-determinism as analogs of soap film frames, and memory recall corresponds to a quantum measurement at these memory seats.
  6. SSFRs correspond to repeated measurements of the same observables, and the eigenvalues of the measured observables characterize the conscious experience, the "qualia", partially. New commuting observables related to the non-determinism can also appear, and the set of observables can also be reduced in size. The superposition of the space-time surfaces as analogs of non-deterministic Bohr orbits however changes in the sequence of SSFRs, and the associated classical information changes; this can give rise to conscious experiences, perhaps involving also the qualia that remain constant as long as the self exists.

    The eigenvalues associated with the repeatedly measured observables do not change during the sequence of SSFRs, and one can ask whether they can give rise to a conscious experience, which should be assignable to change. Could these constant qualia be experienced by a higher-level self experiencing the self as a subself defining a mental image? This higher-level self would indeed experience the birth and death of the subself and therefore its qualia.

    The observables at the passive boundary of the CD correspond to the qualia of the higher-level self, and the additional observables associated with SSFRs correspond to those of the self. They would be associated with self-measurements.

  7. Note that the self dies when the measured observables do not commute with those which are diagonalized at the passive boundary. It is quite possible that these kinds of temporary deaths take place all the time. This would allow learning by trial and error, making possible conscious intelligence and problem solving, since the algebraic complexity is bound to increase: this is formulated in terms of the Negentropy Maximization Principle (see this).
ZEO and holography = holomorphy vision allow us to understand some earlier problems of TGD inspired theory of consciousness and also to sharpen the existing views.

Two models for how sensory qualia emerge

Concerning sensory qualia (see this) I have considered two basic views.

  1. The first view is that the sensory perception corresponds to quantum measurements of some observables. Qualia are labelled by the measured quantum numbers.
  2. The second, physically motivated, view has been that qualia correspond to increments of quantum numbers in an SFR (see this). This view can be criticized, since the quantum numbers need not be well-defined for the initial state of the SFR. One can however modify this view: perhaps the redistribution of quantum numbers, leaving the total quantum numbers unaffected, is what gives rise to the sensory qualia.

    The proposed physical realization is based on the sensory capacitor model of qualia. Sensory receptors would be analogous to capacitors, and sensory perception would correspond to a dielectric breakdown. Sensory qualia would correspond to the increments of quantum numbers assignable to either cell membrane in the generalized dielectric breakdown. The total charge of the sensory capacitor would vanish, but the charges would be redistributed so that both membranes would end up with a vanishing charge. The membranes could also be replaced with the cell exterior and interior, or with the cell membrane and its magnetic body. Essential would be the emergence or disappearance of the charge separation.

    This picture conforms with the recent view of the role of electric and gravitational quantum coherence assignable to charged and massive systems. In particular, the electric Planck constant would be very large for charged systems like the cell, the neuron, and DNA, and in the dielectric breakdown and its time reversal its value would change dramatically. If this is the case, the dynamic character of the effective Planck constant, involving a phase transition of ordinary matter to dark matter and vice versa, would be essential for understanding qualia.

  3. As the above argument demonstrated, the qualia can be decomposed into internal and external qualia. The internal qualia correspond to self-measurements of the subself occurring in SSFRs, whereas the external qualia correspond to the qualia measured by the self having the subself as a mental image. The latter are not affected during the lifetime of the mental image. Whether the self can experience the internal qualia of the subself is far from clear. The sensory capacitor model suggests that this is the case, and the model for conscious memories suggests the same. The internal qualia would correlate with the classical dynamics of the space-time surfaces appearing in the superposition defining the zero energy state and would make possible not only conscious memory and memory recall, based on the failure of precise classical determinism, but also sensory qualia as subselves experienced as sensory mental images.
Geometric and flag manifold qualia and the model for the honeybee dance

One can decompose qualia into the qualia corresponding to the measurement of discrete observables like spin and what might be called geometric qualia, corresponding to a measurement of continuous observables like position and momentum. Finite measurement resolution however makes these observables discrete and is realized in the TGD framework in terms of a unique number-theoretic discretization of the space-time surface.

Especially interesting are the qualia assignable to the twistor spaces of M4 and CP2.

  1. Since these twistor spaces are flag manifolds, I have talked about flag-manifold qualia. Their measurement corresponds to a position measurement in the space of quantization axes for certain quantum numbers. For angular momentum this space would be S2 = SO(3)/SO(2), and localization in S2 would correspond to a selection of the quantization axis of spin. For CP2 = SU(3)/U(2), the space of the quantization axes for color charges corresponds to the 6-D flag manifold SU(3)/U(1)×U(1), which is identifiable as the twistor space of CP2.
  2. The twistor space of M4 can be identified locally as M4×S2, where S2 is the space of light-like rays from a point of M4. This space however has a non-trivial bundle structure, since for two points of M4 connected by a light-like ray the fibers intersect.
What is the corresponding flag manifold for M4?
  1. The counterpart of the twistor sphere would be SO(1,3)/ISO(2), where ISO(2) is the isotropy group of a massless momentum, identifiable as a semidirect product of rotations and translations of a 2-D plane. SO(1,3)/ISO(2) corresponds to the 3-D light-cone boundary (one boundary of the CD) rather than S2, since it has one additional light-like degree of freedom. Is the twistor space, as a flag manifold of the Poincare group, locally M4×SO(1,3)/ISO(2)? This is topologically 7-D but metrically 6-D. Since light rays are parametrized by S2, one can also consider replacing SO(1,3)/ISO(2) with S2, in which case the twistor space would be 6-D and would have a non-trivial bundle structure.
  2. Could one restrict M4 to E3 or to the hyperbolic 3-space H3, for which the light-cone proper time is constant? In these cases the bundle structure would trivialize. What about the restriction of M4 to the light-like boundaries of the CD? The restriction to a single boundary gives a non-trivial bundle structure but seems otherwise trivial. What about the union of the future and past boundaries of the CD? The bundle structure would be non-trivial at both boundaries, and there would also be light-like rays connecting the future and past light-like boundaries.

    The unions ∪i H3i(ai) of hyperbolic 3-spaces corresponding to different values a = ai of the light-cone proper time a emerge naturally in M8-H duality and could contain the loci of the singularities of space-time surfaces as analogs of frames of soap films. Also these would give rise to a non-trivial bundle structure.

    These identifications differ from the usual identification of the M4 twistor space as CP3: note that this usual identification is problematic, since it involves a compactification of M4 that is not consistent with the Minkowski metric. Holography = holomorphy vision in its recent form involves a general solution ansatz in terms of the roots f1 = 0 and f2 = 0 of two analytic functions f1 and f2 (see this), which identifies the twistor spheres of the twistor spaces of M4 and CP2, represented as metrically 6-D complex surfaces of H. The M4 twistor sphere corresponds to the light-cone boundary in this identification. The identification map also defines the cosmological constant as a scale-dependent dynamical parameter.
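As a quick check on the dimensions quoted in this section (my own counting, not claimed by the source):

```latex
\dim S^2 = \dim SO(3) - \dim SO(2) = 3 - 1 = 2, \\
\dim \frac{SU(3)}{U(1)\times U(1)} = 8 - 2 = 6, \\
\dim \frac{SO(1,3)}{ISO(2)} = 6 - 3 = 3 \quad (\text{the light-cone boundary}), \\
\dim\!\left( M^4 \times \frac{SO(1,3)}{ISO(2)} \right) = 4 + 3 = 7
\ \text{topologically, 6 metrically.}
```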

A basic application of the twistor space of CP2 has been the TGD based model (see this and this) for the findings of the topologist Barbara Shipman (see this), who made the surprising discovery that the twistor space of CP2, naturally assignable to quarks and color interactions, emerges in a model for the dance of the honeybee. This kind of proposal is nonsensical in the standard physics framework, but the predicted hierarchy of Planck constants and p-adic length scales makes possible scaled variants of both color and electroweak interactions, and there are a lot of empirical hints for the existence of this hierarchy, in particular for the existence of scaled-up variants of hadron physics, leading to a rather radical proposal for the physics of the Sun (see this).

Shipman found that the honeybee dance represents a position in SU(3)/U(1)×U(1) coding for the direction and distance of the food source in a 2-D plane! Why should this be the case? The explanation could be that the 6-D counterparts of the twistor spaces, ∪i H3(a=ai)×ISO(2) resp. SU(3)/U(1)×U(1), identified as roots of the analytic functions f1 resp. f2 (see this), have the space-time surface as their 4-D intersection, so that the honeybee dance would map a point of the flag manifold SU(3)/U(1)×U(1) to a point of M4×S2 or, locally, ∪i H3(a=ai)×ISO(2). The restriction to a 2-D subset of points could be due to the measurement of the distance to the food source, represented by a point of H3i (or M4).
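The 4-D intersection invoked above is what generic transversality in the 8-D embedding space gives (my own check, not an argument from the source): each surface fi = 0 has real codimension 2 in H, and codimensions add,

```latex
\mathrm{codim}\,(X_1 \cap X_2) = 2 + 2 = 4
\quad\Longrightarrow\quad
\dim\,(X_1 \cap X_2) = 8 - 4 = 4 .
```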

See the article Some objections against TGD inspired view of qualia or the chapter General Theory of Qualia.

For a summary of earlier postings see Latest progress in TGD.

For the lists of articles (most of them published in journals founded by Huping Hu) and books about TGD see this.

Friday, October 18, 2024

IIT and TGD

Gary Ehlenberg sent a link to an article about Integrated Information Theory of consciousness (IIT) (see this). The article gives a nice summary of IIT. Gary wondered whether quantum theory is completely left out. The suspicion of Gary was correct: there is no mention of quantum theory.

It is good to attach here the abstract of the article "Consciousness: here, there and everywhere?" by Tononi and Koch, published in the Philosophical Transactions of the Royal Society B, to give a general perspective.

The science of consciousness has made great strides by focusing on the behavioural and neuronal correlates of experience. However, while such correlates are important for progress to occur, they are not enough if we are to understand even basic facts, for example, why the cerebral cortex gives rise to consciousness but the cerebellum does not, though it has even more neurons and appears to be just as complicated. Moreover, correlates are of little help in many instances where we would like to know if consciousness is present: patients with a few remaining islands of functioning cortex, preterm infants, non-mammalian species and machines that are rapidly outperforming people at driving, recognizing faces and objects, and answering difficult questions.

To address these issues, we need not only more data but also a theory of consciousness: one that says what experience is and what type of physical systems can have it. Integrated information theory (IIT) does so by starting from experience itself via five phenomenological axioms: intrinsic existence, composition, information, integration and exclusion. From these it derives five postulates about the properties required of physical mechanisms to support consciousness.

The theory provides a principled account of both the quantity and the quality of an individual experience (a quale), and a calculus to evaluate whether or not a particular physical system is conscious and of what. Moreover, IIT can explain a range of clinical and laboratory findings, makes a number of testable predictions and extrapolates to a number of problematic conditions.

The theory holds that consciousness is a fundamental property possessed by physical systems having specific causal properties. It predicts that consciousness is graded, is common among biological organisms and can occur in some very simple systems. Conversely, it predicts that feed-forward networks, even complex ones, are not conscious, nor are aggregates such as groups of individuals or heaps of sand. Also, in sharp contrast to widespread functionalist beliefs, IIT implies that digital computers, even if their behaviour were to be functionally equivalent to ours, and even if they were to run faithful simulations of the human brain, would experience next to nothing.

The article lists the 5 basic postulates of IIT leading to a numerical measure for the level of consciousness of a system. I wrote about IIT years ago and compared it with the TGD inspired theory of consciousness (see this and this). It is interesting to take a fresh look at IIT since the mathematical and physical understanding of TGD has evolved dramatically during these 8 years.

  1. The basic criticism is raised already by the idea that conscious experience is a property of a system called consciousness. This reflects the materialistic view that conscious experience is a property of the system, just as mass is, and leads to well-known philosophical problems. Materialism leads to problems with free will, for instance.
  2. The key problem is what subjective existence means, and here materialism, idealism and dualism all fail. Here quantum theory comes to the rescue and allows us to assign subjective existence as experience to state function reduction (SFR), or rather to the interval between two SFRs. The SFRs would be those which in standard wave mechanics correspond to repeated measurements of the same observables and in that context would have no effect on the system. In the zero energy ontology of TGD the state of the system changes, and these "small" SFRs (SSFRs) give rise to the experienced flow of subjective time correlating with that of geometric time.
  3. Also the assumption that consciousness simply either exists or does not is too simplistic. Already Freud realized the id-ego-superego triality, and the physics-based picture strongly suggests that conscious entities form hierarchies just as physical systems do. There would very naturally exist a hierarchy of selves: they would have subselves, perhaps as mental images, and would themselves be subselves of higher-level selves. This would however be a dramatic deviation from the western world view. Although IIT assumes panpsychism, the lack of this realization reflects the brain-centered view of neuroscience, very analogous to the Earth-centered world view before the emergence of astrophysics.
  4. I saw no mention of the problem of time: what is the relation between the geometric time of physicists and the flow of subjective time, which is an essential element of conscious experience?
  5. About what death and sleep mean, IIT says nothing at the philosophical level. Loss of consciousness can be explained as a reduction of the level of integration (more or less the connectedness of the system) measured by the number Φ.
  6. Metabolic energy feed is essential for life and consciousness, and I saw no mention of this either.
There are 5 postulates which are proposed to give criteria for when a system is conscious.

1. Intrinsic existence

Cause-effect power is taken as a key criterion. Cause-effect power is understood classically, since quantum theory is not involved. Cause-effect power has several correspondences in TGD.

  1. In TGD, the classical correlate of cause-effect power at the space-time level is holography, which states that the 3-D data (the 3-surface) dictate the space-time surface as an analog of a Bohr orbit. There is however a slight failure of determinism, and this forces us to take these 4-D Bohr orbits as the basic objects. They are classical correlates of almost deterministic behavioral patterns, and SSFRs between different superpositions of Bohr orbits give rise to the subjective time evolution.
  2. In TGD "small" SFRs (SSFRs) are t quantum correlates of cause-effect power. "Big" SFRs (BSFRs) give rise to the death (sleep state) of the system and reincarnation with an opposite arrow of geometric time. Second BSFR means wake-up.

    BSFRs are essential for understanding biological processes like homeostasis. A pair of BSFRs means a sleep period during which the entropy of the system is reduced, and the system wakes up as a less entropic system. This is essential in the battle of living systems against the second law.

  3. The causal diamond (CD = cd×CP2) is the correlate of cause-effect power at the level of H = M4×CP2. cd has the geometry of a causal diamond, and its two light-like boundaries are in an asymmetric relation. At the passive boundary the states do not change in SSFRs; it can be said to be the causal agent. At the active boundary they change. Also, the size of the CD increases in the statistical sense, and geometric time corresponds to the increasing temporal distance between the tips of the CD. In a BSFR the roles of the active and passive boundaries of the CD change.

    I must admit that I did not understand the illustrations of cause-effect structure involving Boolean algebra. Boolean functions are one way to see causality. In physics, classical deterministic time evolution defines a more general cause-effect structure.

2. Composition

Systems are structured. In standard physics, where space-time is infinite and without topological structure, there is no fundamental definition of what this means, and only phenomenological models are possible. In TGD, the many-sheeted 3-space decomposes into a union of 3-surfaces which can fuse and decay, and these processes occur also in scales essential for life and consciousness. We even perceive the many-sheeted space-time and these processes directly, but our education makes it impossible to realize this.

3. Information

Cause-effect repertoire is taken as a basic concept behind the notion of information.

  1. In TGD, a cause-effect repertoire corresponds to the different 4-D Bohr orbits associated with the same 3-surface serving as holographic data. These are the space-time correlates of behaviors.
  2. As the algebraic complexity of the space-time surface increases, the size of the repertoire increases. The dimension of extension of rationals assignable to the space-time regions measures this complexity and is assumed to define effective Planck constant which in turn gives a measure for the scale of quantum coherence serving as a measure for the evolutionary level of the system. This means deviation from the standard quantum theory with single Planck constant. Field bodies as carriers of dark phases of ordinary particles means a second deviation made possible by the new view of classical fields.
  3. The number-theoretic view of TGD is something completely new and allows us to define the notion of conscious information. p-Adicization and adelization in turn give correlates of cognition, and one can assign to the system an entanglement negentropy as the sum of its p-adic variants. Entanglement negentropy is positive and increases with the complexity of the system. It is larger than the real entanglement entropy, and its increase implies the increase of the latter: cognition unavoidably produces ordinary entropy.
  4. The number-theoretic entanglement negentropy could be seen as a counterpart of integrated information and measures the cognitive level of the system and the level of cognitive consciousness.

    Number-theoretic evolution as an unavoidable increase of complexity in the sequence of state function reductions forces the increase of this entanglement negentropy, so that the potentially conscious information of the system necessarily increases.

  5. The ZEO based view of the quantum jump (see this, this and this) allows us to understand how systems are able to have memories of their states before SSFRs: in standard quantum theory this is not possible. Therefore the Universe, making SSFRs and BSFRs, learns more and more about itself and is able to remember what it has learned (see this).

    In IIT, the qualia space is identified as the cause-effect space. In the TGD framework, an SSFR leads to a final state containing information about the previous quantum state, since it is identified as a superposition of classical space-time surfaces leading from the fixed initial state at the passive boundary of the CD to the active boundary of the CD. The original proposal that qualia are simply labelled by the quantum numbers measured in the SSFR is not quite correct. The qualia also involve classical information about the SSFR via the superposition of space-time surfaces between the initial (fixed) and final classical states: this would be the counterpart of the cause-effect structure.
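The p-adic entanglement negentropy mentioned above can be illustrated with a small sketch (my own toy implementation; the definition Np = Σk Pk log |Pk|_p for rational entanglement probabilities, with |·|_p the p-adic norm, is an assumption distilled from the cited articles, and maximizing over primes p is the natural choice):

```python
from fractions import Fraction
from math import log

def p_adic_valuation(n: int, p: int) -> int:
    """Largest k such that p^k divides n (n > 0)."""
    v = 0
    while n % p == 0:
        n //= p
        v += 1
    return v

def p_adic_norm(q: Fraction, p: int) -> float:
    """p-adic norm |a/b|_p = p^(v_p(b) - v_p(a))."""
    return float(p) ** (p_adic_valuation(q.denominator, p)
                        - p_adic_valuation(q.numerator, p))

def p_adic_negentropy(probs, p: int) -> float:
    """Assumed definition: N_p = sum_k P_k * log(|P_k|_p).
    Positive when the prime p divides the denominators of the P_k."""
    return sum(float(P) * log(p_adic_norm(P, p)) for P in probs)

def shannon_entropy(probs) -> float:
    """Ordinary (real) entanglement entropy for comparison."""
    return -sum(float(P) * log(float(P)) for P in probs)

# Rational entanglement probabilities summing to 1:
probs = [Fraction(2, 5), Fraction(3, 5)]
N5 = p_adic_negentropy(probs, 5)  # = log 5, since |2/5|_5 = |3/5|_5 = 5
S = shannon_entropy(probs)        # Shannon entropy, smaller than N5
```

The example reproduces the qualitative claim of the text: the p-adic negentropy (here log 5 ≈ 1.61 for p = 5) exceeds the real entanglement entropy (≈ 0.67) for the same probabilities.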

4. Integration

The counterpart of integration in the TGD framework is entanglement.

  1. Entanglement entropy, to which one can assign an adelic negentropy, measures the degree of entanglement and integration. In an SFR the entanglement is reduced: the system decomposes into two parts. This is a basic aspect of conscious experience, about which IIT says nothing.
  2. Monopole flux tubes connecting the parts of the system to a single coherent whole provide a classical correlate for entanglement, and in SFRs the flux tube connections between the two parts of the system could split. More precisely, pairs of flux tubes connecting the subsystems reconnect to U-shaped flux tubes associated with the systems: the connection is split, and an SFR has occurred.
  3. In biology, reconnection is fundamental, for instance for bio-catalysis and for the recognition of molecules by the immune system. The death of the system means the splitting of these flux tubes. The flux tubes carry dark matter as large-heff phases, and there must be a metabolic energy feed to prevent the values of heff from decreasing. A decrease leads to a reduction of the cognitive level and, geometrically, to a shortening of the U-shaped flux tubes, so that the system loses control of its environment and receives information from a smaller volume.
5. Exclusion

The exclusion postulate states that the cause-effect structure must be definite. The notion is described in terms of a phenomenological set-theoretic picture; I did not understand the Boolean illustrations of the cause-effect structure. The notion of maximal irreducibility can be understood in TGD as maximal connectedness, or at least connectedness of the 3-surface by connecting flux tubes (or, in the weakest sense, of the 4-surface as an analog of a Bohr orbit).

What could a precisely defined cause-effect structure mean in ZEO? The state at the passive boundary of the CD remains fixed during the sequence of SSFRs determining the life cycle (the wake-up period of the self), so that one can say that, classically, the almost deterministic evolution of the space-time surface is implied by the 3-surface at the passive boundary: it acts as the causal agent. The small failure of determinism means that there are also intermediate "agents" slightly affecting the time evolution. They also make memory possible and force ZEO, which solves the basic problem of quantum measurement theory and also allows free will.

What is missing from IIT?

The postulates of IIT are inspired by computationalism and materialistic neuroscience and have no connection to (quantum) physics or biology. The hierarchy of selves is a central notion missing completely from IIT, and this hierarchy is essential for a real understanding of conscious entities. The levels of the hierarchy interact: for instance, the field body (magnetic body), carrying dark matter as large-heff phases, serves as the boss of the biological body carrying ordinary matter. Cognitive hierarchies as hierarchies of extensions of rationals, giving rise to directed entanglement hierarchies, are also something not possible in standard physics.

These hierarchies are also essential for understanding evolution. In particular, classical gravitational and electromagnetic fields give rise to field bodies with very long quantum coherence lengths, even of astrophysical size, and these scales are predicted to be fundamental for understanding life and consciousness in ordinary living matter.

A somewhat surprising prediction of IIT is that ordinary computers need not be conscious. In TGD, computer consciousness is possible only if the quantum coherence time is longer than the clock period, but the contents of consciousness need not correlate with the program. The change of the arrow of time in BSFRs makes possible the analogs of feedback loops at various layers of the self hierarchy and learning by trial and error, which would be a basic aspect of living systems.

Whether ordinary computers could be conscious is an interesting question, and in TGD one ends up with a quantitative criterion for this in terms of the clock frequency (see this). For the Earth's gravitational body, the lower bound for the clock frequency is 67 Hz. For the solar gravitational body, the clock frequency should be above 50 Hz, which is the average EEG frequency; this condition is satisfied by ordinary computers. Does this mean that the users of computers can entangle with them? It has been claimed that a chicken can entangle with a robot whose motion is based on a random number generator, so that the robot seems to take the role of the mother.

See TGD as it is towards end of 2024: part I and TGD as it is towards end of 2024: part II. See also the article About Langlands correspondence in the TGD framework describing the connection between number theoretic and geometric visions of physics.

See also the chapter Questions about IIT.


Thursday, October 17, 2024

A TGD based resolution of the tension between neutrino mass scale deduced from neutrino mixing and from gravitational lensing

I learned about some very interesting findings related to neutrinos (see this). The 3 neutrino families are known to mix, and from various experiments, including those performed in the laboratory, one can deduce estimates for the analog of the CKM matrix describing the mixing. This also allows us to estimate the sum of the neutrino masses.

One can also deduce information about neutrino masses from cosmology. The information comes from gravitational lensing. Gravitational lensing increases with so-called clumpiness, which measures how large the inhomogeneities of the mass density are. If one assumes standard cosmology with cold dark matter, one expects that the larger the average neutrino mass scale is, the larger the effect of the neutrino mass density on the clumpiness of the universe is.

According to the popular article, DESI maps out cosmic structures to determine the expansion rate through an effect known as baryon acoustic oscillations, sound waves that imprinted circular patterns on the very early universe. By tracing those patterns at different points in the universe's history, scientists can track its growth: a kind of cosmic tree ring is in question.

Combining the measurements of clumpiness from the cosmic microwave background and of the expansion rate from DESI (two things that neutrinos affect) makes it possible for scientists to deduce estimates for the sum of the neutrino masses. The upper limit turned out to be unexpectedly small, about 0.07 eV. This is very near to the lower bound for the sum, about 0.06 eV, deduced from neutrino mixing. There are even experiments suggesting an upper limit of 0.05 eV, in conflict with the neutrino mixing data.
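The quoted ~0.06 eV lower bound from mixing data can be checked with a back-of-the-envelope computation (a sketch using standard oscillation splittings and normal mass ordering, which are assumptions on my part, not values quoted in the article):

```python
from math import sqrt

# Mass-squared splittings from neutrino oscillation experiments
# (approximate standard values), in eV^2:
dm2_21 = 7.5e-5   # "solar" splitting
dm2_31 = 2.5e-3   # "atmospheric" splitting

# Minimal case: normal ordering with the lightest neutrino massless, m1 = 0.
m1 = 0.0
m2 = sqrt(dm2_21)        # ~ 0.009 eV
m3 = sqrt(dm2_31)        # ~ 0.050 eV
total = m1 + m2 + m3     # ~ 0.06 eV: the lower bound for the sum
```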

The outcome suggests that something goes wrong with standard cosmology. Could it be that neutrinos do not affect the clumpiness as much as believed? Could neutrinos have been lighter in the early cosmology? Or is the view of how clumpiness is determined entirely wrong? Could the mechanism behind the gravitational lensing be something different from what it is believed to be?

This brings to mind the so-called clumpiness paradox, about which I have written a blog posting (see this). The clumpiness paradox means that the clumpiness depends on the scale in which it is estimated: clumpiness is smaller in long length scales. One proposal is that in long length scales clumpiness is determined to a high degree by the mass density of ultralight axions. The clumps have now been observed also in shorter scales. The strange conclusion is that cold dark matter is colder in short scales. One would expect just the opposite to be true.

The scale dependence of clumpiness suggests a fractal distribution of matter and dark matter. Indeed, in the TGD framework, cosmic strings thickened to monopole flux tubes forming a scale hierarchy would be responsible for the gravitational lensing, and the thickness of the monopole flux tubes would characterize the lensing. The explanation for the large size of the clumps in long scales would be the large size of the Compton length, proportional to the effective Planck constant heff=nh0. In the case of the gravitational Planck constant heff= hgr= GMm/β0, β0 a velocity parameter, assignable to the monopole flux tubes connecting pairs formed by a large mass M and a small mass m, the gravitational Compton length equals Λgr= GM/β0= rs/2β0, where rs is the Schwarzschild radius of M, increasing with the size scale of the structure (note that there is no dependence on m). The larger the scale of the studied astrophysical object, the larger Λgr as the minimal gravitational quantum coherence length is, and the smaller the clumpiness in this scale.
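As a rough numeric illustration of Λgr = GM/β0 = rs/2β0 (in units with c restored, Λgr = GM/(c²β0)): for the Sun with the value β0 = 2⁻¹¹ often used in the TGD literature (an assumption of this sketch, not derived here), the gravitational Compton length comes out in the thousands of kilometers, macroscopic indeed:

```python
import math

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
M_sun = 1.989e30   # kg

# Schwarzschild radius r_s = 2GM/c^2, so GM/c^2 = r_s/2
r_s = 2 * G * M_sun / c**2
print(f"r_s(Sun) = {r_s/1e3:.2f} km")          # about 2.95 km

# Gravitational Compton length Lambda_gr = GM/(c^2 beta0) = r_s/(2 beta0);
# beta0 = 2^-11 for the Sun is an assumed value following the TGD literature
beta0 = 2**-11
Lambda_gr = r_s / (2 * beta0)
print(f"Lambda_gr(Sun) = {Lambda_gr/1e3:.0f} km")  # about 3000 km
```

The key point of the formula is the absence of m: the coherence length is the same for every particle bound to the mass M.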

This would predict that the effect of neutrinos, and also of other particles, on the clumping and gravitational lensing is negligible. Cosmic strings would explain the clumping. The model would also explain why the upper bound for the sum of neutrino masses is inconsistent with the findings from neutrino mixing.

See the article About the Recent TGD Based View Concerning Cosmology and Astrophysics or the chapter with the same title.

For a summary of earlier postings see Latest progress in TGD.

For the lists of articles (most of them published in journals founded by Huping Hu) and books about TGD see this.

Saturday, October 12, 2024

Why is the dark energy density inversely proportional to the surface area of the volume studied?

Sabine Hossenfelder commented in her posting "Surprise Comeback: Dark Energy Could Be Holographic After All" (see this) on the idea that the mysterious dark energy might not be real but an outcome of holography, assignable to the 3-D surface which in holography contains the information determining the dynamics in the interior of the space-time. The comments were inspired by the article "Evolution of perturbations in the model of Tsallis holographic dark energy" (see this) by Astashenok and Tepliakov.

The starting observation is that the dark energy density is in a good approximation found to be proportional to 1/S, where S is the surface area of a large sphere surrounding the region studied. By the way, Sabine makes a little mistake here: she talks about dark energy rather than dark energy density. The reader can check this from the article by Astashenok and Tepliakov. The model of Tsallis has been given up long ago, but the authors present an argument that since dark energy is not an ordinary cosmic fluid, the ordinary perturbation theoretic analysis does not apply.

TGD however suggests a much simpler explanation of the finding. In TGD, dark energy is identifiable as galactic dark matter and consists of the magnetic and volume energy assignable to very long monopole flux tubes with a huge string tension. Neither a galactic dark matter halo nor exotic dark matter particles are needed. The galactic velocity spectrum is correctly predicted from the string tension, which is also predicted.

To see whether TGD can explain the finding that the dark energy density is proportional to 1/S, one must estimate the average density of dark energy in a large cylindrical volume around a long cosmic string. The dark energy E is proportional to the length L of the string. The volume is roughly V=SL, where S is the surface area of the cross section of the cylinder. Therefore the dark energy density satisfies E/V= E/SL ∝ 1/S, just as has been found.
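The scaling argument above is simple enough to verify mechanically. A minimal sketch, assuming only a constant string tension T (an illustrative parameter, not a TGD prediction):

```python
# Toy check of the scaling argument: the dark energy E of a cosmic string is
# proportional to its length L (constant string tension T assumed), while the
# surrounding cylindrical volume is V = S*L. The density E/V = T/S is then
# independent of L and proportional to 1/S.
T = 1.0  # string tension in arbitrary units (illustrative assumption)

def energy_density(S, L):
    E = T * L      # energy proportional to string length
    V = S * L      # cylindrical volume with cross-section area S
    return E / V   # = T/S, independent of L

# Doubling L leaves the density unchanged; doubling S halves it.
assert energy_density(2.0, 10.0) == energy_density(2.0, 20.0)
assert energy_density(4.0, 10.0) == energy_density(2.0, 10.0) / 2
print("E/V = T/S, independent of L: OK")
```

The 1/S law thus follows from nothing more than the one-dimensional character of the energy distribution.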

See the article Some strange astrophysical and cosmological findings from the TGD point of view or the chapter About the recent TGD based view concerning cosmology and astrophysics .

For a summary of earlier postings see Latest progress in TGD.

For the lists of articles (most of them published in journals founded by Huping Hu) and books about TGD see this.

Monday, October 07, 2024

Is it possible to have objective laws of physics?

Daniel Oriti (see this) concludes that there are no objective laws of physics. 40 years ago the views were very different, but the dramatic failure of the superstring approach together with the multiverse catastrophe changed the optimistic opinions. It is of course psychologically much easier to conclude that there are no objective laws than to admit that my generation failed to find them.

The answer to the question depends on what one means by objective reality. If space-time is taken as the objective reality, there is no such thing as objective reality, something existing independently of the observer. In the TGD framework, one can speak only of space-time surfaces in H=M4×CP2 as analogs of Bohr orbits for particles as 3-surfaces, obeying almost deterministic holography forcing the Bohr orbits to be the basic dynamical objects. Zero energy ontology (ZEO) is the new ontology solving the basic paradox of quantum measurement theory. Quantum states are quantum superpositions of these "Bohr orbits".

There are global objective laws: they reduce to the mathematical existence of TGD. H is fixed by the existence of the twistor lift and by number theory-geometry duality (M8-H duality), and the holography= holomorphy principle gives holomorphic 4-surfaces, which are minimal surfaces and extremals of any general coordinate invariant action constructible in terms of the induced geometry. The action makes itself visible only at singularities. Induction fixes the dynamics for fermions: second quantized free spinor fields in H. Fermion pair creation is possible thanks to the 4-D space-time allowing exotic smooth structures as defects of the standard one. The point-like defects define vertices and are also identifiable as (self-)intersections of space-time surfaces. The dimensions D=4 and D=8 for space-time and H are crucial for non-trivial physics.

Space-time surfaces are expressible as roots for pairs (f1,f2) of analytic functions of 4 generalized complex coordinates of H (one is a hypercomplex coordinate with light-like coordinate curves). They form an algebra induced by the arithmetic operations for the fi. This algebra decomposes to a union of number fields with f2 fixed. Space-times are thus generalized numbers: this realizes the geometric Langlands correspondence (see this and this).
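The dimension count behind "roots of pairs of analytic functions" can be illustrated with a toy example: two complex conditions on 4 complex coordinates leave a 2-complex-dimensional, that is 4-real-dimensional, surface. The functions below are illustrative choices of mine, not the actual f1, f2 of any TGD solution:

```python
import cmath

# Toy analytic conditions on 4 generalized complex coordinates (u, xi, w1, w2).
# Their common root variety is parameterized by the two remaining free
# coordinates, so it is 2-complex- i.e. 4-real-dimensional, mimicking a
# space-time surface in 8-D H. These fi are hypothetical examples.
def f1(u, xi, w1, w2):
    return w1 - u * xi

def f2(u, xi, w1, w2):
    return w2 - cmath.exp(u)

def surface_point(u, xi):
    """Return a point of the root surface for given free parameters (u, xi)."""
    w1, w2 = u * xi, cmath.exp(u)
    assert abs(f1(u, xi, w1, w2)) < 1e-12 and abs(f2(u, xi, w1, w2)) < 1e-12
    return (u, xi, w1, w2)

p = surface_point(0.3 + 0.1j, 1.0 - 2.0j)
print("point on the 4-real-dimensional root surface:", p)
```

The two free complex parameters (u, xi) play the role of the generalized holomorphic coordinates on the surface itself.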

The space of space-time surfaces defines a number system, the "world of classical worlds" (WCW), and it exists objectively. Subjective existence means a sequence of quantum jumps between states defined by WCW spinor fields: hopping around in WCW. ZEO allows a realization of conscious memory so that the system learns about physics during all of its subjective time (see this).

For a summary of earlier postings see Latest progress in TGD.

For the lists of articles (most of them published in journals founded by Huping Hu) and books about TGD see this.

Thursday, October 03, 2024

Negative group delay and Zero Energy Ontology

Paul Kirsch sent an interesting link to the article "Experimental evidence that a photon can spend a negative amount of time in an atom cloud" (see this).

The finding is very interesting from the point of view of zero energy ontology (ZEO) defining the ontology of classical and quantum TGD (see this, this, this, this). Could the negative group delay be understood in terms of a time period with a reversed arrow of time spent by the photon around an atom?

  1. Absorption and re-emission by an atom would correspond to two "big" state function reductions (BSFRs). In the first BSFR the photon would "die" by absorption by an atom. The photon would however reincarnate with an opposite arrow of time. The same would happen in the second BSFR, and the photon would reincarnate with the original arrow of time.

    According to the recent view of ZEO, after the second BSFR the photon would emerge geometrically later than it was absorbed in the first BSFR. The photon wave packet would come out as less entropic, that is younger. This effect would be like waking up as a less entropic, and in this sense younger, person after a well-slept night.

  2. Does the group delay measure this effect? If the aging of the wave packet means widening, then this might be the case. A free photon wave packet keeps its shape since it does not disperse. The widening must be of thermodynamic origin and would be due to SSFRs replacing the wave packet gradually with a wider one.
  3. In TGD, the shape-preserving wave packet has as a classical geometric correlate a "massless extremal" (ME) representing a pulse propagating in a precise direction. The shape of the pulse does not change, but "small" state function reductions (SSFRs) would replace the ME with a new one representing in general a wider pulse. This would be dissipation: the ME would age. The pair of BSFRs induced by atomic absorption would lead to a reincarnation as a younger ME. This would be the counterpart for the negative group delay.
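For contrast, the standard quantum mechanical dispersion of a massive particle's wave packet can be quantified; a photon in vacuum has the same group velocity c at all frequencies and keeps its width, which is why any widening must come from elsewhere (in the TGD picture, from SSFRs). This is a textbook Schrödinger-equation illustration, not a model of the ME dynamics:

```python
import math

hbar = 1.055e-34  # J s

# Standard spreading of a Gaussian packet of a massive particle:
# sigma(t) = sigma0 * sqrt(1 + (hbar*t / (2*m*sigma0^2))^2).
# A vacuum photon packet has no such spreading (group velocity c for all
# frequencies), illustrating why the text attributes photon-packet widening
# to thermodynamic effects instead.
def width(t, sigma0, m):
    tau = 2 * m * sigma0**2 / hbar
    return sigma0 * math.sqrt(1 + (t / tau)**2)

m_e = 9.109e-31   # electron mass, kg
s0 = 1e-10        # initial width 0.1 nm
ratio = width(1e-15, s0, m_e) / s0
print(f"electron packet after 1 fs: {ratio:.2f} x initial width")
```

An electron packet of atomic width spreads by roughly a factor of six within a femtosecond, while the free photon packet would not spread at all.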
The finding inspires a tongue-in-cheek consideration related to my personal life. I suffer from bad sleep and wake up continually. BSFR means falling asleep or, in an extreme case, death at some level of the personal self hierarchy. Temporary reversals of the arrow of time in pairs of BSFRs would provide a universal trial and error mechanism in conscious information processing and quantum biology. For instance, homeostasis as a way to stay near quantum criticality would be based on a continual change of the arrow of time. If the temporary deaths indeed provide a way to fight against the second law, they might slow down aging. The personal curse would actually be a blessing?

See the article TGD and Condensed Matter or a chapter with the same title.

For a summary of earlier postings see Latest progress in TGD.

For the lists of articles (most of them published in journals founded by Huping Hu) and books about TGD see this.

Surfaceology and TGD

The inspiration coming from the work of Nima Arkani-Hamed and colleagues concerning the twistor Grassmannian approach provided a strong boost for the development of TGD. I started from the problems of the twistor approach and ended up with a geometrization of the twistor space in terms of sub-manifold geometry with twistor space represented as a 6-surface. Also the twistor space of CP2 played a key role.

This led to rather dramatic results. Most importantly, the twistor lift of TGD is possible only for H=M4× CP2 since only M4 and CP2 allow a twistor space with Kähler structure: TGD is unique. The most recent result is that one can formulate the twistor lift in terms of 6-surfaces of H (rather than 6-surfaces in the product of the twistor spaces of M4 and CP2). These twistor surfaces represent the twistor spaces of M4 and CP2, or rather their generalizations; their intersection would define the space-time surface. Therefore one can formulate the twistor lift without the 12-D product of the twistor spaces of M4 and CP2.

During the last years I have not followed the work of Nima and others since our ways went in very different directions: Nima was ready to give up space-time altogether whereas I wanted to replace it with 4-surfaces. I was also very worried about giving up space-time since the twistor is basically a notion related to a flat 4-D Minkowski space.

However, Quanta Magazine recently published a popular article telling about the recent work of Nima Arkani-Hamed and his collaborators (see this). The title of the article was "Physicists Reveal a Quantum Geometry That Exists Outside of Space and Time". The article discusses the notions of amplituhedron and associahedron, which together with the twistor Grassmann approach led to considerable insights about theories with N=4 supersymmetry. These theories are however rather limited and do not describe physical reality. In the fall of 2022, a Princeton University graduate student named Carolina Figueiredo realized that three types of particles lead to very similar scattering amplitudes. Some kind of universality seems to be involved. This leads to developments which allow one to generalize the approach based on N=4 SUSY.

This approach, called surfaceology, still starts from the QFT picture, which has profound problems. On the other hand, it suggests that the calculational algorithms of QFT lead universally to the same result and are analogous to an iteration of a dynamics defined in a theory space, leading to the same result irrespective of the theory from which one starts: this is understandable since the renormalization of coupling constants means motion in theory space.

How does the surfaceology relate to TGD?

  1. What one wants are the amplitudes, not all possible ways to end up with them. The basic obstacle here is the belief in the path integral approach. In TGD, general coordinate invariance forces holography, which makes the path integral completely unnecessary.
  2. Surfaceology brings strongly to mind TGD. I have talked for almost 47 years about space-times as surfaces without any attention from colleagues (unless one regards the crackpot label and the loss of all support as such). Now I can congratulate myself: the battle that has lasted 47 years has ended in a victory. TGD is a more or less mature theory.

    It did not take many years to realize that space-times must be 4-surfaces in H=M4×CP2, which is forced by both the standard model symmetries including Poincare invariance and by the mathematical existence of the theory. Point-like particles are replaced with 3-surfaces or rather the 4-D analogs of their Bohr orbits which are almost deterministic. These 4-surfaces contain 3-D light-like partonic orbits containing fermion lines. Space-time surfaces can in turn be seen as analogs of Feynman graphs with lines thickened to orbits of particles as 3-surfaces as analogs of Bohr orbits.

  3. In holography=holomorphy vision space-time surfaces are minimal surfaces realized as roots of function pairs (f1,f2) of 4 generalized complex coordinates of H (the hypercomplex coordinate has light-like coordinate curves). The roots of f1 and f2 are 6-D surfaces analogous to twistor spaces of M4 and CP2 and their intersection gives the space-time surface. The condition f2=0 defines a map between the twistor spheres of M4 and CP2. Outside the 3-D light-like partonic orbits appearing as singularities and carrying fermionic lines, these surfaces are extremals of any general coordinate invariant action constructible in terms of the induced geometry. In accordance with quantum criticality, the dynamics is therefore universal.

    Holography=holomorphy vision generalizes ordinary holomorphy, which is the prerequisite of twistorialization. Now light-like 4-D momenta are replaced with 8-momenta which means that the generalized twistorialization applies also to particles massive in 4-D sense.

This indeed strongly resembles what the popular article says about surfaceology: the lines of Feynman diagrams are thickened to surfaces and lines are drawn on the surfaces, which are however not space-time surfaces. Note that also Nima Arkani-Hamed admits that it would be important to have the notion of space-time.

The TGD view is crystallized in the geometric Langlands correspondence, which is realized naturally in TGD and implies a correspondence between the geometric and number theoretic views of TGD.

  1. Space-time surfaces form an algebra decomposing to number fields so that one can multiply, divide, sum and subtract them. The classical solution of the field equations can be written as a root for a pair of analytic functions of 4 generalized complex coordinates of H. By holography= holomorphy vision, space-time surfaces are holomorphic minimal surfaces with singularities to which the holographic data defining scattering amplitudes can be assigned.
  2. What is marvelous is that the minimal surfaces emerge irrespective of the classical action as long as it is general coordinate invariant and constructed in terms of the induced geometry: the action makes itself visible only at the partonic orbits and in the vacuum functional. This corresponds to the mysterious looking finding of Figueiredo.

    There is however a unique action, and it corresponds to the Kähler action for the 6-D generalization of the twistor space as a surface in the product of the twistor spaces of M4 and CP2. These twistor spaces must allow a Kähler structure, and M4 and CP2 are the only spaces for which this is possible. TGD is completely unique. Also the number theoretic vision as the dual of the geometric vision implies uniqueness. A further source of uniqueness is that non-trivial fermionic scattering amplitudes exist only for 4-D space-time surfaces and an 8-D embedding space.

  3. Scattering amplitudes reduce at the fermionic level to n-point functions of a free field theory, expressible using fermionic propagators for free leptonic and quark-like spinor fields in H with arguments restricted to the discrete set of self-intersections of the space-time surfaces and, in the more general case, to intersections of several space-time surfaces. This works only for 4-D space-time surfaces and 8-dimensional H. Also pair creation is possible, made possible by the existence of exotic smooth structures, which are ordinary smooth structures with defects identifiable as the intersection points. Therefore there is a direct correspondence with 4-D homology and the intersection form (see this). One can say that TGD in its recent form provides an exact construction recipe for the scattering amplitudes.
  4. There is no special need to construct scattering amplitudes in terms of twistors although this is possible, since the classical realization of twistorialization is enough and only spin 1/2 fermions are present as fundamental particles. Since all particles are bound states of fundamental fermions propagating along fermion lines associated with the partonic orbits, all amplitudes involve only propagators for free fermions of H. The analogs of twistor diagrams correspond to diagrams whose vertices correspond to the intersections and self-intersections of space-time surfaces.
For the recent view of TGD see this and this. For the geometric Langlands duality in the TGD framework see this.

For a summary of earlier postings see Latest progress in TGD.

For the lists of articles (most of them published in journals founded by Huping Hu) and books about TGD see this.

How does rubbing with a microfiber manage to shatter the "bullet proof" windshield of the Musk Cybertruck?

I learned from Heikki Hirvonen about the Musk Cybertruck windshield, which was claimed to be "bullet proof" but turned out to be not quite so (see this). Even worse, it has been found that the interaction between a microfiber and the material of the windshield creates a specific kind of resonance that shatters the material. This brings to mind opera sopranos shattering wine glasses. One might think that the system considered must be critical so that very small periodic perturbations can induce very large changes if they are of the right kind and have the correct frequency.

1. Why should one worry about sopranos shattering wine glasses?

One might wonder what the point is in building complex new physics scenarios for how sopranos manage to break wine glasses. This has been understood a long time ago.

But is this really the case? We are used to thinking that physics somehow mysteriously transforms from quantum to classical on some scale. Quantum coherence, assumed impossible above atomic scales, would be replaced by classical coherence on long scales. If this is assumed, glass-breaking sopranos cease to look mysterious. This thinking however has no justification but only restates what is a fact. When one gives this thinking up, the imagined self-evidences collapse, and phenomena that seemed only a bit strange become impossible to understand.

In TGD, a new view of space-time comes to the rescue. The space-time surface defines the coherence region in both the classical and the quantum sense. Field bodies make long-scale quantum coherence and, as its correlate, classical coherence possible. The entire scale of the space-time surface corresponds to the scale of classical and quantum coherence (the latter related to the magnetic body). Long-scale quantum coherence accompanies classical coherence.

Classical long-scale coherence has a quantum counterpart and would be related to classical long-range gravitational and electromagnetic fields. The gravitational and electromagnetic Planck constants, whose values can be enormous compared to h, quantify the hypothesis. The windshield effect is just one example of many.

2. Background observations and assumptions

It is good to start with some background observations.

  1. The super strength of the glass could mean that it does not break under the deformations studied. Throwing a piece of rock at it and rubbing with a microfiber do not belong to the class of allowed deformations. So what could be the deformations that do not break the glass?

    Could it be that only deformations where pressure is applied to the windshield have been tested, i.e. an impulse current in the direction of the impulse, but not deformations involving shear, where the direction of the impulse current is perpendicular to the transferred impulse? A second difference is that there is direct contact with the microfiber.

    Rubbing creates a shear. The microfiber is pressed against the surface and pushed horizontally at the same time: both pressure in the normal direction and shear in the direction of the surface are created. For example, in hydrodynamics the very poorly understood generation of vortices at an interface (turbulence) is due to shear. The creation of vortices is forced by the conservation of angular momentum. In TGD based quantum hydrodynamics, this process is essentially a quantum critical process on macroscales (see this).

    Could it be that the strength of the glass, as defined in the way I guessed, was exactly the reason for the breakage? Would the glass be too rigid in this sense, unable to flex, and therefore break?

    Or could the glass be fragile in terms of certain types of deformations that have not been taken into account? Pressure wouldn't create them, but shear could do so. The characteristics of the microfiber could also be important.

3. What kind of model could be imagined for the phenomenon?

The TGD based model for the phenomenon relies on gravitational quantum coherence, predicted to be possible in astrophysical scales, and on possible quantum criticality. The gravitational magnetic bodies of both the Sun and the Earth are assumed to play a key role. The reason is that macroscopic quantum coherence requires very large values of the effective Planck constant. It is assumed that the gravitational Compton frequency of the Sun defines a gravitational quantum coherence scale and sets a lower bound for the frequencies assignable to the acoustic oscillations inducing the instability of the windshield.

One can also consider other mechanisms of macroscopic quantum coherence. Cyclotron frequencies for the endogenous magnetic field of the Earth are in the EEG range and would correspond to energies above the thermal energy; they play a key role in TGD inspired quantum biology and might be involved with the microfibers. This would require a transformation of dark cyclotron radiation to sound waves and a ferroelectret property typical for organic materials. Quantum criticality making possible the generation of large heff phases is involved, and warping deformations possible for planar or nearly planar systems are considered as a possible realization of the quantum criticality.

  1. Could the strength of the glass be defined so that when a weight is placed on the glass plate, it does not develop a dent: this would mean that no curvature is generated. A planar sheet of metal is a good example: it does not break easily.

    However, a flat metal or glass plate (flatness is important!) is very sensitive to the development of warping, which only bends but does not curve the flat surface, so that it remains flat (the curvature tensor vanishes). The fluttering of a metal plate is a good example of this. Another example is a sheet of paper, which is unstable against fluttering. Such time-dependent warpings would decompose to 1-dimensional plane waves propagating along the surface of the metal or glass. They would be very much like transversal sound waves.

    What is important is that warping is a critical phenomenon due to the large number of flat warped surfaces (the warping profile can correspond to any differentiable function). In TGD, criticality involves the development of large heff phases and long-range quantum correlations, which gives strong clues for understanding the situation.

  2. Euler already considered what happens when a weight is placed on a bar bent upwards (Euler buckling) (see this). At a critical weight, a collapse occurs. This is one of the basic applications of catastrophe theory. The critical amplitude of the warping wave would be analogous to the critical weight at which the glass breaks.
  3. One might think that the action principle contains an energy density term that is proportional to the square of the 2-D curvature (see this) for the induced metric and vanishing for warped configurations. There would be an enormous vacuum degeneracy. Stability against deformations generating curvature requires that the coefficient of this term is very large. A lot of energy would be needed to produce a dent. But bending without curving brings in the Trojan horse.

    Action would of course also contain a term proportional to the surface area, which would correspond to the normal tension that tends to oppose the increase of the surface area. For warping, the energy would be only needed to increase the surface area. Could warping waves, possibly created by the rubbing with microfiber, lead to the breakage? Shear should provide the needed momentum and energy resonance should strengthen the warping wave.
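The Euler buckling mentioned above has a well-known critical load, P_cr = π²EI/(KL)², beyond which the straight configuration becomes unstable: the textbook cusp-catastrophe analog invoked in the text. A minimal numeric sketch with illustrative glass-strip dimensions (my own example values, not Cybertruck data):

```python
import math

# Euler's critical buckling load P_cr = pi^2 * E * I / (K*L)^2.
# All numbers below are illustrative assumptions for a glass strip.
E = 70e9            # Young's modulus of glass, Pa (typical value)
b, h = 0.10, 0.005  # rectangular cross section: 10 cm wide, 5 mm thick
I = b * h**3 / 12   # second moment of area, m^4
L = 1.0             # strip length, m
K = 1.0             # effective length factor, both ends pinned

P_cr = math.pi**2 * E * I / (K * L)**2
print(f"critical load: {P_cr:.0f} N")   # roughly 700 N
```

Below P_cr nothing visible happens; at P_cr the system jumps discontinuously to a buckled state, which is the kind of threshold behavior the warping-wave amplitude is here conjectured to exhibit.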

4. What happens when the windshield breaks?
  1. A catastrophe theorist might state that the system is characterized by, for example, a cusp catastrophe. When the critical shear is reached, the system undergoes a sudden transition: the system breaks down.
  2. If one starts from the quantum level, the reduction of quantum coherence comes first to mind. In collapse the quantum coherence length would decrease dramatically from the size of the whole system to the size of the fragments. If the quantum coherence with the magnetic body of the glass surface takes care of the coherence of the glass, then it would have to decrease. In the heff distribution, the average value of heff would decrease.

    This is however only the outcome, not the primary cause. Long-scale quantum coherence and quantum criticality together with energy feed occurring at resonance frequency and increasing the value of heff would be the reasons leading to the limit at which the system collapses.

  3. Why would rubbing with microfiber induce a critical shear leading to the breaking and loss of quantum coherence? Warping waves are a good candidate. The windshield glass would start to shake in the vertical direction. When the amplitude of the warping wave exceeded the critical limit, the result would be collapse and breaking into pieces. Rubbing with microfiber would feed into the system the energy needed to generate heff phases, and this would occur at the quantum criticality associated with the warping waves.
5. Identifying the resonance frequency

The model should involve a frequency resonance corresponding to a wavelength identifiable as a natural length scale of the microfiber and/or the glass. One would expect the flutter frequency to be on the Hertz scale, and the acoustic resonance frequency of the windshield is a good guess. What follows will certainly arouse academic head shaking, but it is based on the fact that in the TGD world the planets and the Sun form a quantum coherent system, whose effects can be seen on Earth at all levels, especially in biology. A second justification was given already in the beginning: our belief that we understand the classical world is based on an illusion about a mysterious transition from quantum to classical.

  1. Microfiber has λ ≈ 1 micrometer as a natural length scale. The 1 eV energy scale of infrared photons would correspond to that, and it can be assumed to be the basic scale. Could photons with this energy transform into bundles of dark photons with a much longer wavelength; these, in turn, would eventually end up via intermediate steps as bundles of ordinary phonons or even as a Bose-Einstein condensate or a coherent state as a quantum analog of a classical state?
  2. Let's start with the Earth's gravitation (see this, this and this). The gravitational Compton length Λgr related to the Earth's gravitational Planck constant is .5 cm (half of the Schwarzschild radius of the Earth), independently of the particle mass, and the associated frequency is fgr= 67 GHz. This frequency is far too high. Furthermore, the Earth's gravitation is not decisive here because the warping is not in the vertical direction but closer to the tangential direction. In any case, the Earth's gravitation is not enough.
  3. One must follow the example of Icarus and hope for better luck. The Sun's gravitational Planck constant gives a frequency of fgr=50 Hz, which is the average EEG frequency and an important resonance frequency of the EEG, central in communications between the brain and its magnetic body (see this and this). This is a reasonable frequency. The corresponding wavelength Λ= c/fgr is of the order of the Earth's radius.

    Needless to emphasize, this makes no sense unless one accepts the astrophysical quantum coherence assigned with gravitation and that the oscillation takes place on the magnetic body of the glass plate on the scale of the Earth's radius.

  4. A strong objection is that fgr does not depend at all on the geometry of the glassy system, in particular on the size scale of the windshield. A reasonable expectation is that the model should apply also to shattering of wine glasses.

    A more general assumption is that the allowed frequencies are above the threshold defined by fgr= 50 Hz defining the gravitational quantum period. At frequencies above fgr gravitational quantum coherence would make itself visible. However, the frequencies coming as harmonics of fgr could be especially interesting. This assumption is analogous to that appearing in the proposal for how gravitational quantum coherence could become important in classical computers (see this). In any case, the assumption f≥fgr is rather strong and gives lower bounds for the quantal resonance frequencies.
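The two gravitational Compton scales invoked above can be reproduced numerically from Λgr = GM/(c²β0) = rs/(2β0). The β0 values are assumptions following the choices quoted in the text (β0 = 1 for the Earth, β0 = 2⁻¹¹ for the Sun); note that for the Sun c/Λgr comes out near 100 Hz, the same order of magnitude as the quoted fgr = 50 Hz:

```python
G = 6.674e-11   # m^3 kg^-1 s^-2
c = 2.998e8     # m/s

def lambda_gr(M, beta0):
    # gravitational Compton length GM/(c^2 * beta0) = r_s/(2*beta0)
    return G * M / (c**2 * beta0)

M_earth, M_sun = 5.972e24, 1.989e30

L_E = lambda_gr(M_earth, 1.0)      # beta0 = 1 assumed for the Earth
print(f"Earth: Lambda_gr = {L_E*100:.2f} cm, c/Lambda_gr = {c/L_E/1e9:.0f} GHz")

L_S = lambda_gr(M_sun, 2**-11)     # beta0 = 2^-11 assumed for the Sun
print(f"Sun:   Lambda_gr = {L_S/1e3:.0f} km, c/Lambda_gr = {c/L_S:.0f} Hz")
```

The Earth line reproduces the .5 cm and ~67 GHz figures of the text; the Sun line gives Λgr ≈ 3000 km, roughly half the Earth's radius, as stated later in the discussion of ℏgr(Sun,proton).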

Could the resonance (basically acoustic warping wave) correspond to a frequency above fgr or be identifiable as the frequency of dark photons generated at the magnetic body of the Sun?
  1. The phonons of the acoustic wave would couple to the dark photons, produced by shear, at the magnetic body. This is where the microfiber would take the role of a Trojan horse. Note that in a liquid flow, for which shear occurs near boundaries, the conservation of angular momentum forces the production of vortices, which in TGD based hydrodynamics would be associated with dark monopole flux tubes. Also now, Z0 magnetic vortices could be created.
  2. The frequencies above fgr would be the same, but the energy of a dark photon would correspond to the energy of many "warping phonons": a Bose-Einstein condensate or coherent state of phonons would be created. Assuming a proton-Earth pair, one has ℏgr(Earth,proton) proportional to mpME. This gives a 1 eV energy scale, which corresponds to a 1 micrometer wavelength for ordinary photons.

    The critical reader has probably noticed that the magnetic bodies of both the Sun and the Earth are included, characterized by ℏgr(Sun,proton) and ℏgr(Earth,proton) respectively. The gravitational Compton length Λgr(Sun,proton) of the Sun is RE/2, which is the size scale for the Earth's magnetic body. Also ℏgr(Earth,proton) is required. Could one think that dark photons for which heff= hgr(Sun,proton) are created first, and that these break up into bunches of dark photons with heff= hgr(Earth,proton)? The frequency would remain the same. These in turn break up into bunches of "warping phonons" with the same frequency.

  3. If the propagation speed of the warping wave is roughly estimated to be the sound velocity in glass, that is v=4540 m/s, then the wavelength would be Λ = v/f= 90.8 m if one assumes the smallest possible value f=fgr= 50 Hz. This wavelength is far too long as compared to the dimensions of the windshield: v should be 2 orders of magnitude smaller, coincidentally(?) of the same order as the conduction velocity of the nerve impulse. Note also that a micrometer is the scale of a cell nucleus. However, fgr=50 Hz defines only a lower bound for the quantum resonance frequency. A resonance frequency dictated by the geometry is in question, and it roughly scales like the inverse size of the system.

    In the case of a wine glass, one expects a frequency scale which is two orders of magnitude larger, in the kHz range. The E note at the hearing threshold corresponds to 20.6 Hz and, according to a net source, some octave of E is a reasonable estimate for the resonance frequency of a wine glass. The resonance frequency would be the k:th octave of this frequency; assuming that λ is of order .1 m, the 7:th octave, of order kHz, is a reasonable guess. In the case of a windshield one would expect λ to be 5 to 10 times longer, so that the frequency could be 3 or 4 octaves lower.
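Some of the order-of-magnitude estimates in the list above can be checked with a few lines. A sketch only: the sound velocity 4540 m/s and the E-note frequency 20.6 Hz are the values quoted in the text, and hc ≈ 1240 eV·nm is the standard photon energy-wavelength conversion.

```python
# Sketch of the order-of-magnitude estimates in the list above
# (v_glass and the E-note frequency are the values quoted in the text).
v_glass = 4540.0            # sound velocity in glass, m/s
f_gr = 50.0                 # Hz, the lower-bound frequency
lam = v_glass / f_gr
print(f"warping wavelength at f_gr: {lam:.1f} m")   # ~90.8 m

# Photon energy check for item 2: E = hc/lambda at lambda = 1 micrometer.
hc_eV_nm = 1239.84          # eV * nm
print(f"photon energy at 1 um: {hc_eV_nm / 1000.0:.2f} eV")  # ~1.24 eV

# Octaves of the E note at the hearing threshold, 20.6 Hz:
E0 = 20.6
octaves = [E0 * 2**k for k in range(1, 8)]
print(octaves)              # 7th octave = 2636.8 Hz, the kHz scale
```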

6. Summary

Microfiber rubbing would induce warping waves, whose amplitude would increase in resonance and lead to shattering.

  1. First, dark photons (piezoelectricity) would be generated at the solar magnetic body and then decay to bunches of dark photons at the magnetic body of the Earth with energy of order eV, corresponding to the scale of the basic structure of the microfiber. Their frequency would be above fgr=50 Hz, corresponding to the gravitational Compton wavelength of the Sun, which is of the order of the Earth's radius/2. The dependence of the resonance frequency on the geometry requires that fgr defines only a lower bound for f, interpreted as a quantum coherence period.
  2. heff= hgr(Earth,proton) photons would in turn decay to a "warping phonon" beam with frequency above fgr=50 Hz. The phonons would form a coherent state or BE condensate. This could lead to an acoustic laser effect and amplification, and the result would be resonance and catastrophe, analogous to Euler buckling, when the warping amplitude becomes too large. Here quantum criticality, which is naturally associated with warping waves, would be essential: it would make the Trojan horse effect possible.

See the article How a rubbing with a microfiber manages to shatter the "bullet proof" windshield of Musk Cybertruck? or the chapter TGD and Condensed Matter.

For a summary of earlier postings see Latest progress in TGD.

For the lists of articles (most of them published in journals founded by Huping Hu) and books about TGD see this.

Wednesday, October 02, 2024

Space-time surfaces as numbers: what could this mean from the point of view of metamathematics?

These comments were inspired by the links to Bruno Marchal's posts by Jayaram Bista (see this). The comments compare the world views behind two Platonisms: the Platonism based on integers or rationals and realized by the Turing machine as a Universal Computer, and the quantum Platonism of TGD. Marchal also talks about Digital Mechanism and claims that it is not necessary to assume a fixed physical universe "out there". Marchal also speaks of mathematical theology and claims that quantum theory and even consciousness reduce to Digital Mechanism.

In the TGD Universe, the fact that space-time surfaces form an algebra with respect to multiplication and that this algebra decomposes to a union of number fields means a dramatic revision of what computation means. The standard view of computation as a construction of arithmetic functions is replaced with a physical picture in which space-times as 4-surfaces have an interpretation as almost deterministic computations. Space-time surfaces allow arithmetic operations, and also the counterparts of functional composition and iteration are well-defined.

Replacement of the static universe with a Universe continuously recreating itself

It seems to me that the problems of computationalism emerge from a single ontological assumption: the "system", be it Universe in some sense or God, is fixed. In quantum TGD this is not the case. The Quantum Universe, which could be seen as a counterpart for God, is continually recreating itself and this means the unavoidable increase of algebraic complexity since the dimensions associated with extensions of rationals defining space-time regions unavoidably increase. This in turn implies evolution.

In zero energy ontology (ZEO), "small" state function reductions (SSFRs) generalize the Zeno effect, which in standard QM has no effect on the physical state. In TGD, SSFRs do change the state, and their sequence gives rise to conscious entities, selves. This makes memory possible: the outcome of an SSFR carries classical information about the initial state and also about the transition. Therefore the Universe remembers and learns consciously: one can talk about Akashic records.

This dynamical view of the Universe recreating itself and becoming more intelligent by learning about what it was before the previous SSFR is very different from the view of the Universe as a Turing machine or Universal Computer. These are static notions (Universe "out there") and the computation is based on integers. In the TGD view one obtains an entire hierarchy of computationalisms based on the hierarchy of extensions of rationals. Even transcendental extensions can be considered. The TGD Universe as a counterpart of the Turing machine is also conscious and has free will.

A generalization of number

Also the notion of number generalizes from integers to space-time surfaces. Space-time surfaces can be multiplied and summed and form an algebra. This algebra decomposes to a union of number fields with product, division, sum and subtraction. One can identify space-time surfaces forming analogs of hierarchies of algebraic integers, algebraic rationals, etc., so that the mathematics performed by Quantum Platonia is considerably more complex than counting with 5+5 fingers!

These structures are defined by the corresponding structures for function algebras and fields defined in terms of analytic functions of 8 generalized complex coordinates of H=M4×CP2. One of the coordinates is a hypercomplex coordinate with light-like coordinate curves.

  1. In TGD space-time surfaces are numbers. Their dynamics is almost deterministic (at singularities the determinism fails and this forces us to take space-time surfaces instead of 3-surfaces as basic objects). The space-time surface as an almost deterministic time evolution is analogous to a proof of a theorem. The assumptions correspond to the initial state 3-surface and the outcome of the theorem to the final 3-surface. A second interpretation is as an analog of a deterministic computer program. A space-time surface as a proof of a theorem is analogous to its own Gödel number as a generalized number.
  2. Cognition always requires a discretization and the space of space-time surfaces ("world of classical worlds", WCW) allows a hierarchy of discretizations. The Taylor coefficients of the two analytic functions defining the space-time surface belong to some extension of rationals, and these extensions form a hierarchy. Therefore a given space-time surface corresponds to a discrete set of integers/rationals in an extension, so that also WCW is discretized. For polynomials and rational functions this set is discrete. At the level of the space-time surface an analogous discretization in terms of an extension of rationals takes place.
  3. The Gödel number for a given theorem as an almost deterministic time evolution of a 3-surface would be parametrized by the Taylor coefficients in a given extension of rationals. Polynomials are the simplest analytic functions and irreducible polynomials define polynomial primes having no decomposition to polynomials of a lower degree. They might be seen as counterparts of axioms.
  4. One can form analogs of integers as products of polynomials inducing products of space-time surfaces. The space-time surfaces are unions of the space-time surfaces defined by the factors, but an important point is that they have a discrete set of intersection points. Fermionic n-point functions defining scattering amplitudes are defined in terms of these intersection points and give a quantum physical realization carrying information about the quantum superpositions of space-time surfaces as quantum theorems.
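The arithmetic of item 4 can be mimicked with a toy computation, in which ordinary one-variable polynomials stand in for the pairs of analytic functions defining space-time surfaces and root sets stand in for the surfaces (an analogy only, not the actual 4-surface construction): the root set of a product of polynomials is the union of the root sets of its factors.

```python
# Toy model: a "surface" is the root set of a polynomial; the product of
# polynomials corresponds to the union of the root sets of the factors.

def poly_mul(a, b):
    """Multiply coefficient lists (lowest degree first) by convolution."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def poly_eval(coeffs, x):
    return sum(c * x**k for k, c in enumerate(coeffs))

f1 = [2.0, -3.0, 1.0]   # (x-1)(x-2) = x^2 - 3x + 2, roots {1, 2}
f2 = [-3.0, 1.0]        # x - 3, root {3}
prod = poly_mul(f1, f2)

# Every root of a factor is a root of the product:
for r in (1.0, 2.0, 3.0):
    assert abs(poly_eval(prod, r)) < 1e-9
print(prod)  # coefficients of (x-1)(x-2)(x-3) = x^3 - 6x^2 + 11x - 6
```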
Could space-time surfaces as generalized integers replace ordinary integers in computationalism?

It is interesting to play with the idea that space-time surfaces as numbers, in particular integers, could define counterparts of integers in ordinary computationalism and metamathematics.

Adeles and Gödel numbering

Adeles in TGD sense inspire another interesting development generalizing the Gödelian view of metamathematics.

  1. p-Adic number fields are labelled by primes, and their extensions induce finite fields. One can organize the p-adic number fields into an adele and the same applies to their extensions, so that one has an infinite hierarchy of algebraic extensions of the rational adele. TGD brings something new to this picture.
  2. Two p-adic number fields, whose elements are power series in powers of p1 resp. p2 with coefficients smaller than p1 resp. p2, have common elements for which the expansions are in powers of the integers n(k1,k2)= p1^k1×p2^k2, k1>0, k2>0. This generalizes to the intersection of p1,p2,..., pn. One can decompose adeles to a union of p-adic number fields glued together along these kinds of subsets. This decomposition is natural in the description of interactions between p-adic sectors of adeles. Interactions are localized to these intersections.
  3. Mathematical cognition would be based on p-adic numbers. Could one think that ordinary integers should be replaced with the adelic integers for which the pi:th factor would consist of p-adic integers of type pi?

    These integers are not well-ordered, so that one cannot well-order theorems/programs/etc. as in Gödel numbering.

    The number of p-adic integers is much larger than that of natural numbers, since the pinary expansion can contain an infinite number of terms; one can map p-adic integers to real numbers by what I call canonical identification. Besides this one has the fusion of the various p-adic number fields.
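The canonical identification mentioned above is not defined in this post; in TGD texts it is usually the digit-reversal map Σ aₙ pⁿ → Σ aₙ p⁻ⁿ. A minimal sketch under that assumption, applied to finite p-adic expansions:

```python
# Sketch of canonical identification I: sum a_n p^n -> sum a_n p^(-n),
# assuming the digit-reversal definition used in TGD texts.

def canonical_identification(n, p):
    """Map a non-negative integer, read as a p-adic expansion, to a real."""
    x, power = 0.0, 1.0
    while n > 0:
        x += (n % p) * power   # digit a_k times p^(-k)
        power /= p
        n //= p
    return x

# 5 = 1 + 0*2 + 1*4 in base 2 maps to 1 + 0/2 + 1/4 = 1.25
print(canonical_identification(5, 2))   # 1.25
# The map is continuous p-adically but not order-preserving, consistent
# with the remark above that p-adic integers are not well-ordered.
```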

An interesting question is how this changes the Gödelian views about metamathematics.

Numbering of theorems by space-time surfaces?

What might be the counterpart for the possibility to represent theorems as integers deduced using logic and for the Gödel numbering for theorems by integers?

  1. In TGD space-time surfaces are numbers. Their dynamics is almost deterministic (at singularities the determinism fails and this forces us to take 4-D space-time surfaces instead of 3-surfaces as basic objects). The space-time surface as an almost deterministic time evolution is analogous to a proof of a theorem. The assumptions correspond to the initial state 3-surface and the outcome of the theorem to the final 3-surface. A second interpretation is as an analog of a deterministic computer program. A third interpretation is as a biological function. A space-time surface as a proof of a theorem is analogous to its own Gödel number, but now as a generalized number. One can define the notions of prime, integer, rational and transcendental for the space-time surfaces.

    The counterparts of primes, determined by pairs of irreducible polynomials, could be seen as axioms. The product operation for space-time surfaces generates unions of space-time surfaces with a discrete set of intersection points, which appear as arguments of fermionic n-point functions allowing one to define fermionic scattering amplitudes. Also other arithmetic operations are possible.

    Also functional composition, essential in computationalism, is possible. One can take any analytic function h(z) of a complex coordinate z and form a functional composite h(f1(...)) or h(f2(...)). One can also iterate this process, which would make it possible to realize recursion. This iteration also leads to fractals.

  2. Cognition always requires a discretization and the space of space-time surfaces ("world of classical worlds", WCW) allows a hierarchy of discretizations. The Taylor coefficients of the two analytic functions f1,f2 defining the space-time surface belong to some extension E of rationals, and these extensions form a hierarchy. Therefore a given space-time surface corresponds to a discrete set of integers/rationals in an extension of rationals, so that also WCW is discretized for a given E. For polynomials and rational functions this set is discrete. At the level of the space-time surface an analogous discretization in terms of E takes place.
  3. The Gödel number for a given theorem as an almost deterministic time evolution of a 3-surface would be parametrized by the Taylor coefficients in a given extension of rationals. Polynomials are the simplest analytic functions and irreducible polynomials define polynomial primes having no decomposition to polynomials of a lower degree. Polynomial primes might be seen as counterparts of axioms. General analytic functions are analogous to transcendentals.
  4. One can form analogs of integers as products of polynomials inducing products of space-time surfaces as their roots. The space-time surfaces are unions of the space-time surfaces defined by the factors, but an important point is that they have a discrete set of intersection points. Fermionic n-point functions defining scattering amplitudes are defined in terms of these intersection points and give a quantum physical realization carrying information about the quantum superpositions of space-time surfaces as quantum theorems.
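The role of functional composition and iteration in item 1 can be illustrated by the textbook example of iterating an analytic map, z → z² + c (a generic sketch, not TGD-specific): repeated composition realizes recursion, and the boundary between bounded and escaping orbits is a fractal Julia set.

```python
# Sketch: iteration of an analytic map h(z) = z^2 + c as the simplest
# example of functional composition and recursion; the boundary between
# bounded and escaping orbits of this iteration is a fractal (Julia set).

def iterate(h, z, n):
    """Apply h to z n times, i.e. form the n-fold composite h(h(...h(z)...))."""
    for _ in range(n):
        z = h(z)
        if abs(z) > 2.0:   # escaped for sure; stop before overflow
            break
    return z

h_inside = lambda z: z * z - 0.5   # c = -0.5: the orbit of 0 stays bounded
h_outside = lambda z: z * z + 1.0  # c = 1: the orbit of 0 escapes

print(abs(iterate(h_inside, 0j, 100)) < 2.0)   # True
print(abs(iterate(h_outside, 0j, 100)) < 2.0)  # False
```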
See the articles TGD as it is towards end of 2024: part I, TGD as it is towards end of 2024: part II, and About Langlands correspondence in the TGD framework.

For a summary of earlier postings see Latest progress in TGD.

For the lists of articles (most of them published in journals founded by Huping Hu) and books about TGD see this.