https://matpitka.blogspot.com/2021/11/

Thursday, November 18, 2021

Superionic ice, possibly existing inside some planets, discovered

A superionic ice-like phase of water at high temperature and pressure (20 gigapascals, much less than the expected pressure of more than 50 gigapascals) has been discovered. See the popular article and the research article.

In this phase the bonds between hydrogen atoms and oxygen ions are broken, and the ionized hydrogen atoms form a fluid, a kind of proton ocean in which the oxygen lattice floats.

In the TGD framework dark proton sequences with effective Planck constant heff>h at monopole magnetic flux tubes play a key role in quantum biology. Dark DNA codons would be 3-proton triplets at monopole flux tubes parallel to DNA strands and would give rise to a fundamental realization of the genetic code.

One can wonder whether the protons of this superionic phase could be dark in the TGD sense and reside in monopole flux tubes. Could they form a superfluid-like or superconductor-like phase by a universal mechanism which I call Galois confinement, which requires that the momenta of the dark protons are algebraic integers while the total momenta of their composites are ordinary integers in suitable units (periodic boundary conditions): see this and this.

It is conjectured that this kind of phase could reside in the interiors of Neptune and Uranus, perhaps even deep inside the Earth. The TGD based view about superconductivity leads to a rather eyebrow-raising question. Could the vanishing of the large scale magnetic fields of planets like Venus and Mars be due to the TGD variant of the Meissner effect, and could these planet interiors be superconductors in the TGD sense (see this)?

Could superionic ice appear in the interior of Earth? Could one consider the following scenario?

Primordial Earth had a vanishing magnetic field by the Meissner effect caused by superionic ice. Part of the superconducting superionic water melted at lower temperature and pressure, formed ordinary water, and gave rise to underground oceans. Superconductivity was lost in the Earth scale, but the monopole flux based magnetic field and the ordinary magnetic field induced by the currents that it generated remained and did not cancel each other anymore. In the transition increasing the radius of Earth by a factor of 2 during the Cambrian explosion the water in these oceans burst to the surface of Earth.

Earthquakes that should not occur

There is an interesting finding, which seems to relate to superionic ice. It has been discovered that there are earthquakes much deeper in the interior of Earth than expected (see this). These earthquakes occur in the transition zone between the upper and lower mantle (the depth range 410-620 km) and even below it (750 km). The pressure range is 20-25 GPa. The temperature at the base of the transition zone is estimated to be about 1900 K (see this). This parameter range inspires the question whether superionic ice could emerge at the base of the transition zone and whether the appearance of hydrogen as a liquid in pores could make possible the earthquakes below the transition zone, just as the presence of ordinary liquid in pores is believed to make them possible above the transition zone.

In the crust, above a depth of about 20 km, the rocks are cold and brittle and prone to breaking, and most earthquakes occur in this region. Deeper down the matter is hotter and the pressure higher, so the rocks deform under pressure rather than break.

Around a depth of 400 km, just above the transition zone, the upper mantle rock consists of olivine, which is brittle. In the transition zone olivine is believed to transform to wadsleyite and at greater depths to ringwoodite. At 680 km, where the upper mantle ends, ringwoodite would transform to bridgmanite and periclase. The higher pressure phases are analogous to graphite, which deforms easily under pressure and does not break, whereas olivine is analogous to diamond and is brittle.

One can understand the earthquakes down to 400 km near the upper boundary of the transition zone in terms of the model in which water in the pores of the upper mantle is pushed out by pressure, which leads to breaking. Below this depth water is believed to be totally squeezed out from the pores, so that this mechanism does not work. The deepest reported earthquake occurred at a depth of 750 km and looks mysterious. There are several proposals for its origin.

The area of the Bonin islands is a subduction zone, and it has been proposed that the boundary between the upper and lower mantle is at a larger depth than thought. The cold Earth crust could allow a lower temperature so that the matter would remain brittle since the transition to high pressure forms of rock would not occur. Another proposal is that the region considered is not homogeneous and different forms of rock are present. Even a direct transition of olivine to ringwoodite is possible, and it has been suggested that this could make the earthquakes possible.

Could there be a connection between superionic ice and earthquakes?

TGD allows us to consider the situation from a new perspective by bringing in the notion of magnetic flux tubes carrying dark matter. Also the zero energy ontology (ZEO) might be highly relevant. The following represents innocent and naive questions of a layman at the general level.

  1. ZEO inspires the proposal that earthquakes correspond to "big" state function reductions (BSFRs) in which the arrow of time at the magnetic body of the system changes. This would explain the generation of ELF radiation before the earthquake although one would expect it after the earthquake (see this).

    The BSFRs would occur at quantum criticality and the question is what this quantum criticality corresponds to. Could the BSFR correspond to the occurrence of a phase transition in which the superionic ice becomes ordinary water? If this is the case, the transition zone, and also a region below it, would be near quantum criticality and prone to earthquakes.

  2. The dark magnetic flux tubes are 1-D objects and possess the Hagedorn temperature TH as a limiting temperature. The heat capacity increases without limit as TH is approached. Could a considerable part of the thermal energy go to the flux tube degrees of freedom so that the temperature of the ordinary matter would remain lower than expected and the material could remain in the brittle olivine form?
  3. Could the energy liberated in the earthquake correspond to the dark magnetic energy assignable to the flux tubes (for a large enough value of heff assignable to gravitational magnetic flux tubes) rather than to the elastic energy of the rock material? Could the liberated energy be dark energy liberated as heff decreases and the flux tubes suddenly shorten? Could this correspond to a phase transition in which superionic ice transforms to an ordinary phase of water?
One can also ask more concrete questions.
  1. Suppose that water below the transition zone (P > 20 GPa and T > 1900 K) can exist as superionic ice containing hydrogen ions in liquid form. Could the high pressure force the superionic liquid out from the pores and induce the breaking?
  2. In the range 350-655 km, the temperature varies in the range 1700-1900 K (see this). The temperature at the top of the transition zone would be slightly above 1700 K. Could regions of superionic ice appear already at 1700 K, which is below T=2000 K?
  3. Could the transition zone be at criticality against the phase transition to superionic water? This idea would conform with the proposal that the region in question is not homogeneous.

See the article Updated version of Expanding Earth model or the chapter Expanding Earth Model and Pre-Cambrian Evolution of Continents, Climate, and Life.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Mott insulators learn like living matter

Researchers at Rutgers University have found that quantum materials, in this case Mott insulators, are able to learn very much like living matter (see this). The conductivity of the quantum material represented the behavior, and the sensory input was represented by external stimuli like oxygen, ozone and light.

The finding was that the conductivity depends on these stimuli and that the system mimics non-associative learning. Non-associative learning does not involve pairing of stimuli but habituation or sensitization to a single stimulus.

I have already earlier (see this) briefly considered transition metals, Mott insulators, and antiferromagnets from the point of view of TGD inspired theory of high Tc superconductivity.

  1. By looking at Wikipedia (see this), one finds that Mott insulators are transition metal oxides such as NiO. Transition metals, such as Ni, can have unpaired valence electrons since they can appear in electronic configurations [Ar] 3d⁸ 4s² or [Ar] 3d⁹ 4s¹. This should make transition metals and their oxides conductors. They are not, since they seem to somehow develop an energy gap between states in the same valence band, making them insulators.
  2. Mott developed a model for NiO as an insulator: the expected conduction was based on the transition for neighboring Ni²⁺O²⁻ molecules

    (Ni²⁺O²⁻)₂ → Ni³⁺O²⁻ + Ni¹⁺O²⁻.

    In the latter configuration, the number of valence electrons of Ni is odd for both neighbors.

  3. The formation of the gap can be understood as a competition between the repulsive Coulomb potential U between 3d electrons and the transfer integral t of 3d electrons between neighboring atoms assignable to the transition. The total energy difference between the two states is E=U-2zt, where z is the number of neighboring atoms. A large value of U leads to the formation of a gap implying the insulator property (see the numeric sketch after this list).
  4. Also antiferromagnetic ordering is necessary for the description of Mott insulators. Even this is not enough, and the rest, which is not so well understood, is colloquially called mottism. The features of Mott insulators that require mottism are listed in the Wikipedia article. They include the vanishing of the single particle Green function along a connected surface in the first Brillouin zone and the presence of a charge 2e boson at low energies.
  5. The description of both Mott insulators and high Tc superconductors involves antiferromagnetism, and Mott insulators exhibit extraordinary phenomena such as high Tc superconductivity and so-called colossal magnetoresistance, thought to be due to the interaction between the charge and spin of conduction electrons.
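The criterion E = U - 2zt of point 3 invites a tiny numerical illustration. The following sketch uses hypothetical values of U, t and z (not taken from the article) merely to show how the sign of E separates insulating from metallic behavior:

```python
def mott_gap(U, z, t):
    """Energy cost E = U - 2*z*t of the charge-transfer configuration:
    E > 0: the transfer costs energy, so a gap forms (Mott insulator);
    E < 0: the transfer is favorable (metallic behavior)."""
    return U - 2 * z * t

# Hypothetical values in eV, chosen only to illustrate the competition
# between the Coulomb repulsion U and the transfer integral t.
for U in (8.0, 2.0):
    E = mott_gap(U, z=6, t=0.3)  # z = 6 neighbors as in a simple cubic lattice
    print(f"U = {U} eV -> E = {E:.1f} eV ({'insulator' if E > 0 else 'metal'})")
```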
In the TGD framework, the description of high Tc superconductors (see this, this and this) involves pairs of monopole flux tubes with opposite directions of monopole magnetic flux, not possible in Maxwellian electrodynamics. The members of Cooper pairs, which are dark in the TGD sense having an effective Planck constant heff≥ h, reside at the monopole flux tubes. The Cooper pairs are present already above Tc but the flux tubes are short and closed so that the supercurrent flows only in short scales. At Tc long flux tubes are formed by reconnection.
  1. Dark valence electrons could help to understand Mott insulators. Transition metals are known for a strange effect in which the valence electrons seem to disappear (see this, this, and this). The TGD proposal is that the electrons become dark in the TGD sense.
  2. It has become clear that dark electrons can appear only in bound states for which the sum of the momenta, which are algebraic integers in the extension of rationals with dimension n=heff/h0, must be a Galois singlet: one has Galois confinement. This implies that the total momentum is an ordinary integer, which guarantees periodic boundary conditions (see this and this).

    Therefore free dark electrons are not allowed, and Cooper pairs, and possibly also states formed by a larger number of electrons, say four as has been found (see this), are possible as Galois singlets. In the TGD inspired quantum biology dark proton triplets realize genetic codons, and genes could correspond to N-codons as Galois confined states of 3N dark protons (see this).

  3. As a rule, single particle energies increase with increasing heff, and the thermal energy feed could increase the effective Planck constant of an unpaired valence electron of the Mott insulator from h to heff=nh0>h so that it would become dark in the TGD sense. Here n denotes the dimension of the extension of rationals assignable to the space-time region. The natural assumption is that Galois confinement forces the Cooper pairing of unpaired electrons of neighboring atoms.
  4. Above Tc, the flux tubes associated with Cooper pairs would be too short for large scale superconductivity so that one would have a conductor or a Mott insulator. Under certain conditions involving low enough temperature, a supraflow in long scales would become possible by the mechanism described above. The colossal magnetoresistance could involve a transfer of electrons as Cooper pairs along the magnetic flux tubes of the external magnetic field, which would be too short to give rise to conductivity, or even superconductivity, in long scales. External magnetic fields could also induce dark ferromagnetism as the formation of dark flux tubes.
Dark electrons, protons and ions residing at the magnetic flux tubes of the "magnetic body" (MB) of the system play a key role in the TGD based quantum biology and are essential for learning as self-organization. heff serves as a measure for the number theoretical complexity and therefore the "intelligence" of the system. The MB naturally acts as a "boss".

Also now the MB of the Mott insulator could play a key role: the MB with heff>h would be the "boss" and would learn and induce changes in the behavior of the ordinary matter, the "biological body" (BB). Non-associative learning involves adaptation and sensitization, and it would be the MB that adapts or sensitizes. The TGD view of the neuron proposes a rather detailed model for the communication between the BB and MB (see this).

See the article TGD and condensed matter or the chapter with the same title.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Wednesday, November 17, 2021

About TGD view of neuron

The realization that saltation as conduction over the myelinated portions of the axon is still a poorly understood phenomenon inspired a careful reanalysis of the earlier TGD inspired visions of nerve pulse conduction, EEG and the brain, based on the new view about space-time, the notion of the magnetic body carrying heff>h phases behaving like dark matter, and the zero energy ontology (ZEO) based quantum measurement theory extending to a theory of consciousness.

The TGD view replaces the nerve pulse with a wave assignable to a generalized Josephson junction formed by the lipid layers of the cell membrane, for which the Josephson frequency fc is replaced by the sum fc+Δ Ec, where Δ Ec corresponds to the difference between the cyclotron frequencies for transversal flux tubes at the two sides of the axon. What propagates is a deviation of the membrane potential below the critical value for the generation of an action potential. There would be no action potential in the myelinated portions of the axon; it would be generated only in the non-myelinated portions of length about 1 μm, giving rise to chemical effects and also communicating a signal to the magnetic body if the notion of a generalized Josephson junction is accepted.

An interesting challenge for the model is the discovery that the density of voltage gated ionic channels in the dendrites of neurons is considerably lower for humans than for other mammals. The general model suggests that the spatiotemporal patterns of Josephson radiation emitted by segments between nearby ionic channels or pumps define analogs of sentences of language, with the nerve pulse as a period analogous to the stop codon of DNA. If so, these sentences would be longer for humans, which could relate to the emergence of the human language capacity.

See the article About TGD view of neuron or the chapter with the same title.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Friday, November 12, 2021

About systems that apparently defy Newton's third law

Gary Ehlenber sent a very interesting Quanta Magazine link telling about the work of Vincenzo Vitelli (see this).
  1. The topic is extremely interesting, but the popular article produces a lot of confusion by introducing misleading metaphors for Newton's law of reciprocity, which states only that the total conserved quantities of an N-particle system are conserved. If conservation fails in a 2-particle system, there must be a third system present: the magnetic body (MB) in the TGD framework.
  2. Also the claim that energy is not conserved is simply wrong. A more precise statement is that thermodynamic equilibria are not reached in some systems, and this together with the existence of non-equilibrium systems suggests that the arrow of time is not always the same, as in zero energy ontology (ZEO). If one accepts the notions of MB and ZEO, there is no need to give up conservation laws.
  3. The importance of singular points - I would call them critical or quantum critical points - is also emphasized. At these points the conservation laws would be violated. The TGD interpretation is different: at these points the transfer of conserved quantities between the MB and the system considered becomes important.
  4. Polariton-exciton systems are mentioned as a starting point of the work of Vitelli. This system allows Bose-Einstein condensates (BECs) at room temperature but an energy feed is required. This is something totally new. TGD predicts forced Bose-Einstein condensates, and I have discussed polariton-exciton BECs as an example.
The topic is highly interesting from the TGD point of view for several reasons.
  1. The notion of the magnetic body (MB) appears as a third system in non-reciprocal situations and quite concretely can lead to small apparent violations of energy and momentum conservation. These violations are small because the MB uses energy only for control purposes; the biological body does the hard work.
  2. Number theoretic TGD predicts a hierarchy of Planck constants. The MB carries heff>h phases. This means a larger algebraic complexity, a kind of IQ, and makes it the "boss". Also the longer length scale of quantum coherence, typically proportional to heff, implies this. The energy of a particle increases with heff, and one must have a metabolic energy feed to prevent the heff distribution from flattening by spontaneous reductions of heff values. The formation of bound states can however compensate for the increase of energy when heff is increased.

    Bound state formation could be universally based on this, and one ends up with a quite concrete proposal for how bound states are formed as what I call Galois singlets. The 4-momenta of fundamental fermions are algebraic integers for a given extension of rationals labelling the space-time region, and Galois confinement says that the bound states have integer valued 4-momenta: this guarantees periodic boundary conditions.

  3. In the TGD framework, the hierarchy of heff phases behaving like dark matter predicts that driven superconductivity (and various BECs) is possible. Cooper pairs and also charges with heff>h give rise to non-dissipating supra currents at the MB. The problem is that heff is reduced spontaneously. For Cooper pairs the binding energy stabilizes the pairs since the energy of the pair drops below the energy of free charges. This works below the critical temperature. Above the critical temperature one can feed energy to the system so that the equilibrium becomes a flow equilibrium. This applies to various Bose-Einstein condensates, in particular the polariton-exciton condensate.
  4. ZEO predicts that in an ordinary ("big") state function reduction time reversal occurs. This solves the basic problems of quantum measurement theory but also forces one to generalize thermodynamics and leads to a new view about non-equilibrium systems, since time reversal means that dissipation occurs in the reverse time direction for a subsystem and looks like self-organization for an outsider.

    In particular, one must give up the idea of stable equilibrium states as energy minima. If the subsystem is ending up in such a state, it can make a BSFR changing the arrow of time and, from the point of view of the outsider, starts to extract energy from the environment. The Negentropy Maximization Principle (NMP) forces this since in thermal equilibrium information does not increase anymore. In biology, homeostasis is based on this.

  5. Singular points as analogs of critical points are mentioned in the article. At these points one cannot distinguish between the two phases. In the TGD framework quantum critical points are points at which long range fluctuations are possible, and they correspond to large values of the effective Planck constant heff at the MB of the system labelling phases behaving like dark matter. The phase transition creating these phases means that conservation laws are apparently violated. This provides a test for the TGD vision.
  6. Information itself is a central notion missing from standard physics. Number theoretic physics involving p-adic number fields provides correlates for cognition, and the formal p-adic analog of entropy can be negative and is interpreted as a measure for the information associated with the entanglement of 2 systems (2-particle level), whereas ordinary entropy is related to the loss of information about either entangled state (1-particle level). The sum of the two Shannon entropies is by NMP non-negative and increases as the dimension of the extension of rationals increases. This implies evolution as an increase of algebraic complexity, of information sources, and of quantum coherence scales.
See the article TGD as it is towards end of 2021 and the chapters of the book TGD and Condensed Matter. For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Thursday, November 11, 2021

Humans are different

The realization that saltation as conduction over the myelinated portions of the axon is still a poorly understood phenomenon inspired a careful reanalysis of the earlier TGD inspired visions of nerve pulse conduction, EEG and the brain, based on the new view about space-time, the notion of the magnetic body carrying heff>h phases behaving like dark matter, and the zero energy ontology (ZEO) based quantum measurement theory extending to a theory of consciousness.

The TGD view replaces the nerve pulse with a wave assignable to a generalized Josephson junction formed by the lipid layers of the cell membrane, for which the Josephson frequency fc is replaced by the sum fc+Δ Ec, where Δ Ec corresponds to the difference between the cyclotron frequencies for transversal flux tubes at the two sides of the axon.

What propagates is a deviation of the membrane potential below the critical value for the generation of an action potential. There would be no action potential in the myelinated portions of the axon; it would be generated only in the non-myelinated portions of length about 1 μm, giving rise to chemical effects and also communicating a signal to the magnetic body if the notion of a generalized Josephson junction is accepted.

A test for this picture came from the popular article in Medicalxpress (see this) telling about a highly interesting observation described in the Nature article "Allometric rules for mammalian cortical layer 5 neuron biophysics" by Mark Harnett (see this).

The finding is that the density of voltage gated channels in the human brain is dramatically lower than in other mammalian brains.

  1. The neuronal system studied was layer 5 pyramidal neurons. Dendrites of these neurons were considered. Densities of voltage gated channels per neuron volume and per brain volume were studied. The ion channels studied were Na and K channels. The channels considered are ion pumps and need metabolic energy.

    10 mammalian species were studied so that cortical thickness and neuron size were the varying parameters. As the neuron size increases, the density of neurons decreases.

  2. The first finding was that the density of ion channels for the neuron increases as the neuron size increases. The density of ion channels per brain volume was however found to be constant.

    Humans were found to be an exception. The density of the channels per brain volume is dramatically reduced. The proposed interpretation is that this reduces the amount of metabolic energy needed to generate action potentials and the metabolic energy is used for other purposes.

Before continuing, it is good to recall some basic facts about neurons. Synapses, dendrites, and myelination are the basic notions needed if one tries to understand these findings. It is enough to notice that most synaptic contacts are from axons to dendrites but that almost any other combination is possible. Myelination occurs mostly for axons and only rarely for dendrites. The dendrites of the excitatory pyramidal cells studied in the article are profusely decorated with dendritic spines.

Could the TGD view about the brain and neuron allow us to interpret the difference between humans and other mammals? Why would the density of the voltage gated ionic channels be smaller for pyramidal dendrites? How could this relate to the evolutionary leap leading to the emergence of humans?

The TGD view about the neuron and brain allows us to consider two different but not mutually exclusive explanations for the finding.

  1. The spatial resolution of the percept produced at the MB by Josephson radiation would be reduced for humans. This need not be a drawback since it could also be understood as an abstraction. High spatial resolution would be needed only for local percepts in the scale of the neuron soma. On longer scales it would mean the generation of useless information and a waste of metabolic energy.

    The natural guess is that the resolution scale is proportional to ℏeff,B at intra-brain flux tubes, in turn proportional to ℏeff,MB for the flux tubes at the MB of the brain having quantal length scales much longer than the brain size. The range of variation of the spatial resolution could correspond to the variation of ordinary photon wavelengths between visible wavelengths (of order μm) and IR wavelengths of order 14.8 μm. Note however that the lengths of the myelinated portions are about 100 μm.

  2. Suppose that Josephson radiation patterns associated with the myelinated portions of axon define "sentences" and the unmyelinated portions define periods ending these "sentences" by a nerve pulse. Does the notion of "sentence" make sense also for dendrites?

    At least in the case of humans, having a reduced volume density of ion channels, this picture might generalize also to dendrites, which are usually unmyelinated: myelination is not needed since dendrites are typically short as compared to axons. If so, the average distance between two ion channels would define the length and duration of a "sentence".

    For mammals other than humans, the "sentences" would be very short or the notion of "sentence" would not make sense at all (the spatial extent of the perturbation of the membrane potential would be of the order of the wavelength of the soliton). Could this reflect the emergence of language in humans? MB would not only receive long "sentences" but also send them back as control commands inducing motor actions and virtual sensory input.

  3. If the communication between pre- and postsynaptic neurons occurs via the MB, dendrites would receive "sentences" from the MB of the presynaptic neuron as a feedback. If a generalized motor action is in question, BSFR and time reversal would be involved. The action potentials propagate along axons in a single direction, which would reflect a fixed arrow of time. Does the reversed arrow of time imply that the action potentials along dendrites propagate outwards from the cell body?

    According to Wikipedia (see this), dendrites indeed have the ability to send action potentials back into the dendritic arbor. Known as back-propagating action potentials, these signals depolarize the dendritic arbor and provide a crucial component toward synapse modulation and long-term potentiation. Furthermore, a train of back-propagating action potentials artificially generated at the soma can induce a calcium action potential (a dendritic spike) at the dendritic initiation zone in certain types of neurons.

  4. Dendrites are usually unmyelinated. This conforms with the fact that dendrites are much shorter than axons so that myelination is not needed. Myelination would also restrict the number of synaptic contacts. Myelinated dendrites have however been found in frog motoneurons (see this) and in the olfactory bulb (OB) of some mammals, for instance the mouse (see this). Their fraction is small.

    The olfactory system (OS) is very interesting in this respect since it represents the oldest part of the CNS. The axons from the nasal cavity to the olfactory bulb (OB), where odours are thought to be processed, are unmyelinated, as are the axons of invertebrates in general. The axons from the olfactory bulb (OB) to the olfactory cortex (OC) are myelinated. This conforms with the idea that OB corresponds to the oldest part of the OS. The TGD interpretation would be that OB sends the results of the analysis to OC via the MB as "sentences".

    OB can also have a small fraction of myelinated dendrites, implying a reduction in the number of synaptic contacts. The rule "A→B" → "A→ MB→ B" (a signal from A to B in the brain goes via the MB and involves a BSFR at the MB) suggests that there is an MB between the olfactory epithelium and OB and that some analysis is performed at the MB. If so, the myelinated dendrites would correspond to input from the MB as long "sentences".

See the article About TGD view of neuron or the chapter with the same title.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Tuesday, November 09, 2021

Does dark matter induce formation of more dark matter?

It has been proposed (see this) that dark matter could induce the formation of more dark matter. This would suggest that dark matter is a phase of ordinary matter and that its formation is a phase transition generated by a seed. This supports the TGD view about dark matter as phases of matter with effective Planck constant heff=nh0 having arbitrarily large values and behaving like dark matter: phases with different values of heff are dark relative to each other.
  1. The integer n corresponds to the dimension of an extension of rationals associated with a polynomial determining the space-time surface as a surface in M8, mapped to H=M4×CP2 by M8-H duality. n corresponds also to the order of the Galois group acting as a symmetry group.
  2. Galois confinement suggests a universal mechanism for bound state formation: physical states are composites of particles with momenta which are algebraic integers, while the components of the total 4-momentum would be ordinary integers by periodic boundary conditions (a toy numerical illustration follows this list). This mechanism also has a generalization: one has a Galois singlet wave function in the space of momenta with components which are algebraic integers.
  3. As a rule, particle energies increase with heff and the analog of "metabolic energy feed" is needed to prevent the reduction of heff to h. In living matter the function of metabolism is just this.
  4. The phase transitions increasing heff are possible in the presence of an energy feed. Bose-Einstein condensation and the formation of Cooper pairs (and even of states with a larger number of particles, such as the 4 electrons observed recently) could be examples of this. The binding energy of the composite would compensate for the energy needed to increase heff. Fermi statistics with algebraic integer valued momenta allows more Galois confined bound states with a given energy, therefore favoring the occurrence of the phase transition.
  5. Phase transitions quite generally have the property that a small seed induces the phase transition. This would predict that the presence of dark matter favors the emergence of more dark matter.
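A toy numerical illustration of the Galois confinement condition referred to in point 2: the momenta of the constituents are algebraic integers of a quadratic extension, while their sum is an ordinary integer. The extension Q(sqrt(2)) and the momentum values below are hypothetical choices made only for the illustration.

```python
import math

SQRT2 = math.sqrt(2)

# Toy momenta of two "dark" constituents, written as pairs (a, b) representing
# the algebraic integers a + b*sqrt(2) of Q(sqrt(2)) (hypothetical values).
constituents = [(1, 2), (2, -2)]   # 1 + 2*sqrt(2) and 2 - 2*sqrt(2)

total_a = sum(a for a, _ in constituents)
total_b = sum(b for _, b in constituents)

# Galois confinement for the composite: the total momentum must be an
# ordinary integer, i.e. the irrational part must cancel.
is_galois_singlet = (total_b == 0)
print(total_a + total_b * SQRT2, is_galois_singlet)   # 3.0 True
```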
See the article TGD as it is towards end of 2021.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Moon has possessed a magnetic field

The surprise of yesterday (see this) was that the Moon has had a magnetic field, which at the surface of the Moon has been of the same order of magnitude as the magnetic field of Earth, about BE=.5 Gauss at the surface. The finding is deduced from the direction of the magnetization of the material at the surface. The same method is used to deduce information about the magnetic history of Earth.

The idea that the Moon would have a liquid interior carrying a net current, which by the dynamo effect would create the magnetic field, looks rather implausible. This proposal has problems also in the case of Earth. The problem is that the current creating the magnetic field should have dissipated a long time ago. The magnetic field of Earth however stubbornly continues to exist. The same problem is encountered with magnetic fields in cosmic scales: they should not exist since in standard cosmology the currents would be short ranged.

At the microscopic level, TGD replaces magnetic fields with magnetic flux quanta, flux tubes and flux sheets. The flux tubes can be monopole flux tubes, in which case they are stable and no current is needed to preserve the field. This is crucial. The cross section of a monopole flux tube is a closed 2-surface, which requires a non-trivial space-time topology. The second kind of flux tubes have a vanishing monopole flux and correspond to Maxwellian magnetic fields requiring a current.

In the case of Earth, a good guess for the strength of the monopole contribution would be about .2 Gauss, roughly 2BE/5, from the experiments of Blackman et al, which led to the notion of dark matter as heff>h phases at magnetic flux tubes. This field would play a key role in TGD inspired quantum biology, but this value would not be the only value possible.

This leads to a model for the maintenance of BE. When the non-monopole part of BE becomes weak enough, the magnetic body (MB) of Earth turns and induces currents re-creating the induced part (see this).

Could the monopole part of the magnetic field at monopole flux tubes play a role analogous to the field H, which induces a magnetization M cancelling the total field B=H+M in the case of diamagnets (for an ideal diamagnet M=-H so that B vanishes)? H and M would reside at different space-time sheets, but their effects on test particles touching all the sheets would sum up, and at the QFT limit B would be the detected effective field, vanishing for diamagnets.

Superconductors are diamagnetic. This is usually explained in terms of the Meissner effect. TGD however leads to a model of superconductivity in which supra currents are carried by heff>h phases at flux tubes, presumably monopole flux tubes. Could magnetic fields actually penetrate superconductors as monopole flux tubes (or sheets) with quantized flux, inducing a magnetization cancelling the total effective field at the quantum field theory limit, which is the sum over the fields at different space-time sheets as far as its effects are considered (see this)?

Venus is in many respects a twin of Earth but does not have a detectable magnetic field. Also Mars seems to have no global magnetic field but has auroras and local magnetic fields. This inspires crazy questions. Could Venus be a diamagnet? Could the magnetic bodies of Venus, Mars and also the Moon be superconductors in the scale of the entire object? But why would the MB of Earth not be able to cancel the total field? Could the rotating liquid core induce an additional field, which prevents this?

See the articles Empirical support for the Expanding Earth Model and TGD view about classical gauge fields and TGD as it is towards end of 2021.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Monday, November 08, 2021

Superdeterminism and TGD

Gary Ehlenber sent me an article by Sabine Hossenfelder and Tim Palmer (see this). The article seems like a good collection of arguments for and against superdeterminism.

When I encounter this kind of proposal, I ask a simple question. What new phenomena are predicted and what anomalies does the new approach solve? In the case of superdeterminism, the list of this kind of phenomena is very short. Therefore superdeterminism looks to me like an attempt to return to the good old days before quantum theory and to save the materialistic/physicalistic world view, implying that the notions of ethics and morality are illusions.

It must be added that the entire theoretical physics community, and also the community of rebels, suffers from the same conservatism; superdeterminism represents only an extreme example of it.

In my view, one should start from where we are now and try to see what in our conceptual landscape is wrong and what new notions and ideas are needed.

To me, the only way forward is to accept non-determinism and the basic paradox of quantum measurement theory without attempts to "interpret", and to ask a simple question: What goes wrong in our ontology? Does it really make sense to give up the entire ontology as the Copenhagen interpretation suggests?

There are many deep problems besides the measurement problem.

  1. Is our view about time somehow wrong? Should we distinguish between the causation of classical physics and that of free will? We experience free will directly: should we accept it as real and perhaps assign it to the quantum jump?
  2. Is the assumption about a fixed arrow of time correct? We know that experienced time and geometric time are different. Should we accept this also as theoreticians?
  3. Does physics really become classical and deterministic on some scale? Does it really do so, or could this be an illusion due to a wrong ontology?
  4. Is deterministic classical physics an exact part of quantum theory? After all, every quantum measurement is interpreted in terms of classical correlates.
  5. Does the mysterious entanglement have classical, geometric space-time correlates? ER-EPR correspondence could be interpreted in this manner.
  6. There are also the notions of wave-particle duality and position-momentum duality: do we really understand them? Position-momentum duality is lost in quantum field theory since coordinates as dynamical variables become parameters. Shouldn't we worry about this?
TGD allows us to answer these questions.
  1. The particle as a point-like entity is not only a source of divergence problems but also suggests local realism, which prevents classical space-time correlates for the notion of entanglement. In TGD, particles as 3-surfaces solve the divergence problem, and the new view about classical fields as surfaces leads to the notion of the field/magnetic body (MB). Flux tubes connecting particle-like 3-surfaces serve as space-time correlates/prerequisites of entanglement. Flux tubes replace wormholes in the ER-EPR correspondence. Many-sheeted space-time is a closely related second new notion.

    MB carrying dark matter as heff>h phases brings in a totally new level to the description and has a key role in biology. heff phases emerge from a generalization of physics: the number theoretic vision and the geometric view of physics are dual, and this duality actually generalizes the position-momentum duality lost in quantum field theories.

  2. The measurement problem producing myriads of interpretations is the key problem. Here our notion of time is the source of problems. Despite the obvious differences between experienced time and geometric time, we stubbornly continue to identify them. A second stubborn assumption is that the arrow of time is fixed, despite the fact that in self-organization the arrow of time effectively changes. The standard explanation is in terms of non-equilibrium thermodynamics, but this might be only a part of the story, in particular in living matter.

    In practical quantum theory (quantum optics) one is also forced to introduce the notion of weak measurement. It has no real counterpart in the standard picture. It is analogous to a classical measurement: no dramatic changes occur.

    In the zero energy ontology based quantum theory, "big" and "small" state function reductions (BSFRs and SSFRs) emerge naturally. BSFR is the counterpart of an ordinary quantum measurement and changes the arrow of time. SSFR is the counterpart of a weak measurement and preserves the arrow of time. The experiments of Minev et al and those of Libet provide direct support for BSFR. BSFR also allows us to understand why physics looks classical, not only in long length scales, but always for a system which has an arrow of time opposite to that of the system monitored.

    BSFR makes possible dissipation with a reversed arrow of time looking like self-organization. The postulated extremely complex biological programs would be just dissipation with an opposite arrow of time, implied by a generalization of the second law. Homeostasis as a paradoxical ability to stay near (quantum) criticality would also have a trivial explanation.

    BSFR leads also to the vision about life and death as universal phenomena not limited to bio-chemical systems only.

  3. The number theoretic vision involving M8-H duality, generalizing position-momentum duality to the space-time level, leads to the notion of cognitive representation providing not only a unique discretization of the space-time surface but also correlates of cognition. The Galois group becomes a symmetry group, and quarks as fundamental fermions with 4-momenta which are algebraic integers form states with total 4-momenta whose components are ordinary integers by periodic boundary conditions. This Galois confinement could be behind the formation of bound states universally.
See the article TGD as it is towards end of 2021.

For a summary of earlier postings see Latest progress in TGD. Articles and other material related to TGD. 


Sunday, November 07, 2021

Could computable reals (p-adics) replace reals (p-adics) in physics?

For some reason I have managed not to encounter the notion of a computable number (see this) as opposed to that of a non-computable number (see this). The reason is perhaps that I have been too lazy to take computationalism seriously enough.

A computable real number is a number which can be produced to arbitrary accuracy by a Turing machine, which by definition has a finite number of internal states, takes a natural number as input and produces a natural number as output. A Turing machine computes the values of a function from natural numbers to natural numbers by applying a recursive algorithm.

The following three formal definitions of the notion are equivalent.

  1. The number a is computable if it can be expressed in terms of a computable function n→ f(n) from natural numbers to natural numbers characterized by the property

    (f(n)-1)/n ≤ a ≤ (f(n)+1)/n.

    For rational a=q, f(n)= ⌊nq⌋ (the integer part of nq) satisfies the conditions. Note that this definition does not work for p-adic numbers since they are not ordered.

  2. The number a is computable if for an arbitrarily small rational number ε there exists a computable function producing a rational number r satisfying |r-a|≤ ε. This definition works also for p-adic numbers since it involves only the p-adic norm, whose values are powers of p and therefore real.
  3. The number a is computable if there exists a computable sequence of rational numbers ri converging to a such that |a-ri| ≤ 2^-i holds true. This definition works also for 2-adic numbers, and its variant obtained by replacing 2 with the p-adic prime p makes sense for p-adic numbers.
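As a concrete sketch of definition 3, the following computes the computable real e (which appears in the list below as an example of a computable number) via a computable sequence of rationals ri with |e - ri| ≤ 2^-i. Exact rational arithmetic is used so that the error bound is guaranteed; the choice of e and of its series is purely illustrative.

```python
from fractions import Fraction
from math import factorial

def e_approx(i):
    """Rational r_i with |e - r_i| <= 2**(-i), using e = sum_k 1/k!;
    the remainder after N terms is bounded by 2/(N+1)!."""
    N = 1
    while Fraction(2, factorial(N + 1)) > Fraction(1, 2**i):
        N += 1
    return sum(Fraction(1, factorial(k)) for k in range(N + 1))

for i in (5, 20, 50):
    print(i, float(e_approx(i)))
```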
The set Rc of computable real numbers and its p-adic counterparts Qp,c have highly interesting properties.
  1. Rc is enumerable and therefore can be mapped to a subset of rationals: even the ordering can be preserved. Also Qp,c is enumerable but now one cannot speak of ordering. As a consequence, most real (p-adic) numbers are non-computable. Note that the pinary expansion of a rational is periodic after some pinary digit. For a p-adic transcendental this is not the case.
  2. Algebraic numbers are computable, so that one can regard Rc as a kind of completion of the algebraic numbers obtained by adding computable reals. For instance, π and e are computable. 2π can be computed by replacing the unit circle with a regular polygon with n sides and estimating the circumference as nLn, where Ln is the length of a side. e can be computed from the standard formula. Interestingly, e^p is an ordinary p-adic number. An interesting question is whether there are other similar numbers. Certainly many algebraic numbers correspond to ordinary p-adic numbers.
  3. Rc (Qp,c) is a number field since the arithmetic binary operations +, -, ×, / are computable. Also differential and integral calculus can be constructed. The calculation of a derivative as a limit can be carried out by restricting the consideration to computable reals and there is always a computable real between two computable reals. Also Riemann sum can be evaluated as a limit involving only computable reals.
  4. An interesting distinction between real and p-adic numbers is that in the sum of real numbers the sum of arbitrarily high digits can affect even all lower digits, so that it requires computational work to predict the outcome. For p-adic numbers the memory digits affect only the higher digits. This is why p-adic numbers are tailor made for computational purposes. The canonical identification ∑ xn p^n → ∑ xn p^-n, used in the p-adic mass calculations to map the p-adic mass squared to its real counterpart (see this), maps p-adics to reals in a continuous manner (see the sketch after this list). For integers this correspondence is 2-to-1 due to the fact that the p-adic numbers -1= (p-1)/(1-p) and 1/p are both mapped to p.
  5. For computable numbers, one cannot define the relation =. One can only define equality in some resolution ε. The category theoretical view about equality is also effective and conforms with the physical view.

    Also the relations ≤ and ≥ fail to have computable counterparts since only the absolute value |x-y| can appear in the definition and one loses the information about the ordered nature of the reals. For p-adic numbers there is no ordering so that nothing is lost. A restriction to non-equal pairs however makes the order relation computable. For p-adic numbers the same is true.

  6. A computable number is obviously definable but there are also definable numbers which are not computable. Examples are Gödel numbers in a given coding scheme for statements which are true but not provable. More generally, the Gödel numbers coding for undecidable problems such as the halting problem are uncomputable natural numbers in a given coding scheme. Chaitin's constant, which gives the probability that a random Turing computation halts, is a non-computable but definable real number.
  7. Computable numbers are arithmetic numbers, which are numbers definable in terms of first order logic using Peano's axioms. First order logic does not allow statements about statements and one has an entire hierarchy of statements about... about statements. The hierarchy of infinite primes defines an analogous hierarchy in the TGD framework and is formally similar to a hierarchy of second quantizations (see this).
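A minimal sketch of the canonical identification of point 4, applied to truncated pinary expansions; it reproduces the 2-to-1 property mentioned there (both -1 and 1/p are mapped to p). The digit conventions and the function itself are illustrative choices, not code from the p-adic mass calculations.

```python
from fractions import Fraction

def canonical_id(digits, p, n0=0):
    """Canonical identification sum_k x_k p^(n0+k) -> sum_k x_k p^(-(n0+k)).
    digits: pinary digits x_k in {0, ..., p-1}, lowest power first
    (a finite truncation of the p-adic expansion)."""
    return sum(Fraction(x) * Fraction(p) ** (-(n0 + k)) for k, x in enumerate(digits))

p = 3
# Truncation of -1 = (p-1)*(1 + p + p^2 + ...): every pinary digit equals p-1.
print(float(canonical_id([p - 1] * 15, p)))   # approaches p = 3
# 1/p has a single digit 1 at the power -1, so it maps to p^(+1) = p exactly.
print(float(canonical_id([1], p, n0=-1)))     # 3.0
```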
See the article MIP*= RE: What could this mean physically? or the chapter Evolution of Ideas about Hyper-finite Factors in TGD.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD. 


Tuesday, November 02, 2021

Does consciousness survive bodily death?

The Bigelow Institute organized an essay competition. The question of the essay was: What is the best possible evidence for the survival of human consciousness after bodily death? I wrote an essay, or rather a research article of almost 100 pages, with the generous help of my friend Paul Kirsch, who also proposed that I should participate. It was not surprising that I did not become a half-millionaire. The concepts and ideas certainly went completely over the heads of the respected jury.

It is very difficult to provide watertight evidence for life after death since near-death experiences are subjective and do not provide objective proof.

The situation changes if one has a testable theory of consciousness. The theory of consciousness presented here is inspired by Topological Geometrodynamics (TGD). TGD was born as a proposal for a unification of fundamental interactions, and indeed provides a general theory of consciousness as a generalization of quantum measurement theory predicting that consciousness, life and death are universal phenomena. The theory relies on new views of space-time and classical fields, and provides a new ontology behind quantum theory that predicts that state function reduction involves time reversal.

The proposed hypothesis forces a new view of the relationship between experienced time and the physicist's time, and generalizes thermodynamics so that the second law is replaced with what I call the Negentropy Maximization Principle. Also cognition is included, and this forces the extension of real number based physics to adelic physics including not only the reals but also the p-adic number fields. Adelic physics predicts a hierarchy of phases of ordinary matter with a non-standard value heff of the Planck constant, interpreted as dark matter, which for large values of heff is quantum coherent at arbitrarily long scales. The theory makes testable predictions at all scales supporting the proposed view of the continuation of life beyond biological death. A model for what happens in biological death and an explanation for various aspects of near-death experiences emerges.

See the article Does consciousness survive bodily death? or a chapter with the same title.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.