https://matpitka.blogspot.com/2013/11/

Friday, November 29, 2013

NMP and Consciousness

Alexander Wissner-Gross, a physicist at Harvard University and the Massachusetts Institute of Technology, and Cameron Freer, a mathematician at the University of Hawaii at Manoa, have developed a theory that they say describes many intelligent or cognitive behaviors, such as upright walking and tool use (see this and this). The basic idea of the theory is that an intelligent system collects information about a large number of histories and preserves it. Thermodynamically this means large entropy, so that the evolution of intelligence would rather paradoxically be the evolution of highly entropic systems. According to the standard view about Shannon entropy, the transformation of entropy to information (or the reduction of entropy to zero) requires a process selecting one of the instances of a thermal ensemble with a large number of degenerate states, and one can wonder what this selection process is. This sounds almost like a paradox unless one accepts the existence of this process. I have considered the core of this almost-paradox in the TGD framework already earlier.

According to the popular article (see this), the model does not require an explicit specification of intelligent behavior: the intelligent behavior relies on "causal entropic forces" (here one can counter-argue that the selection process is necessary if one wants information gain). The theory requires that the system is able to collect information and predict future histories very quickly.

The prediction of future histories is one of the basic characteristics of life in the TGD Universe, made possible by zero energy ontology (ZEO), which predicts that the thermodynamical arrow of geometric time is opposite for the quantum jumps reducing the zero energy state at the upper and lower boundaries of the causal diamond (CD) respectively. This prediction means quite a dramatic deviation from standard thermodynamics but is consistent with the notion of syntropy introduced by the Italian theoretical physicist Fantappiè already more than half a century ago, as well as with the reversed time arrow of dissipation appearing often in living matter.

The hierarchy of Planck constants makes possible negentropic entanglement and genuine information represented as negentropic entanglement in which the superposed state pairs have an interpretation as instances a_i ↔ b_i of a rule A ↔ B: apart from a possible phase, the entanglement coefficients all have the same value 1/n^{1/2}, where n = heff/h defines the value of the effective Planck constant and the dimension of the effective covering of the imbedding space. This picture generalizes also to the case of multipartite entanglement but predicts similar entanglement for all divisions of the system into two parts. There are however still some questions which are not completely settled and leave some room for imagination.

  1. Negentropic entanglement is possible in the discrete degrees of freedom assignable to the n-fold covering of the imbedding space, allowing one to describe the situation formally. For heff/h = n one can introduce SU(n) as a dynamical symmetry group and require that n-particle states are singlets under SU(n). This gives rise to n-particle states constructed by contracting the n-dimensional permutation symbol with many-particle states assignable to the factors of the covering. The spin-statistics connection might produce problems - at least it is non-trivial - since one possible interpretation is that the states carry fractional quantum numbers, in particular fractional fermion number and charges.

  2. Is negentropic entanglement possible only in the new covering degrees of freedom or is it possible in more familiar angular momentum, electroweak, and color degrees of freedom?

    1. One can imagine that also states that are singlets with respect to the rotation group SO(3) and its covering SU(2) (2-particle singlet states constructed from two spin-1 states and the spin singlet constructed from two fermions) could carry negentropic entanglement. The latter states are especially interesting biologically.

    2. In the TGD framework all space-time surfaces can be seen as at least 2-fold coverings of M4 locally, since boundary conditions do not seem to allow 3-surfaces with spatial boundaries, so that the finiteness of the space-time sheet requires a covering structure in M4. This forces one to ask whether this double covering could provide a geometric correlate for fermionic spin 1/2, as suggested by quantum classical correspondence taken to the extreme. Fermions are indeed fundamental particles in the TGD framework and it would be nice if also 2-sheeted coverings would define fundamental building bricks of space-time.

    3. The color group SU(3), for which color triplets define singlets, can also be considered. I have even been wondering whether quark color could actually correspond to a 3-fold or 6-fold (color isospin corresponds to SU(2)) covering, so that quarks would be dark leptons, which correspond to n=3 coverings of CP2 and to fractionization of hypercharge and electromagnetic charge. The motivation came from the inclusions of hyper-finite factors of type II_1 labelled by integers n ≥ 3. If this were the case then only the second H-chirality would be realized and leptonic spinors would be enough. What this would mean from the point of view of separate B and L conservation remains an open and interesting question. This kind of picture would allow one to consider an extremely simple genesis of matter from right-handed neutrinos only.

      There are two objections against this naive picture. The fractionization associated with heff should be the same for all quantum numbers, so that different fractionizations for color isospin and color hypercharge do not seem to be possible. One can of course ask whether the different quantum numbers could be fractionized independently and what this could mean geometrically. The second, really lethal-looking objection is that fractional quark charges involve also a shift of the em charge, so that the neutrino does not remain neutral: it becomes the counterpart of the u quark.


Negentropy Maximization Principle (NMP) also resolves the above mentioned almost-paradox relating entropy to intelligence. I have proposed an analogous principle relying on the generation of negentropic entanglement and replacing entropy with a number theoretic negentropy obeying a modification of the Shannon formula, with the p-adic norm |p_i|_p of the probability appearing in the logarithm log(|p_i|_p). The formula makes sense for probabilities which are rational or in an algebraic extension of rational numbers and requires that the system is in the intersection of real and p-adic worlds. Dark matter with an integer value of Planck constant, heff = nh, predicts rational entanglement probabilities: their values are simply p_i = 1/n since the entanglement coefficients define a diagonal matrix proportional to the unit matrix. Negentropic entanglement makes sense also for n-particle systems.
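
To make the formula concrete, here is a minimal sketch (my own illustration, not from the text) for the maximally entangled case p_i = 1/n: for each prime p the negentropy -S_p = -Σ p_i log(|p_i|_p) equals log(p^k), where p^k is the largest power of p dividing n, so the maximizing prime is the one whose power is the largest prime power divisor of n.

```python
# A minimal sketch (my own illustration) of the number-theoretic negentropy for the
# maximally entangled probabilities p_i = 1/n discussed above. The p-adic norm of
# 1/n is p^k with p^k the largest power of p dividing n, so the negentropy
# -S_p = -sum_i p_i log(|p_i|_p) equals log(p^k); NMP picks the prime maximizing it.
from math import log

def padic_valuation(m, p):
    """Largest k with p^k dividing m."""
    k = 0
    while m % p == 0:
        m //= p
        k += 1
    return k

def negentropy(n, p):
    """-S_p for the n probabilities p_i = 1/n; |1/n|_p = p**padic_valuation(n, p)."""
    norm = float(p) ** padic_valuation(n, p)
    return sum((1.0 / n) * log(norm) for _ in range(n))

n = 12                                   # heff/h = n, entanglement probabilities 1/12
primes_of_n = [p for p in range(2, n + 1) if n % p == 0 and all(p % q for q in range(2, p))]
best = max(primes_of_n, key=lambda p: negentropy(n, p))
print("Shannon entropy log(n)       :", round(log(n), 4))               # 2.4849
print("best p-adic prime            :", best)                           # 2, since 4 divides 12
print("number-theoretic negentropy  :", round(negentropy(n, best), 4))  # log(4) = 1.3863
```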

Negentropic entanglement corresponds therefore always to an n×n density matrix proportional to the unit matrix: this means maximal entanglement and maximal number theoretic entanglement negentropy for two entangled systems with n entangled states. n corresponds to the Planck constant heff = n×h, so that a connection with the hierarchy of Planck constants is also obtained. The power of the p-adic prime maximizing the negentropy is the largest prime power divisor of n. Individually the negentropically entangled systems would be very entropic since there would be n energy-degenerate states with the same Boltzmann weight. Negentropic entanglement changes the situation: thermodynamics of course does not apply anymore. Hence TGD produces the same prediction as the thermodynamical model but avoids the almost-paradox.
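
A small numerical check of the statement above (my own sketch; n = 4 is just an illustrative value): the two-particle state with entanglement coefficients 1/√n indeed has a reduced density matrix equal to the n×n unit matrix divided by n.

```python
# A small numerical illustration (my own, based on the description above): the
# negentropically entangled state with coefficients 1/sqrt(n) has a reduced
# density matrix proportional to the n x n unit matrix.
import numpy as np

n = 4                                          # n = heff/h, illustrative value
basis = np.eye(n)                              # |a_i>, |b_i> represented as basis vectors
psi = sum(np.kron(basis[i], basis[i]) for i in range(n)) / np.sqrt(n)

rho = np.outer(psi, psi.conj())                # full two-particle density matrix
rho_A = rho.reshape(n, n, n, n).trace(axis1=1, axis2=3)   # partial trace over system B

print(np.allclose(rho_A, np.eye(n) / n))       # True: maximally (negentropically) entangled
```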

For details and background see the section "Updates since 2012" of chapter "Negentropy Maximization Principle" of "TGD Inspired Theory of Consciousness".

Thursday, November 28, 2013

Psychedelic-induced experiences and magnetic body

There is a book about psychedelic-induced experiences titled "Inner paths to outer space", written by Rick Strassman, Slawek Wojtowicz, Luis Eduardo Luna and Ede Frecska. It took some time to realize that I have actually met Luna and Frecska.

Some background about psychedelics

Psychoactive drugs can be classified into three basic types. Some raise the activity level, some calm it down, and some change the character of consciousness profoundly. Hallucinogens/psychedelics belong to the third group. Psychedelics (such as psilocin, psilocybin, DMT, and LSD) are molecules containing aromatic rings, and many of them attach to serotonin receptors.

As the official term "hallucinogens" implies, psychedelic-induced experiences are regarded as hallucinations in the materialistic world view, although the denial of the reality of the subjective experiences themselves requires a really hard-nosed skeptic. The title of the book reveals that the question posed in the book is whether these experiences could be about the real world, a kind of sensory input from distant parts of the Universe. The indigenous people using ayahuasca and similar psychedelics have regarded these experiences, involving meetings with representatives of other civilisations, as perceptions about real worlds. Also Terence and Dennis McKenna, who are pioneers of the systematic study of the effects of various psychedelics, shared this view. In the materialistic ontology of standard physics this kind of interpretation is of course excluded. That hallucinations are in question is "obvious" - too obvious actually!

The classical psychedelics LSD, psilocybin, DMT, and mescaline are derivatives of two basic chemical groups: tryptamine and phenethylamine, which in turn derive from the amino acids trp and phe. DMT is an endogenous psychedelic and it is actively pumped through the blood-brain barrier, so DMT must have an important function in the brain. The classical psychedelics act on serotonin receptors and affect the serotonin uptake so that the effect is stronger. The amount of serotonin in the brain is not large but it manifests in all brain functions. According to the book, the situation might be different for the pineal gland, which is the only nucleus of the brain that does not appear as a left-right pair. Descartes regarded it as the seat of the soul. The pineal gland is also known as the "third eye" and in lower species it indeed serves the function of an eye. Taking the title of the book seriously, one can ask whether this eye is able to see, but to cosmic distances, possibly using large heff photons.

Could instantaneous communications in cosmic scales be possible in TGD Universe?

In the TGD inspired ontology the notion of the magnetic body with astrophysical, galactic or even super-galactic size changes the situation completely. The basic communication tool would be the touching of magnetic bodies generating reconnections and making possible signalling from the biological body of a member of a distant civilization. The perception of the biological body of an alien would differ in no manner from that of my neighbour, since the mechanisms would be the same as those involved in the transfer of sensory data to my personal magnetic body and of control commands from there to the biological body (at least through the genome).

The basic objection against the possibility suggested by the title of the book is that the finite light velocity poses an absolute upper bound on the distance of objects with which it is possible to be in contact during a "trip". One must however be very cautious here: the assumption that signals propagate only in a single direction of time is essential as well, and it derives from classical thermodynamics. In the TGD framework the second law continues to hold true, but the arrow of geometric time for zero energy states changes in each state function reduction occurring at either boundary of the CD. Hence instantaneous communications ("remote seeing"!) using reflection in the time direction become possible even over cosmological distances and define among other things the mechanism of memory in the TGD Universe.

I have proposed earlier that UFO experiences could be induced with the mediation of primitive plasmoid-like lifeforms taking the role of mediums making possible flux tube connections and entanglement with representatives of distant civilisations. Time consuming and expensive space travel would become unnecessary: our magnetic body giving us cosmic size, together with zero energy ontology making possible instantaneous "seeing" of both future and past by reflection of photons in the time direction, would be enough. This view would also resolve the Fermi paradox. We would actually be in continual contact with the distant civilisations but without realizing it. Similar contacts could take place in psychedelic-induced experiences. Memories and future plans would be examples of "seeing" in the time direction. The continual re-creation of the Universe by quantum jumps would of course mean that the actual future/past need not be the same as those which are "seen". Shamans identify various plants as conscious entities teaching them - in the TGD framework this would translate to magnetic bodies of representatives of distant civilisations remotely teaching the representatives of more primitive civilisations.

If the claims of shamans are true, one must ask whether our brain has well developed tools available for building contacts with distant civilisations and what these tools might be. The receptors of neural transmitters are obviously the natural candidates for the pathways to the cosmos. One can argue that evolutionary pressures have forced living matter to develop highly standardized connections to various parts of the biological body, of the personal magnetic body and possibly also of other magnetic bodies. The personal magnetic body has astrophysical size and EEG frequencies would correspond to communications in the Earth size scale.

Neurotransmitter receptor complexes as plug-ins to the cosmic internet and a new perspective on remote seeing

If the claims of shamans are true, one must ask whether our brain has well developed tools available for building contacts with distant civilisations and what these tools might be. The receptors of neural transmitters are obviously the natural candidates for the pathways to the cosmos. One can argue that evolutionary pressures have forced living matter to develop highly standardized connections to various parts of the personal magnetic body and possibly also of other magnetic bodies. Receptors serving as Josephson junctions emitting Josephson radiation with a frequency characterised by heff are natural candidates for such plug-ins.

The model of the cell membrane as a Josephson junction leads at the microscopic level to the view that the proteins associated with various ion pumps, channels, and receptors (also of neurotransmitters in the postsynaptic junction) define Josephson junctions with which magnetic flux tubes are associated, characterized by the local value of the Josephson frequency, that is by the membrane potential and the Planck constant heff. As an information molecule is attached to a receptor, a connection to some part of some magnetic body would be generated, and it would be split when the molecule is no longer present. These connections are possible in the scale of cell, organelle, organ, organism, population and maybe even in the scale of the cosmos. Psychedelics affect serotonin receptors so that serotonin spends a longer time in the receptor.

The simplest picture is that the connection corresponds to a pair of flux tubes. When the connection is broken, the pair has suffered a reconnection cutting it into two U-shaped closed flux tubes. When the molecule is attached to the receptor, these U-shaped closed flux tubes reconnect. The actual situation is of course expected to be more complex, but the basic principle would be this.

In this framework psychedelics would be molecules associated with the reaction pathways facilitating the binding of the neurotransmitters responsible for building connections to certain parts of the magnetic bodies, characterised by passwords defined by collections of cyclotron frequencies corresponding to a hierarchy of space-time sheets. The Josephson frequency associated with the receptor is inversely proportional to heff. The natural guess is that it corresponds to the cyclotron frequency of the magnetic body part for an electron, proton, or some ion associated with it. The Josephson frequency should serve as a kind of password, and the receptors would be in one-to-one correspondence with these passwords defining gateways even to the outer space if the value of Planck constant is large enough.
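
A back-of-the-envelope sketch of this matching (my own numbers and assumptions: the standard Josephson relation f_J = 2eV/h scaled to f_J = 2eV/heff, a membrane potential of about 70 mV, and Bend = 0.2 Gauss): requiring the scaled Josephson frequency to equal the electron cyclotron frequency fixes heff/h to be of order 10^7-10^8.

```python
# A back-of-the-envelope sketch (my own numbers and assumptions): take the standard
# Josephson relation f_J = 2eV/h, scale it to f_J = 2eV/h_eff, and ask how large
# h_eff/h must be for f_J to match the electron cyclotron frequency in B_end.
import math

e, h, m_e = 1.602e-19, 6.626e-34, 9.109e-31   # SI units
V = 0.07                                       # membrane potential ~70 mV (assumption)
B_end = 0.2e-4                                 # endogenous field 0.2 Gauss in Tesla

f_J = 2 * e * V / h                            # Josephson frequency for h_eff = h
f_c = e * B_end / (2 * math.pi * m_e)          # electron cyclotron frequency in B_end

print(f"f_J for h_eff = h : {f_J:.2e} Hz")     # ~3.4e13 Hz
print(f"f_c (electron)    : {f_c:.2e} Hz")     # ~5.6e5 Hz
print(f"required h_eff/h  : {f_J / f_c:.1e}")  # ~6e7
```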

One might be able to test this crazy hypothesis. The pineal gland could still serve as the "third eye", but one utilizing large heff photons. Fishes and birds are able to navigate to their birth places: could this miracle involve "remote seeing" by the pineal gland using the dark light coming along flux tubes, or even an active variant of this process in which light is sent and reflected back in the time direction (blue and red light are necessary for the ability to navigate)? What about remote seeing in the usual sense: could psychedelics help also in this process?

For details and background see the chapter "TGD Inspired view about remote mental interactions and paranormal" of "TGD based view about living matter and remote mental interactions".

Seth Lloyd on quantum life

The notion of quantum biology is becoming an accepted notion, although Wikipedia still contains nothing about its most important application (photosynthesis). I can be proud that I have been a pioneer of quantum biology for about two decades. TGD remains one of the very few theories leaving the realm of standard quantum theory and suggesting, besides a new view about space-time, a generalization of quantum theory involving in an essential manner a quantum theory of consciousness based on the identification of the quantum jump as a moment of consciousness. The new view about quantum theory involves a refined view about quantum measurement based on Negentropy Maximization Principle (NMP) [allb/nmpc] identified as the basic variational principle, and zero energy ontology (ZEO) replacing the ordinary positive energy ontology. The new view provides a new vision about the relationship between subjective time and geometric time, about the arrow of time, and about the second law.

The hierarchy of Planck constants has as its space-time correlate effective (or real, depending on the interpretation) n-sheeted coverings of the 8-D imbedding space (or space-time) with heff = nh defining the value of the (effective) Planck constant. p-Adic physics as the physics of cognition is an essential part of the theory and, together with the hierarchy of Planck constants, is closely related to the notion of negentropic entanglement characterizing living matter. Negentropic entanglement is maximal: in the two-particle case it involves the entanglement of n states characterized by an n×n unit matrix with n identified in terms of heff. Also maximal m-particle entanglement with 1 < m ≤ n is possible, and one can write explicit formulas for the entangled states, relating closely to the notion of exotic atom introduced earlier. The hierarchy of Planck constants is associated with dark matter, so that dark matter is what makes living matter living in the TGD Universe.

The concepts of many-sheeted space-time and topological field quantization imply that the concept of the field body (magnetic body) becomes a crucial new element in the understanding of living matter. Non-locality in even astrophysical scales becomes an essential piece of the description of living matter. Remote mental interactions making possible communication between biological and magnetic bodies become standard phenomena in living matter. The reconnection of magnetic flux tubes and phase transitions changing the value of heff, and thus changing the length of magnetic flux tubes, become a basic piece of biochemistry. Various macroscopic quantum phases, such as dark electronic Cooper pairs and dark protons and even ions, as well as Bose-Einstein condensates of various dark bosonic objects with a large value of heff, are also central. They are associated with magnetic flux bodies (magnetic flux tubes).

TGD implies a new, still developing, view about metabolism. The magnetic body as a carrier of metabolic energy and negentropic entanglement allows one to understand the deeper role of metabolism in a unified manner. The notion of the high energy phosphate bond assigned to ATP is one of the poorly understood notions of biochemistry. As a matter of fact, all basic biomolecules are carriers of metabolic energy liberated as they are broken down in catabolism. It is usually thought that the covalent bonds containing a shared valence electron pair between the atoms involved carry this energy and that the covalent bond reduces to standard quantum theory. TGD challenges this belief: a covalent bond could in the TGD framework correspond to a magnetic flux tube associated with the bond and having a considerably larger size than the distance between the atoms. A similar picture has already emerged earlier in the model of nuclei as strings with colored flux tubes connecting nucleons and having a length scale much longer than the nuclei [allb/nuclstring]: this model also explains [allb/padmass5] the puzzling observation that the protonic charge radius seems to be somewhat larger than predicted [bpnu/shrinkproton].

The metabolic energy quantum would be associated with a large heff valence electron pair and be identifiable as a cyclotron energy in the endogenous magnetic field, for which the pioneering experiments of Blackman suggest the value Bend = .2 Gauss as the first guess. Of course, an entire spectrum of values coming as power-of-two multiples of this field strength can be considered. This would require a rather high value of heff/h = n, of order 10^8. Reconnection of flux tubes would make it possible to transfer these electron pairs between molecules: actually a piece of flux tube containing the electron pair would be transferred in the process. This view allows one to unify the model of metabolism with the view of the DNA-cell membrane system as a topological quantum computer, with DNA nucleotides and lipids (or molecules assigned with them) connected by flux tubes.
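
A quick consistency check of the order-of-magnitude claim (my own arithmetic, with an assumed metabolic energy quantum of about 0.5 eV): the heff/h needed to lift the electron cyclotron energy in Bend = 0.2 Gauss to the metabolic quantum is indeed of order 10^8.

```python
# A quick consistency check (my own arithmetic): the h_eff/h needed for the electron
# cyclotron energy in B_end = 0.2 Gauss to reach an assumed metabolic energy quantum
# of about 0.5 eV.
import math

e, h, m_e = 1.602e-19, 6.626e-34, 9.109e-31   # SI units
B_end = 0.2e-4                                 # 0.2 Gauss in Tesla
E_quantum = 0.5 * e                            # ~0.5 eV in Joules (assumed value)

f_c = e * B_end / (2 * math.pi * m_e)          # cyclotron frequency, ~0.56 MHz
n = E_quantum / (h * f_c)                      # h_eff/h so that E = h_eff * f_c
print(f"required h_eff/h : {n:.1e}")           # ~2e8, i.e. of order 10^8 as in the text
```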

Seth Lloyd presents three examples of situations in which quantum biology seems to be a "must": photosynthesis, the navigation of birds, and odour perception. Photosynthesis represents the strongest and most quantitative support for quantum biology. Navigation and odour perception strongly suggest a quantum model but leave the details of the model open.

I have applied TGD to numerous situations over the years and also discussed simple TGD inspired models for all these three phenomena. The following represents briefly the core of Lloyd's talk and a comparison with TGD based views. I do not of course have access to the data and can present only a general vision rather than detailed numerical models. I share Lloyd's belief that quantum models provide the only manner to understand the data, although the models as such are not final. The authors of course want to publish their work and therefore cannot introduce explicitly notions like high temperature superconductivity, which I believe are crucial besides purely TGD based concepts. What is however good is that the models start from the data and just look at how to explain the data in a quantum approach. The data lead to assumptions which are not easy to defend in the framework of standard quantum theory. For instance, the presence of long-lived entangled pairs of electrons, or of an electron and a hole, with wave functions possessing a rather long coherence length and somehow isolated from entanglement-destroying interactions with the external world, emerges from the data. In TGD the large value of heff/h and the associated negentropic entanglement justify these assumptions.

Photosynthesis

The incredible effectiveness of the first step of photosynthesis after photon absorption [bbio/qharvesting] is one of the key points of Lloyd's talk. The organisms living deep under the surface of the ocean are able to gather their metabolic energy using only the visible photons of black body radiation, whose typical photon energy is much lower than the metabolic energy quantum. In human eyes there is even a mechanism preventing the detection of fewer than five photons at a time.

The first step of photosynthesis after the capture of the photon by light harvesting antenna proteins has been a long standing mystery, and here only a quantum mechanical approach seems to provide the needed understanding. The light harvesting antenna proteins can be visualized as small disk-like objects and are associated with a membrane-like structure, the so-called thylakoid membrane, similar to the cell membrane. The absorption creates what is known as an exciton, an electron-hole pair, which is most naturally a singlet. The photon has spin so that the exciton must have unit angular momentum. After its creation the electron of the exciton reaches the reaction centre by a random-walk-like process. From the reaction centre the process continues as a stepwise electron transfer process leading eventually to the chemical storage of the photon energy.

The capture of the photon occurs with some probability, and also the process continues from the reaction centre only with a probability of about 5 per cent. The process by which the exciton reaches the reaction centre is however amazingly effective: the efficiency is above 95 per cent. This is mysterious since for a classical random walk of the exciton between the chromophores the distance covered grows only as the square root of time, so the transfer time grows as the square of the distance measured as the number of neighbouring chromophores along the path.

The quantum proposal is that the exciton is a spin singlet state - this minimizes the interactions with photons - and performs a quantum (random) walk to the reaction centre. The model assumes only experimental data as input and all parameters are fixed; temperature remains the variable parameter. One can consider two extreme situations. In the low temperature limit the random walk tends to get stuck, since the external perturbations (mostly thermal photons) inducing the random walk process are not effective enough, and the quantum walk becomes so slow that the exciton decays before it reaches the reaction centre. In the high temperature limit the thermal perturbations destroy quantum coherence and a classical random walk results, so that the efficiency becomes essentially zero. There is a temperature range where the transfer efficiency is near unity and the time for reaching the reaction centre is relatively short. This range has room temperature as its midpoint.

If I have understood correctly, the model accepts as an experimental fact the rather long lifetime of the exciton, a few nanoseconds. In quantum-computerish terms this assumption translates to the statement that the exciton belongs to a decoherence-free subspace, so that external perturbations are not able to destroy the exciton too fast. The second assumption is that the exciton is de-localized over a ring-like structure with a size scale of 7 Angstroms (actually there are two rings of this kind, inner and outer, and the wave function is assumed to be rotationally symmetric for the inner ring). This de-localization increases the probability of transfer to a neighboring chromophore so that it is proportional to the square N^2 of the number N of chromophores rather than to N. The technical term expressing this is concatenated quantum code.
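
As a toy illustration of why the quantum walk wins (a minimal sketch of my own, not the authors' actual model; chain length, hopping rate and times are arbitrary choices): a continuous-time quantum walk on a chain spreads ballistically, roughly linearly in time, while the corresponding classical random walk spreads only diffusively, as the square root of time.

```python
# A minimal toy comparison (not the authors' actual model): ballistic spread of a
# continuous-time quantum walk on a chain versus diffusive spread of the classical
# random walk. Chain length, hopping rate and times are illustrative choices.
import numpy as np
from scipy.linalg import expm

N = 101                      # number of chromophore sites (illustrative)
center = N // 2
A = np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)   # nearest-neighbour hopping

def quantum_spread(t):
    """RMS displacement of a continuous-time quantum walk started at the central site."""
    psi0 = np.zeros(N, dtype=complex); psi0[center] = 1.0
    psi = expm(-1j * A * t) @ psi0
    prob = np.abs(psi) ** 2
    x = np.arange(N) - center
    return np.sqrt(np.sum(prob * x ** 2))

def classical_spread(t):
    """RMS displacement of the classical continuous-time random walk on the same chain."""
    L = A - np.diag(A.sum(axis=1))          # generator of the classical master equation
    p = expm(L * t) @ np.eye(N)[center]
    x = np.arange(N) - center
    return np.sqrt(np.sum(p * x ** 2))

for t in [1.0, 2.0, 4.0, 8.0]:
    print(f"t={t:4.1f}  quantum sigma={quantum_spread(t):6.2f}  classical sigma={classical_spread(t):6.2f}")
# Quantum sigma grows roughly linearly in t; classical sigma grows roughly as sqrt(t).
```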

A skeptic would probably claim that coherence and the stability of coherence are the weak points of the model. In the TGD framework the assumption that the electron-hole pair is negentropically entangled would guarantee its long lifetime. The reason is that NMP favors negentropic entanglement. Negentropic entanglement corresponds to entanglement associated with the n-sheeted effective covering of the imbedding space, and n has an interpretation in terms of the effective Planck constant heff = nh. The naive guess is that the coherence scale for the wave function of the exciton scales up by a factor n or √n. This entanglement need not have anything to do with spin but could relate to the large hbar.

I have earlier considered a slightly different proposal. Instead of the exciton, the negentropically entangled system would be a Cooper pair of dark electrons. Note that the negentropic entanglement need not relate to the spin but to the n-fold covering, although it could be assigned with spin too, in which case the state would be a spin singlet. The motivation came from the fact that the transfer of electrons to the reaction centre takes place as pairs (see this). The TGD inspired interpretation of the electron pair would be as a dark Cooper pair. Two electron pairs would come from the splitting of two water molecules to O2, 4 protons and two electron pairs, and they would end up at the P680 part of photosystem II (680 refers to the maximally absorbed wavelength in nanometers) and from there to P680* as two pairs. This mechanism would require that the Cooper pair absorbs the photon as a single particle. In the case of dark Cooper pairs this might be naturally true. If this requires an exchange of a photon between the members of the pair, the rate for this process is of the order α^2 lower.

Avian navigation

The second topic discussed by Seth Lloyd is avian navigation (see this). The challenge is to understand how birds (and also fishes) are able to utilize the Earth's magnetic field in order to find their way during migration. In some cases the magnetite in the beak of the bird guides the way along magnetic field lines by inducing a magnetic force, and the process can be understood at least partially. A consciousness theorist could of course wonder why these animals find their exact birth place year after year.

Robins however represent an example that is not so easy to understand. There are three input facts:

  1. Robins are able to detect the orientation of the Earth's magnetic field B_E but not its direction. They can also detect the angle between this orientation and the vertical to the Earth's surface and from this deduce also the direction of B_E.

  2. Blue or green light is necessary for the successful detection of the orientation.

  3. An oscillating em field with a frequency of order MHz makes the robins totally disoriented.

The only model that seems able to explain the findings is that long-lived entangled pairs of electrons are created by the photon provided its energy is high enough. For red light the energy is 2 eV, which is not yet quite enough. This suggests that the electrons originate from a pair of molecules or from atoms of a single molecule. It is not known what the molecules in question could be. The spins of the electrons of the pair precess in the magnetic field, and this is suggested to cause the decay of the pair; the second member (why not both?) of the pair would contribute to a current giving eventually rise to a nerve pulse pattern.

An entangled long-lived electron pair should be created, and the long lifetime is the problem. The proposed mechanism brings to mind the TGD based variant of the light harvesting mechanism of photosynthesis. Universality suggests that long-lived dark negentropically entangled Cooper pairs are generated in both cases, so that light harvesting is in question in both cases. These pairs, assignable to membrane structures in both cases, would in turn generate a supra current giving eventually rise to the generation of nerve pulses in the case of navigation and to the electron transfer process in the case of photosynthesis. If the same mechanism is involved in both cases, the extreme effectiveness of this light harvesting process could make it possible for the birds to navigate even in the dark. The electron has a cyclotron frequency of about 1.5 MHz in the Earth's magnetic field, and this makes it easy to understand why an oscillation with this frequency (resonance) induces disorientation by forcing the spinning of the dark Cooper pairs.
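
A quick check of the resonance claim (my own arithmetic; B_Earth = 0.5 Gauss is an assumed representative value): the electron cyclotron frequency in the Earth's magnetic field is indeed in the MHz range.

```python
# A quick check (my own arithmetic; B_Earth = 0.5 Gauss is an assumed representative
# value) of the claim that the electron cyclotron frequency in the Earth's magnetic
# field lies in the MHz range, matching the disorienting MHz perturbation.
import math

e, m_e = 1.602e-19, 9.109e-31
B_earth = 0.5e-4                               # ~0.5 Gauss in Tesla
f_c = e * B_earth / (2 * math.pi * m_e)
print(f"electron cyclotron frequency: {f_c/1e6:.2f} MHz")   # about 1.4 MHz
```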

Why should the energy of the photon creating the dark electron Cooper pair correspond to visible light? The cyclotron energy scale for the ordinary value of Planck constant is extremely small and corresponds to a frequency in the MHz range. For visible photons the frequency is by order of magnitude 10^8 higher. Does this correspond to the value of heff? A similar order of magnitude estimate follows from several premises. If the scaling of h by n corresponds roughly to the scaling of the p-adic scale by n^{1/2}, one would have a roughly 10^15-fold (effective) covering of the imbedding space, which looks rather science-fictive! For electrons this would imply a size of the order of cell size if the dark scale corresponds to the p-adic scale. If the electrons are originally in bound states with a binding energy of order eV, the value of heff could be much lower.

I smell the quantum

Quantum detection of odours was the third topic in Lloyd's talk. For decades it was believed that odour perception is based on a lock and key mechanism. Humans have 387 odour receptors and this would be the number of smells too. It has however turned out that humans can discriminate between about 10^4 smells, and Luca Turin and his wife have written a book giving a catalogue of all these smells. It is clear that the lock and key mechanism is correct but something else is needed in order to understand the spectrum of odours.

The key observation of Turin is that smell seems not to be purely chemically determined: it is different for molecules consisting of atoms differing only by the weight of the nucleus and thus being chemically identical. Therefore the vibrational spectrum of the molecule, which is typically in the infrared, seems to be important. The proposal of Turin is that the process of odour perception involves the tunnelling of the vibrating electron from the odour molecule. This tunnelling can be assisted by the absorption of a phonon coming from the receptor with a frequency which corresponds to the fundamental vibrational frequency or its multiple. The model has been tested in several cases. The latest test described by Lloyd is the one in which hydrogen in some molecule is replaced with deuterium, which is twice as heavy, so that the vibrational frequency is reduced by a factor 1/√2. Fruit flies took the role of odour perceivers and it turned out that they easily discriminate between the molecules.
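
A small numerical check of the isotope shift (my own numbers; the C-H bond is an illustrative choice): the naive 1/√2 rule holds when the heavy partner is infinitely massive, while using reduced masses for a C-H versus C-D oscillator gives a slightly smaller shift.

```python
# A small check (my own numbers): replacing H by D in an X-H oscillator lowers the
# vibrational frequency roughly by 1/sqrt(2); using reduced masses for a C-H bond
# gives a slightly smaller shift. Masses in atomic mass units.
import math

m_C, m_H, m_D = 12.0, 1.008, 2.014
mu_CH = m_C * m_H / (m_C + m_H)
mu_CD = m_C * m_D / (m_C + m_D)

ratio_reduced_mass = math.sqrt(mu_CH / mu_CD)    # harmonic-oscillator frequency ratio
ratio_naive = 1 / math.sqrt(2)                   # the 1/sqrt(2) rule quoted in the text

print(f"f_CD/f_CH (reduced masses) = {ratio_reduced_mass:.3f}")   # ~0.73
print(f"f_CD/f_CH (1/sqrt 2)       = {ratio_naive:.3f}")          # ~0.71
```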

I have considered earlier a somewhat different quantum model for odour perception, starting from the pioneering experimental work of Callahan [bbio/Callahan], which led him to conclude that in the case of insects odour perception is "seeing" at infrared wavelengths. Infrared wavelengths correspond to vibrational energies of molecules, so that this brings in the dependence on the square root of the inverse of the mass of the odorant and predicts that chemically identical molecules containing only different isotopes of atoms smell different. The frequencies are the same as in the model of Turin. Instead of phonons, IR photons would play the key role, serving as passwords exciting a particular cyclotron state at a particular magnetic flux tube. A similar mechanism could be at work in the case of ordinary vision.

Saturday, November 23, 2013

A new upper bound on the electron's dipole moment as an additional blow against standard SUSY

A further blow against standard SUSY came a couple of weeks ago. The ACME collaboration has deduced a new upper bound on the electric dipole moment of the electron, which is an order of magnitude smaller than the previous one. Jester and Lubos have more detailed commentaries.

The measurement of the dipole moment relies on a simple idea: an electric dipole moment gives rise to additional precession if one has parallel magnetic and electric fields. The additional electric field is now that associated with the molecule containing the electrons: a strong molecular electric field in the direction of the spin quantization axis. One puts the molecules containing the electrons into a magnetic field and measures the precession of the spins by detecting the photons produced in the process. The deviation of the precession frequency from its value in the magnetic field alone should allow one to deduce the upper bound for the dipole moment.

Semiclassically a non-vanishing dipole moment means an asymmetric charge distribution with respect to the spin quantization axis. The electric dipole coupling term for Dirac spinors comes to the effective action from radiative corrections and has the same form as the magnetic dipole coupling involving sigma matrices, except that one has an additional γ_5 matrix bringing in CP breaking. The standard model prediction is of order d_e ≈ 10^-40 e×m_e: this is smaller than the Planck length by a factor 10^-5!
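
For reference, the standard effective operators have the following form (my addition, standard QFT notation up to sign conventions; the extra γ_5 distinguishes the CP-violating electric coupling from the magnetic one):

```latex
\mathcal{L}_{\rm MDM} = -\frac{\mu_e}{2}\,\bar{\psi}\,\sigma^{\mu\nu}\,\psi\,F_{\mu\nu},
\qquad
\mathcal{L}_{\rm EDM} = -\frac{i\,d_e}{2}\,\bar{\psi}\,\sigma^{\mu\nu}\gamma_5\,\psi\,F_{\mu\nu}.
```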

The new upper bound is d_e ≈ 0.87 × 10^-32 e×m_e and still much larger than the standard model prediction. Standard SUSY typically predicts a non-vanishing dipole moment for the electron. The estimate for the electron dipole moment coming from SUSY is, by dimensional considerations, of the form d_e = c ℏ e×m_e/(16π² M²), where c is of order unity and M is the mass scale of the new physics. The Feynman diagram in question involves the decay of the electron to a virtual neutrino and a virtual chargino and the coupling of the latter to a photon before absorption.
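
A rough numerical sketch of what such a bound implies for M (my own arithmetic, using the published ACME bound of 8.7 × 10^-29 e·cm, natural units with ℏ = c = 1, and the coefficient c taken to be of order one in the dimensional formula above):

```python
# A rough sketch (my own arithmetic, natural units hbar = c = 1, c-coefficient ~ 1):
# the new-physics scale M implied by d_e/e ~ m_e/(16 pi^2 M^2) when d_e is at the
# published ACME bound of 8.7e-29 e*cm.
import math

hbar_c = 1.973e-14                  # hbar*c in GeV*cm, converts cm to GeV^-1
m_e = 0.511e-3                      # electron mass in GeV
d_e_over_e = 8.7e-29 / hbar_c       # bound on d_e/e in GeV^-1

M = math.sqrt(m_e / (16 * math.pi ** 2 * d_e_over_e))
print(f"M ~ {M / 1e3:.0f} TeV")     # a few tens of TeV, consistent with "of order 10 TeV at least"
```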

This upper bound provides a strong restriction on "garden variety" SUSY models (involving no fine tuning to make the dipole moment smaller), and the scale at which SUSY could show itself becomes of order 10 TeV at least, so that hopes for detecting SUSY at the LHC should be rather meager. One can of course do fine tuning. The "naturalness" idea does not favor fine tunings but it is not in fashion nowadays: the existing theoretical models simply do not allow such luxury. The huge differences between elementary particle mass scales and the quite "too long" proton lifetime represent basic examples of "non-naturalness" in the GUT framework. For an outsider like me this strongly suggests that although the Higgs exists, the Higgs mechanism provides only a parametrization of particle masses - maybe the only possible theoretical description in a quantum field theory framework treating particles as point-like - and must eventually be replaced with a genuine theory. For instance, Lubos does not see this fine tuning as a reason for worrying too much. Personally I however feel worried, since my old-fashioned view is that theoretical physicists must be able to make predictions rather than only run away from the nasty data by repeatedly updating the models so that they become more and more complicated.

Still about Equivalence Principle

Every time I have written about Equivalence Principle (briefly EP in the following) in the TGD framework I feel that it is finally fully and completely understood. But every time I am asked about EP and TGD, I feel uneasy and end up just asking "Could it be otherwise?". This experience repeated itself when Hamed asked this question in a previous posting.

Recall that EP in its Newtonian form states that gravitational and inertial masses are identical. The freely falling lift is the famous thought experiment leading to the vision of general relativity that gravitation is not a real force and can be eliminated locally in a suitable local coordinate system. The more abstract statement is that particles falling freely in a gravitational field move along geodesic lines. At the level of fields this leads to Einstein's equations stating that the energy momentum tensor is proportional to the Einstein tensor plus a possible cosmological term proportional to the metric. Einstein's equations allow only the identification of gravitational and inertial energy momentum densities but do not allow one to integrate these densities to four-momenta. Basically the problem is that translations are not symmetries anymore, so that Noether's theorem does not help. Hence it is very difficult to find a satisfactory definition of inertial and gravitational four-momenta. This difficulty was the basic motivation of TGD.

In TGD abstract gravitation for four-manifolds is replaced with sub-manifold gravity in M4× CP2, which also has the symmetries of empty Minkowski space, and one overcomes the mentioned problem. It is however far from clear whether one really obtains EP - even at the long length scale limit! There are many questions in the queue waiting for an answer. What does Equivalence Principle mean in TGD? Just motion along geodesics in the absence of non-gravitational forces, or the equivalence of gravitational and inertial masses? How to identify gravitational and inertial masses in the TGD framework? Is it necessary to have both of them? Is gravitational mass something emerging only at the long length scale limit of the theory? Does one obtain Einstein's equations or something more general at this limit - or perhaps quite generally?

What about quantum classical correspondence: are inertial and gravitational masses well-defined and non-tautologically identical at both the quantum and the classical level? Are quantal momenta (super-conformal representations) and classical momenta (Noether charges for Kähler action) identical, or does this apply at least to mass squared operators?

Quantum level

One can start from the fact that TGD is a generalization of string models and has a generalization of super-conformal symmetries as its symmetries. Quantal four-momentum is associated with quantum states - quantum superpositions of 3-surfaces in the TGD framework. For the representations of super-conformal algebras (this includes both Virasoro and Kac-Moody type algebras) four-momentum appears automatically once one has Minkowski space, and now one indeed has M4× CP2. One also obtains the stringy mass formula. This happens also in TGD, and p-adic thermodynamics leads to excellent predictions for elementary particle masses and mass scales with minor input - one piece of input being the five tensor factors in the representations of the super Virasoro algebra. The details are more fuzzy since the five tensor product factors for the super Virasoro algebra is the only constraint, and it has turned out to be possible to imagine many manners to satisfy the constraint. Here a mathematician's helping hand would be extremely welcome.

The basic question is obvious. Is there any need to identify both inertial and gravitational masses at the superconformal level? If so, can one achieve this?

  1. There are two super-conformal algebras involved. The supersymplectic algebra associated with the imbedding space (the boundary of CD) could correspond to inertial four-momentum since it acts at the level of the imbedding space, and the super Kac-Moody algebra associated with light-like 3-surfaces to gravitational four-momentum since its action is at the space-time level.

  2. I have considered the possibility that the so-called coset representation for these algebras could lead to the identification of gravitational and inertial masses. The supersymplectic algebra can be said to contain the Kac-Moody algebra as a sub-algebra, since the isometries of the light-cone boundary and CP2 can be imbedded as a sub-algebra into the super-symplectic algebra. Could inertial and gravitational masses correspond to the four-momenta assignable to these two algebras? The coset representation would by definition identify the inertial and gravitational super-conformal generators. In the case of the scaling generator this would mean the identification of mass squared operators, and in the case of their super counterparts the identification of the four-momenta, since the differences of the super-conformal generators would annihilate physical states. The question whether one really obtains five tensor factors is far from trivial and here it is easy to fall into the sin of self-deception.

    A really cute feature of this approach is that p-adic thermodynamics for the vibrational part of either the gravitational or the inertial scaling generator does not mean breaking of super-conformal invariance, since the super-conformal generators in the coset representation indeed annihilate the states, although this is not the case for the super-symplectic and super Kac-Moody representations separately. Note that a quantum superposition of states with different values of mass squared and even energies makes sense in zero energy ontology.

  3. A second option would combine these two algebras into a larger algebra with a common four-momentum identified as gravitational four-momentum. The fact that this four-momentum does not follow from a quantal version of Noether's theorem suggests the interpretation as gravitational momentum. In this case the simplest manner to understand the five tensor factors of the conformal algebra would be by assigning them to the color group SU(3), the electroweak group SU(2)_L × U(1) (2 factors), and the symplectic groups of CP2 and of the light-cone boundary δM4+.

    While writing a response to the question of Hamed the following question popped up. Could it be that the classical four-momentum assignable to Kähler action using Noether's theorem defines the inertial four-momentum equal to the gravitational four-momentum identified as the four-momentum assignable to the super-conformal representation for the latter option? Gravitational four-momentum would certainly correspond naturally to the super-conformal algebra just as in string models. The identification of classical and quantal four-momenta might make sense since translations form an Abelian algebra, or more generally a Cartan sub-algebra of the product of Poincare and color groups. An even weaker identification would be the identification of inertial and conformal mass squared and color Casimir operators. EP would reduce to quantum classical correspondence, and General Coordinate Invariance (GCI) would force the classical theory to be an exact part of the quantum theory! This would be elegant and minimize the number of conjectures but could of course be wrong.

    One can argue that p-adic thermodynamics for the vibrational part of the total scaling generator (essentially the mass squared or the conformal weight defining it) breaks conformal invariance badly. This objection might actually kill this option.

Classical four-momentum and classical realization of EP

In the classical case the situation is actually more complex than in the quantum situation due to the extreme non-linearity of Kähler action, which makes naive canonical quantization impossible even in principle, so that quantal counterparts of classical Noether charges do not exist. One can however argue that quantum classical correspondence applies in the case of the Cartan algebra. Four-momenta and color quantum numbers indeed define one possible Cartan sub-algebra of isometries.

By Noether's theorem Kähler action gives rise to inertial four-momentum as classical conserved charges assignable to translations of M4 (rather than of the space-time surface). Classical four-momentum is always assignable to a 3-surface and its components are in one-to-one correspondence with Minkowski coordinates. It can be regarded as an M4 vector and thus also as an imbedding space vector.

Quantum classical correspondence requires that the Noetherian four-momentum equals the conformal four-momentum. This holds irrespective of whether EP reduces to quantum classical correspondence or not.

Einstein's equations have however been successful. This forces one to ask whether the classical field equations for preferred extremals could imply that the inertial four-momentum density defined by Kähler action is expressible as a superposition of terms corresponding to the Einstein tensor and a cosmological term or its generalization. If EP reduces to quantum classical correspondence, one could say that not only quantum physics but also quantum classical correspondence is represented at the level of sub-manifold geometry.

In fact, exactly the same argument that led Einstein to his equations applies now. Einstein argued that the energy momentum tensor has a vanishing covariant divergence: Einstein's equations are the generic manner to satisfy this condition. Exactly the same condition can be posed in sub-manifold gravity.

  1. The condition that the energy momentum tensor associated with Kähler action has a vanishing covariant divergence is satisfied for the known preferred extremals. Physically it states the vanishing of the Lorentz 4-force associated with the induced Kähler form defining a Maxwell field. The sum of the electric and magnetic forces vanishes and the forces do not perform work. In the static case the equations reduce to Beltrami equations stating that the curl of the magnetic field is parallel to the magnetic field. These equations are topologically highly interesting.

    These conditions are satisfied if Einstein's equations with a cosmological term hold true for the energy momentum tensor of Kähler action. The vanishing of the trace of the Kähler energy momentum tensor implies that the curvature scalar is constant and expressible in terms of the cosmological constant (a one-line trace computation is given right after this list). If the cosmological constant vanishes, the curvature scalar vanishes (the Reissner-Nordström metric is an example of this situation, and CP2 defines a Euclidian metric with this property). Thus the preferred extremals would correspond to an extremely restricted subset of abstract 4-geometries.

  2. A more general possibility - which can be considered only in sub-manifold gravity - is that the cosmological term is replaced by a combination of several projection operators to the space-time surface instead of the metric alone. The coefficients could even depend on position while satisfying consistency conditions guaranteeing that the energy momentum tensor is divergenceless. For instance, covariantly constant projection operators with constant coefficients can be considered.
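
A one-line check of the trace statement in point 1 above (my addition, standard conventions with G the Einstein tensor in four dimensions): for a traceless Kähler energy momentum tensor, taking the trace of Einstein's equations with a cosmological term forces the curvature scalar to be the constant 4Λ, and R = 0 when Λ = 0.

```latex
G_{\alpha\beta} + \Lambda\, g_{\alpha\beta} = \kappa\, T_{\alpha\beta},
\qquad
g^{\alpha\beta} T_{\alpha\beta} = 0
\;\Longrightarrow\;
-R + 4\Lambda = 0
\;\Longrightarrow\;
R = 4\Lambda .
```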

If this picture is correct (do not forget the objection from p-adic thermodynamics), what remains is the question whether quantum classical correspondence is true for four-momenta or at least for mass squared and color Casimir operators. Technically the situation would be the same for both interpretations of EP. Basically the question is whether inertial-gravitational equivalence and quantal-classical equivalence are one and the same thing or not.

Defending intuition

What is frustrating is that the field equations of TGD are so incredibly non-linear that an intuitive approach based on wild guesses and genuine thinking, as opposed to blind application of calculational rules, is the only working approach. I know quite well that for many colleagues intuitive thinking is the deadliest sin of all deadly sins. For them the ideal of a theoretical physicist is a brainless monkey who has got the Rules. Certainly the intuitive approach allows one only to develop conjectures, which hopefully can be proven right or wrong, and intuition can lead to a wrong path unless it is accompanied by a critical attitude.

I am however an optimist. We know that it has been possible to develop perturbation theory for the supersymmetric version of the Einstein action by using the twistor Grassmann approach. A stringy variant of this approach with massless fermions as fundamental particles suggests itself in TGD. The TGD Universe possesses huge symmetries and this should make also the classical theory simple: I have indeed made a proposal about how to construct general solutions of the field equations in terms of what I call the Hamilton-Jacobi structure. For these and many other reasons I continue to believe in the power of intuition.

Monday, November 18, 2013

Old web page address ceased to work again: situation should change in January!


The old homepage address has ceased to work again. As I have told, I learned too late that the web hotel owner is a criminal. It is quite possible that he receives "encouragement" from some Finnish academic people who have done during these 35 years all they can to silence me. Thinking in a novel way in Finland is a really dangerous activity! It turned out impossible to get any contact with this fellow to get the right to forward visitors from the old address to the new one (which by the way differs from the old one only by the replacement of ".com" with ".fi").

I am sorry for inconvenience. The situation should change in January.

Sunday, November 17, 2013

Constant torque as a manner to force a phase transition increasing the value of Planck constant

The challenge is to identify physical mechanisms forcing the increase of the effective Planck constant heff (whether to call it effective or not is to some extent a matter of taste). Work with certain potential applications of TGD led to the discovery of a new mechanism possibly achieving this. The method would be simple: apply a constant torque to a rotating system. I will leave it to the reader to rediscover how this can be achieved. It turns out that the considerations lead to considerable insights about how large heff phases are generated in living matter.

Could constant torque force the increase of heff?

Consider a rigid body allowed to rotate around some axis so that its state is characterized by a rotation angle φ. Assume that a constant torque τ is applied to the system.

  1. The classical equations of motion are

    I d²φ/dt² = τ .

    This is true in the idealization of the body as a point particle characterized by its moment of inertia around the axis of rotation. The equations of motion are obtained from the variational principle

    S = ∫ L dt ,  L = I (dφ/dt)²/2 - V(φ) ,  V(φ) = τφ .

    Here φ denotes the rotational angle. The mathematical problem is that the potential function V(φ) is either many-valued or discontinuous at φ = 2π.


  2. Quantum mechanically the system corresponds to a Schrödinger equation

    - (ℏ²/2I) ∂²Ψ/∂φ² + τφ Ψ = -i ∂Ψ/∂t .

    In stationary situation one has

    - (ℏ²/2I) ∂²Ψ/∂φ² + τφ Ψ = EΨ .

  3. The wave function is expected to be continuous at φ = 2π. The discontinuity of the potential at φ = φ0 poses further strong conditions on the solutions: Ψ should vanish in a region containing the point φ0. Note that the value of φ0 can be chosen freely.

    The intuitive picture is that the solutions correspond to strongly localized wave packets in accelerating motion. The wave packet can for some time vanish in the region containing the point φ0. What happens when this condition does not hold anymore?

    • Dissipation is present in the system and therefore also state function reductions. Could a state function reduction occur when the wave packet reaches the point where V(φ) is discontinuous?

    • Or are the solutions well-defined only in a space-time region with a finite temporal extent T? In zero energy ontology (ZEO) this option is automatically realized since space-time sheets are restricted inside causal diamonds (CDs). Wave functions need to be well-defined only inside the CD involved and would vanish at φ0. Therefore the mathematical problems related to the representation of accelerating wave packets in non-compact degrees of freedom could serve as a motivation for both CDs and ZEO.

    There is however still a problem. The wave packet cannot be in accelerating motion even for a single full turn. More turns are wanted. Should one give up the assumption that the wave function is continuous at φ = φ0 + 2π and allow wave functions to be multivalued, satisfying the continuity condition Ψ(φ0) = Ψ(φ0 + n2π), where n is some sufficiently large integer? This would mean the replacement of the configuration space (now a circle) with its n-fold covering.

The introduction of the n-fold covering leads naturally to the hierarchy of Planck constants.

  1. A natural question is whether the constant torque τ could affect the system so that φ = 0 and φ = 2π do not represent physically equivalent configurations anymore. Could it however happen that φ = 0 and φ = n2π for some value of n are still equivalent? One would have the analog of a many-sheeted Riemann surface.

  2. In the TGD framework 3-surfaces can indeed be analogous to n-sheeted Riemann surfaces. In other words, a rotation by 2π does not produce the original surface but one needs an n2π rotation to achieve this. In fact, heff/h = n corresponds to this situation geometrically! Space-time itself becomes an n-sheeted covering of itself: this property must be distinguished from many-sheetedness. Could constant torque provide a manner to force a situation making space-time n-sheeted and thus to create phases with a large value of heff?

  3. The Schrödinger amplitude representing an accelerated wave packet as a wave function in the n-fold covering would be n-valued in the ordinary Minkowski coordinates and would satisfy the boundary condition

    Ψ(φ)= Ψ(φ+ n2π) .

    Since V(φ) is not rotationally invariant this condition is too strong for stationary solutions.

  4. This condition would mean Fourier analysis using the exponentials exp(imφ/n) with time dependent coefficients c_m(t) whose time evolution is dictated by the Schrödinger equation. For the ordinary Planck constant this would mean fractional values of angular momentum

    L_z = (m/n) ℏ .

    If one has heff = nℏ, the spectrum of L_z is not affected. It would seem that constant torque forces the generation of a phase with a large value of heff! From an estimate of how many turns the system rotates one can estimate the value of heff.

What about stationary solutions?

Giving up stationarity seems the only option on the basis of classical intuition. One can however ask whether also stationary solutions could make sense mathematically and could make possible completely new quantum phenomena.

  1. In the stationary situation the boundary condition must be weakened to

    Ψ(φ0)= Ψ(φ0+ n2π) .

    Here the choice of φ0 characterizes the solution. This condition quantizes the energy. Normally only the value n=1 is possible.

  2. The many-valuedness/discontinuity of V(φ) does not produce problems if the condition

    Ψ(φ0, t) = Ψ(φ0 + n2π, t) = 0 ,  0 < t < T

    is satisfied. The Schrödinger equation would be continuous at φ = φ0 + n2π. The values of φ0 would correspond to a continuous state basis.

  3. One would have two boundary conditions expected to fix the solution completely for given values of n and φ0. The solutions corresponding to different values of φ0 are not related by a rotation since V(φ) is not invariant under rotations. One obtains an infinite number of continuous solution families labelled by n, and they correspond to different phases if heff is different for them.

The connection with WKB approximation and Airy functions

The stationary Schrödinger equation with a constant force appears in the WKB approximation and follows from a linearization of the potential function at a non-stationary point. A good example is the Schrödinger equation for a particle in the gravitational field of the Earth. The solutions of this equation are Airy functions, which appear also in the electrodynamical model for the rainbow.

  1. The standard form of the Schrödinger equation in the stationary case is obtained using the following change of variables

    u + e = kφ ,  k³ = 2τI/ℏ² ,  e = 2IE/(ℏ²k²) .

    One obtains Airy equation

    d²Ψ/du² - uΨ = 0 .

    The energy eigenvalue does not appear explicitly in the equation. The boundary conditions transform to

    Ψ(u0 + n2πk) = Ψ(u0) = 0 .

  2. In the non-stationary case the change of variables is

    u = kφ ,  k³ = 2τI/ℏ² ,  v = (ℏ²k²/2I) t .

    One obtains

    d²Ψ/du² - uΨ = i ∂Ψ/∂v .

    Boundary conditions are

    Ψ(u + n2πk, v) = Ψ(u, v) ,  0 ≤ v ≤ (ℏ²k²/2I) T .

An interesting question is what heff = n×h means here. Should one replace h with heff = nh, as the condition that the spectrum of angular momentum remains unchanged requires? One would have k ∝ n^{-2/3} and e ∝ n^{4/3}. One would obtain boundary conditions non-linear with respect to n.
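
A minimal numerical sketch of the stationary problem (my own, not from the text; k = 1 and φ0 = 0 are arbitrary choices of units and origin, and n = 3 is an illustrative number of sheets): the general solution a Ai(u) + b Bi(u) satisfies Ψ(u0) = Ψ(u0 + n2πk) = 0 for a nontrivial (a, b) only when a 2×2 boundary determinant vanishes, which quantizes the scaled energy e = 2IE/(ℏ²k²).

```python
# A minimal numerical sketch (my own, not from the text): stationary solutions
# psi(u) = a*Ai(u) + b*Bi(u) of d^2psi/du^2 - u*psi = 0 with the boundary conditions
# psi(u0) = psi(u0 + n*2*pi*k) = 0. Units with k = 1 and the choice phi0 = 0 (so that
# u0 = -e) are assumptions; n = 3 is an illustrative number of sheets.
import numpy as np
from scipy.special import airy
from scipy.optimize import brentq

n, k = 3, 1.0
L = n * 2 * np.pi * k                  # length of the allowed u-interval

def boundary_det(e):
    """Vanishes when some a*Ai + b*Bi satisfies both boundary conditions."""
    u0, u1 = -e, -e + L
    Ai0, _, Bi0, _ = airy(u0)
    Ai1, _, Bi1, _ = airy(u1)
    return Ai0 * Bi1 - Ai1 * Bi0

# bracket sign changes of the determinant to locate the lowest scaled eigenvalues
es = np.linspace(0.0, 30.0, 3001)
vals = [boundary_det(e) for e in es]
roots = [brentq(boundary_det, es[i], es[i + 1])
         for i in range(len(es) - 1) if vals[i] * vals[i + 1] < 0]

# for this long interval the lowest eigenvalues lie close to the zeros of Ai(-e)
print("lowest scaled eigenvalues e = 2IE/(hbar^2 k^2):", [round(r, 3) for r in roots[:5]])
```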

Connection with living matter

A constant torque - or more generally a non-oscillatory generalized force in some compact degrees of freedom - requires a continual energy feed into the system. Continual energy feed serves as a basic condition for self-organization and for the evolution of the states studied in non-equilibrium thermodynamics. Biology represents a fundamental example of this kind of situation. The energy fed to the system represents metabolic energy, and the ADP-ATP process loads this energy into ATP molecules. Also now a constant torque is involved: the ATP synthase molecule contains the analog of a generator with a rotating shaft. Since metabolism and the generation of large heff phases are very closely related in the TGD Universe, the natural proposal is that the rotating shaft forces the generation of large heff phases.

For details and background see the chapter "Macroscopic quantum coherence and quantum metabolism as different sides of the same coin: part II" of "Biosystems as Conscious Holograms".

Addition: The old homepage address has ceased to work again. As I have told, I learned too late that the web hotel owner is a criminal. It is quite possible that he receives "encouragement" from some Finnish academic people who have done during these 35 years all they can to silence me. It turned out impossible to get any contact with this fellow to get the right to forward visitors from the old address to the new one (which by the way differs from the old one only by the replacement of ".com" with ".fi"). The situation should change in January. I am sorry for the inconvenience. Thinking in a novel way in Finland is a really dangerous activity!