Thursday, March 31, 2016

Why Mersenne primes are so special?

Mersenne primes are central in the TGD based world view. p-Adic thermodynamics combined with the p-adic length scale hypothesis, stating that primes near powers of two are physically preferred, provides a nice understanding of the elementary particle mass spectrum. Mersenne primes M_k = 2^k - 1, where also k must be prime, seem to be preferred. A Mersenne prime labels the hadronic mass scale (there is now evidence from LHC for two new hadron physics labelled by a Mersenne and a Gaussian Mersenne) and the weak mass scale. Also electron and tau lepton are labelled by Mersenne primes. Also Gaussian Mersennes M_G,k = (1+i)^k - 1 seem to be important. Muon is labelled by a Gaussian Mersenne, and the range of length scales between cell membrane thickness and the size of the cell nucleus contains four Gaussian Mersennes!

What gives Mersenne primes such a special physical status in the TGD Universe? I have considered this problem many times over the years. The key idea is that natural selection is realized in a much more general sense than usually thought, and has chosen them and the corresponding p-adic length scales. Particles characterized by these p-adic length scales should be stable in some well-defined sense.

Since evolution in TGD corresponds to the generation of information, the obvious guess is that Mersenne primes are information theoretically special. Could the fact that 2^k - 1 represents almost k bits be of significance? Or could Mersenne primes characterize systems which are information theoretically especially stable?

In the following a more refined TGD inspired quantum information theoretic argument is discussed. It is based on the stability of entanglement against state function reduction, which would be a fundamental process governed by Negentropy Maximization Principle (NMP) and requiring no human observer.

How to achieve stability against state function reductions?

TGD actually provides several ideas about how to achieve stability against state function reductions. This stability would of course be a marvellous fact from the point of view of quantum computation since it would make stable quantum information storage possible. Also living systems could apply this kind of storage mechanism.

  1. p-Adic physics leads to the notion of negentropic entanglement (NE) for which the number theoretic entanglement entropy is negative and thus measures genuine, possibly conscious information assignable to the entanglement (ordinary entanglement entropy measures the lack of information about the state of either entangled system). NMP favors the generation of NE. NE can however be transferred from one system to another (stolen, to use a less diplomatically correct expression!), and this kind of transfer is associated with metabolism. This kind of transfer would be the most fundamental crime: biology would be basically criminal activity! A religious thinker might talk about original sin.

    In living matter NE would make information storage possible. In fact, TGD inspired theory of consciousness, constructed as a generalization of quantum measurement theory in Zero Energy Ontology (ZEO), identifies the permanent self of a living system (replaced with a more negentropic one in biological death, which is also a reincarnation) as the boundary of CD, which is not affected in subsequent state function reductions and carries NE. The changing part of self - sensory input and cognition - can be assigned with the opposite, changing boundary of CD.

  2. Also number theoretic stability can be considered. Suppose that one can assign to the system some extension of algebraic numbers characterizing the WCW coordinates ("world of classical worlds") parametrizing the space-time surface associated with it (by strong form of holography (SH), the string world sheets and partonic 2-surfaces continuable to a 4-D preferred extremal).

    This extension of rationals and the corresponding algebraic extensions of p-adic numbers would define the number fields serving as the coefficient fields of Hilbert spaces (it might be necessary to assume that the coefficients belong to the extension of rationals also in the p-adic sector although they can be regarded as p-adic numbers). Assume that you have an entangled system with entanglement coefficients in this number field. Suppose you want to diagonalize the corresponding density matrix. The eigenvalues belong in the general case to a larger algebraic extension since they correspond to roots of the characteristic polynomial assignable to the density matrix. Could one say that this kind of entanglement is stable (at least to some degree) against state function reduction, since reduction would mean going to an eigenstate which does not belong to the extension used? Reader can decide! (A small numerical illustration is given after this list.)

  3. Hilbert spaces are like natural numbers with respect to direct sum and tensor product. The dimension of a tensor product is the product mn of the dimensions of the tensor factors. A Hilbert space with dimension n can be decomposed to a tensor product of prime Hilbert spaces with dimensions which are the prime factors of n. In TGD Universe state function reduction is a dynamical process, which implies that the states in state spaces with prime-valued dimension are stable against state function reduction since one cannot even speak about tensor product decomposition, entanglement, or reduction of entanglement. These state spaces are quantum indecomposable and would thus be ideal for the storage of quantum information!

    Interestingly, a system consisting of k qubits has Hilbert space dimension D = 2^k and is thus maximally unstable against decomposition to D = 2-dimensional tensor factors! In TGD Universe NE might save the situation. Could one imagine a situation in which a Hilbert space with dimension M_k = 2^k - 1 stores the information stably? When information is processed this state space would be mapped isometrically to the 2^k-dimensional state space making possible quantum computations using qubits. The outcome of the state function reduction halting the computation would be mapped isometrically back to the M_k-dimensional space. Note that isometric maps generalizing unitary transformations are an essential element in the proposal for the tensor net realization of holography and error correcting codes (see this).

    Can one imagine any concrete realization for this idea? This question will be considered in the sequel.
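The point of item 2 above can be made concrete with a few lines of computer algebra. The following sketch is my own illustration, not part of the original argument: the entanglement coefficients are rational, yet the eigenvalues of the reduced density matrix already require the extension Q(sqrt(5)).

    from sympy import Matrix

    # Unnormalized entanglement coefficients c_ij of |psi> = |00> + |01> + |10>,
    # all rational (the extension of rationals is trivial here).
    C = Matrix([[1, 1],
                [1, 0]])

    # Reduced density matrix of the first qubit: rho = C C^T / Tr(C C^T)
    # (the coefficients are real, so C^dagger = C^T).
    rho = C * C.T
    rho = rho / rho.trace()
    print(rho)               # Matrix([[2/3, 1/3], [1/3, 1/3]]): entries still rational

    # The eigenvalues are (3 +- sqrt(5))/6: they live in Q(sqrt(5)), not in Q, so the
    # eigenstates of the density matrix lie outside the original number field.
    print(rho.eigenvals())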
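Similarly, the tensor decomposability discussed in item 3 can be read off from the prime factorization of the Hilbert space dimension. A small script of my own, again only an illustration of the counting: k qubits give dimension 2^k with k two-dimensional factors, whereas a Mersenne prime dimension 2^k - 1 admits no tensor factorization at all. Note that primality of k is necessary but not sufficient, as k = 11 shows.

    from sympy import factorint, isprime

    for k in [2, 3, 5, 7, 11, 13]:
        D_qubit = 2**k                # dimension of k qubits: always factorizes
        M_k = 2**k - 1                # candidate Mersenne dimension
        print(f"k={k}: 2^k = {D_qubit} factors as {factorint(D_qubit)}, "
              f"2^k - 1 = {M_k}, prime: {isprime(M_k)}")
    # k = 11 gives 2047 = 23*89, so that state space still decomposes;
    # k = 2, 3, 5, 7, 13 give genuine Mersenne primes and indecomposable state spaces.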

How to realize the M_k = 2^k - 1-dimensional Hilbert space physically?

One can imagine at least three physical realizations of the M_k = 2^k - 1-dimensional Hilbert space.

  1. A set with k elements has 2^k subsets. One of them is the empty set and cannot be physically realized. Here the reader might of course argue that if subsets are realized as boxes, the empty set can be realized as an empty box. If the empty set has no physical realization, the wave functions in the set of non-empty subsets with 2^k - 1 elements define a 2^k - 1-dimensional Hilbert space. If 2^k - 1 is a Mersenne prime, this state space is stable against state function reductions since one cannot even speak about entanglement!

    To make quantum computation possible one must map this state space to the 2^k-dimensional state space by an isometric imbedding. This is possible by just adding a new element to the set and considering only wave functions in the set of subsets containing this new element. Now also the empty set is mapped to a set containing only this new element and thus belongs to the state space. One has 2^k dimensions and quantum computations are possible. When the computation halts, one just removes this new element from the system, and the data are stored stably! (A small combinatorial sketch of this imbedding is given after this list.)

  2. The second realization relies on k bits represented as spins such that 2^k - 1 is a Mersenne prime. Suppose that the ground state is a spontaneously magnetized state with k+l parallel spins, with the l spins in the direction of the spontaneous magnetization and stabilizing it. l > 1 is probably needed to stabilize the direction of magnetization: l ≤ k suggests itself as the first guess. Here thermodynamics and a model for the spin-spin interaction would give a better estimate.

    The state with the k spins in the direction opposite to that of the l spins would be analogous to the empty set. Spontaneous magnetization disappears when a sufficient number of spins is in the direction opposite to that of the magnetization. Suppose that k corresponds to a critical number of spins in the sense that spontaneous magnetization occurs for this number of parallel spins. Quantum superpositions of the 2^k - 1 remaining states of the k spins would be stable against state function reduction also now.

    The transformation of the data to a processable form would require an addition of m ≥ 1 spins in the direction of the magnetization to guarantee that the state with all k spins in the direction opposite to the spontaneous magnetization does not induce spontaneous magnetization in the opposite direction. Note that these additional stabilizing spins are classical and their direction could be kept fixed by repeated state function reductions (Zeno effect). One would clearly have a critical system.

  3. The third realization is suggested by the TGD inspired view about Boolean consciousness. Boolean logic is represented by the Fock state basis of many-fermion states. Each fermion mode defines one bit: the fermion in a given mode is present or not. One obtains 2^k states. These states have different fermion numbers, and in ordinary positive energy ontology their superposition is not possible.

    In ZEO the situation changes. Fermionic zero energy states are superpositions of pairs of states at the opposite boundaries of CD such that the total quantum numbers are opposite. This applies to fermion number too. This allows time-like entanglement in which one has a superposition of states for which the fermion numbers at a given boundary are different. This kind of states might be realized for superconductors, to which one at least formally assigns a coherent state of Cooper pairs having ill-defined fermion number.

    Now the non-realizable state would correspond to the fermion vacuum analogous to the empty set. Reader can of course argue that the bosonic degrees of freedom assignable to the space-time surface are still present. I defend this idea by saying that the purely bosonic state might be unstable or maybe even non-realizable as a vacuum state, and remind that also bosons in the TGD framework consist of pairs of fundamental fermions.

    If this state is effectively decoupled from the rest of the Universe, one has a 2^k - 1-dimensional state space and the states are stable against state function reduction. Information processing becomes possible by adding some positive energy fermions and corresponding negative energy fermions at the opposite boundaries of CD. Note that the added fermions do not have time-like quantum entanglement and do not change spin direction during time evolution.

    The proposal is that Boolean consciousness is realized in this manner and zero energy states represent quantum Boolean thoughts as superpositions of pairs (b1⊗ b2) of positive and negative energy states having an identification as Boolean statements b1→ b2. The mechanism would allow both the storage of thoughts as memories and their processing by introducing the additional fermions.
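A small combinatorial sketch of the first realization (my own illustration; the choice k = 5 and the labels are arbitrary): the non-empty subsets give the 2^k - 1 stable storage dimensions, and adjoining a new element realizes the isometric imbedding into the 2^k-dimensional processing space as a relabelling of basis vectors.

    from itertools import combinations

    def subsets(elements):
        """All subsets of the given collection, as frozensets."""
        els = list(elements)
        return [frozenset(c) for r in range(len(els) + 1) for c in combinations(els, r)]

    k = 5                                    # 2^5 - 1 = 31 is a Mersenne prime
    storage_basis = [s for s in subsets(range(1, k + 1)) if s]     # non-empty subsets
    print(len(storage_basis))                # 31 = 2^k - 1 stable storage dimensions

    new_element = k + 1
    imbedding = {s: s | {new_element} for s in storage_basis}      # S -> S union {k+1}
    processing_basis = [s for s in subsets(range(1, k + 2)) if new_element in s]
    print(len(processing_basis))             # 32 = 2^k dimensions available for processing

    # The imbedding maps basis vectors to distinct basis vectors (hence is isometric)
    # and misses exactly one of them, {k+1}, the image of the unrealizable empty set.
    assert len(set(imbedding.values())) == len(storage_basis)
    print(set(processing_basis) - set(imbedding.values()))         # {frozenset({6})}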

So: why would Mersenne primes be so special?

Returning to the original question "Why Mersenne primes are so special?": a possible explanation is that an elementary particle or hadron characterized by a p-adic length scale p = M_k = 2^k - 1 both stores and processes information with maximal effectiveness. This would not be surprising if p-adic physics defines the physical correlates of cognition, assumed to be universal rather than being restricted to the human brain.

In adelic physics a p-dimensional Hilbert space could be naturally associated with the p-adic adelic sector of the system. Information storage could take place in the p = M_k = 2^k - 1 phase, and information processing (cognition) would take place in the 2^k-dimensional state space. This state space would be reached in a phase transition p = 2^k - 1 → 2 changing the effective p-adic topology in the real sector and the genuine p-adic topology in the p-adic sector, and replacing the p-adic length scale ∝ p^(1/2) ≈ 2^(k/2) with the k-nary 2-adic length scale ∝ 2^(k/2).

Electron is characterized by the largest Mersenne prime M127 not corresponding to a completely super-astrophysical p-adic length scale, and corresponds to k = 127 bits. Intriguingly, the secondary p-adic time scale of electron corresponds to 0.1 seconds, defining the fundamental 10 Hz biorhythm.
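The 0.1 second claim can be checked with a back-of-the-envelope estimate. The sketch below is my own and assumes that the p-adic length scale L(127) is essentially the electron Compton length h/(m_e c); the secondary time scale is then sqrt(M_127) × L(127)/c.

    h   = 6.626e-34      # Planck constant, J s
    m_e = 9.109e-31      # electron mass, kg
    c   = 2.998e8        # speed of light, m/s

    L_127 = h / (m_e * c)                   # Compton length, about 2.43e-12 m
    T_2   = (2**127)**0.5 * L_127 / c       # sqrt(M_127) is essentially 2^(127/2)
    print(f"L(127) = {L_127:.2e} m, T_2(127) = {T_2:.3f} s")    # about 0.11 s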

This proposal suffers from deficiencies. It does not explain why also Gaussian Mersennes are special. Gaussian Mersennes correspond to ordinary primes near a power of 2 but not as near as Mersenne primes are. Neither does it explain why also more general primes p ≈ 2^k seem to be preferred. Furthermore, the p-adic length scale hypothesis generalizes and states that primes near powers of at least small primes q, p ≈ q^k, are special at least number theoretically. For instance, q = 3 seems to be important for music experience and also q = 5 might be important (Golden Mean).

Could the proposed model relying on criticality generalize? There would be a p < 2^k-dimensional state space allowing an isometric imbedding to the 2^k-dimensional space such that the bit configurations orthogonal to the image would be unstable in some sense, say against a phase transition changing the direction of magnetization. One can imagine variants of the above described mechanism also now. For q > 2 one should consider pinary digits instead of bits but the same arguments would apply (except in the case of Boolean logic).
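The generalized hypothesis p ≈ q^k can at least be probed numerically. The following small script is my own illustration, with arbitrarily chosen values of k: it lists the prime nearest to q^k for q = 2, 3, 5 together with its relative offset.

    from sympy import nextprime, prevprime

    for q in (2, 3, 5):
        for k in (7, 13, 17):
            target = q**k
            p = min(prevprime(target), nextprime(target), key=lambda c: abs(c - target))
            print(f"q={q}, k={k}: q^k = {target}, nearest prime {p}, "
                  f"relative offset {abs(p - target)/target:.1e}")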

For a summary of earlier postings see Links to the latest progress in TGD.

See the chapter Unified Number Theoretic Vision of "Physics as Generalized Number Theory" or the article Why Mersenne primes are so special?.

Monday, March 28, 2016

Evidence for rho or omega meson of M89 hadron physics

Evidence for M89 hadron physics is accumulating rapidly. I am grateful to Lubos for keeping book about the bumps: this helps enormously. In the latest posting I told about evidence for Z' a la TGD and indications for M89 J/Psi, which is a vector meson. Now Lubos tells about an excess, which could have an interpretation as the lightest M89 vector meson - ρ89 or ω89. The mass is predicted correctly with 5 per cent accuracy by the familiar p-adic scaling argument: multiply the mass of the ordinary meson by 512.

Physics is sometimes simple, but this does not mean that it is numerology - as a simple minded colleague, who prefers ultraheavy numerics instead of imaginative thinking, might argue: deep principles distilled through a work of 38 years are behind this simple rule.

This 375 GeV excess might indeed represent the lightest vector meson of M89 hadron physics. ρ and ω of standard hadron physics have mass 775 MeV and the scaled up mass is about 397 GeV, which is about 5 per cent heavier than the mass of the Zγ excess.

The decay ρ → Z+γ is describable at quark level via a quark exchange diagram involving the emission of Z and γ. The effective action would be proportional to Tr(ρ*γ*Z), where the product and trace are for the antisymmetric field tensors. This kind of effective action should describe also the decay to a gamma pair. By angular momentum conservation the photons of the gamma pairs should be in a relative L=1 state. Since Z is relativistic, L=1 is expected to be favored also for the Z+γ final state. A professional could immediately tell whether this is the correct view.
A similar argument applies to the decay of ω, which is an isospin singlet. For charged ρ also decays to Wγ and WZ are possible. Note that the next lightest vector meson would be K* with mass 892 MeV. Its M89 counterpart should have mass 457 GeV.
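The arithmetic behind these estimates is just the p-adic scaling factor 2^((107-89)/2) = 512 applied to the masses of the ordinary mesons. For the record, a small script of my own with standard meson masses as input:

    scale = 2 ** ((107 - 89) / 2)           # = 512

    mesons_GeV = {                          # ordinary (M107) hadron physics masses
        "rho/omega": 0.775,
        "K*":        0.892,
        "eta":       0.5478,
        "J/psi":     3.097,
    }
    for name, m in mesons_GeV.items():
        print(f"{name}: {m:.3f} GeV -> M89 estimate {scale * m:.1f} GeV")
    # rho/omega -> about 397 GeV (cf. the 375 GeV excess), K* -> about 457 GeV,
    # eta -> about 280 GeV, J/psi -> about 1586 GeV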

For some reason also Lubos has got interested in powers of two and notices that 375 GeV is 1/2 of the famous 750 GeV. The p-adic length scale hypothesis might allow one to understand these factors if octaves or even half octaves of particles are realized.

See the article Indications for the new physics predicted by TGD and chapter New Particle Physics Predicted by TGD: Part I of "p-Adic Physics".

For a summary of earlier postings see Links to the latest progress in TGD.

Sunday, March 27, 2016

Tetrahedral equation of Zamolodchikov

I encountered in Facebook a link to a very interesting article by John Baez telling about the tetrahedral equation of Zamolodchikov - Zamolodchikov is one of the founders of conformal field theories. I should have been well aware of this equation, already because I worked some time ago with the question how non-associativity and language might emerge from fundamental physics [A(BC) is different from (AB)C: language is an excellent example]. The illustrations in Baez's article are necessary to get some idea about what is involved and I strongly recommend them.

From the illustrations in Baez's text one learns that the third Reidemeister move corresponds to a deformation of a 2-D surface in 4-D space-time. Physicists talk about the Yang-Baxter equation (YBE): the move does nothing to the topology, and YBE states that it does nothing to the quantum state either.

One can however assume that "doing nothing" is replaced with what is called a 2-morphism. "A kind of gauge transformation takes place" would be the physicist's first attempt to assign to this something familiar. The outcome is unitarily equivalent with the original but not the same anymore. This actually requires a generalization of the notion of group to quantum group. Braid statistics emerges: the exchange of braid strands brings in a phase or even a non-commutative operation on the two-braid state.

The tetrahedral equation generalizes the "Yang-Baxterator" so that it is not an identity anymore but becomes what is called a 2-morphism. One however obtains an identity for two different combinations of 4 Reidemeister moves performed for 4 strands instead of 3. To make things really complicated one could give up also this identity and consider the next level in the hierarchy.

What makes this so interesting is that in the 4-D context of TGD also 2-knots formed by 2-D objects (such as string world sheets and partonic 2-surfaces) in 4-D space-time become possible. Quite generally: D-2 dimensional things get knotted in D dimensions. I have proposed that 2-knots could be crucial for information processing in living matter. Knots and braids would represent information such as topological quantum computer programs, and 2-knots would represent information processing such as the development of these programs.

In TGD one would have something much more non-trivial than Reidemeister moves. The ordinary knots could really change in the operations represented by 2-knots, unlike under Reidemeister moves. 2-knots/2-braids could represent genuine modifications of 1-knots since the reconnections, at which knot strands can go through each other, could open the knot partially or make it more complex (remember what Alexander the Great did to open the Gordian knot). The process of forming a knot invariant means gradual opening of the knot in a systematic stepwise manner. This kind of process could take place in 4-D and be represented by a string world sheet, and the corresponding evolution of the quantum state in Zero Energy Ontology (ZEO) would represent the opening of the knot.

One of the basic questions in consciousness theory is whether problem solving could have a universal physical or topological counterpart. A crazy question: could the opening of a 1-knot - a process defining a 2-knot - serve as the topological counterpart of problem solving and give rise to its quantal counterpart in ZEO? Or could Reidemeister moves transforming a trivial knot to manifestly trivial form correspond to problem solving? It would seem that the Alexandrian manner to solve problems is what happens in the real world;-).

What about higher-D knots? 4-D space-time surfaces can get knotted in 6-D space-times. If the twistorialization of TGD by lifting space-time surfaces to 6-D surfaces in the product of the twistor spaces of Minkowski space and CP2 makes sense, then space-time surfaces have representations as 4-surfaces in this 6-D twistor space. Could space-time surfaces get 4-knotted in twistor space? If so, the poor space-time surface - the classical world - would be in a really difficult situation!;-). By the way, also light-like 3-surfaces representing parton orbits could get knotted at the 5-D boundaries of 6-D twistor space regions assignable to space-time regions with Euclidian or Minkowskian signature!

For a summary of earlier postings see Links to the latest progress in TGD.

Wednesday, March 23, 2016

Direct evidence for Z' a la TGD and M89 J/Psi

The bumps indicating the presence of new physics predicted by TGD have begun to accumulate rapidly and personally I dare to regard the situation as settled: individual bumps do not matter but when an entire zoo of bumps with predicted masses emerges, the situation changes (see this, this and this). Colleagues (especially the Finnish ones) will encounter quite a demanding challenge in explaining how it is possible that I am still forced to visit the bread queue in order to cope with the basic metabolic needs;-).

Lubos told that there is direct evidence for the Z' boson now: earlier the evidence was only indirect, namely breaking of universality and an anomaly in the angle distribution in B meson decays. The Z' bump has mass around 3 TeV. TGD predicts a 2.94 TeV mass for the second generation Z breaking universality. The decay width obtained by direct scaling would be .08 TeV and is larger than the .06 TeV deviation from 3 TeV. Lubos reported half a year ago about an excess at 2.9 TeV, which is also consistent with the TGD prediction.

Lubos tells also about a 3 sigma bump at 1.650 TeV assigned to a Kaluza-Klein graviton in the search for Higgs pairs hh decaying to bbbar + bbbar. Kaluza-Klein gravitons are rather exotic creatures and in the absence of any other support for the superstring model they are not the first candidate coming into my mind. I do not know how strong the evidence for spin 2 is, but I dare to consider the possibility of spin 1 and ask whether M89 hadron physics could allow an identification for this bump.

  1. The very naively scaled up J/Psi of M107 hadron physics, having spin J=1 and mass equal to 3.1 GeV, would have mass 1.585 TeV: the error is about 4 per cent. The effective action would be based on a gradient coupling similar in form to the Zhh coupling. The decays via hh → bbbar + bbbar could take place also now.

  2. This scaling might be too naive: the quarks of M89 might be the same as those of ordinary hadron physics and only the color magnetic energy would be scaled up by the factor 512. The c quark mass is 1.29 GeV so that the color magnetic energy of ordinary J/Psi would be .52 GeV. If so, the M89 version of J/Psi would have mass of only 269 GeV. Lubos tells also about evidence for a 2 sigma bump at 280 GeV identified as a CP odd Higgs - this identification of course reflects the dream of Lubos about standard SUSY at LHC energies. However, the scaling of the eta meson mass 547.8 MeV by 512 gives 280.4 GeV, so that the interpretation as an eta meson proposed already earlier is convincing. The naive scaling might be the correct thing to do also for mesons containing heavier quarks. (The arithmetic behind both options is spelled out right after this list.)
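The two mass estimates above amount to the following simple arithmetic (my own sketch; the input values 3.1 GeV and 1.29 GeV are those used in the text):

    scale = 512                     # 2^((107-89)/2)
    m_Jpsi = 3.1                    # GeV, ordinary J/psi
    m_c    = 1.29                   # GeV, charm quark

    # Option 1: scale the whole meson mass.
    print(f"naive scaling: {scale * m_Jpsi:.0f} GeV")       # about 1587 GeV, i.e. 1.59 TeV

    # Option 2: keep the quark masses and scale only the color magnetic energy.
    E_magn = m_Jpsi - 2 * m_c                               # about 0.52 GeV
    print(f"scaled magnetic energy only: {2 * m_c + scale * E_magn:.0f} GeV")   # about 269 GeV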

In any case, even if one forgets J/Psi, there is now direct evidence for as many as three new branches of physics predicted by TGD: two scaled variants of hadron physics (M89 and MG,79) and second generation weak physics (MG,79)!

Colleagues have realized that history is in the making. I read from a popular article that theoreticians have left their ongoing projects and started to study the 750 GeV bump and certainly also the other bumps. Ellis talked already about entirely new physics. The TGD message has gone through! But no one mentions TGD although all of this is published in Huping Hu's journals and in Research Gate. No need for this in the recent science community based on the ethics of brutal opportunism: steal, lie, betray, as hippies expressed it.

See the article Indications for the new physics predicted by TGD and chapter New Particle Physics Predicted by TGD: Part I.

For a summary of earlier postings see Links to the latest progress in TGD.

Tuesday, March 22, 2016

Causal loophole, zero energy ontology, subjective time, geometric time

Finnish experimental physicists K. S. Kumar, A. Vepsäläinen, S. Danilin and G. S. Paraoanu (the leader of the group) working at Aalto University have published a very interesting article in Nature Communications. One can find also a popular article about the discovery.

One studies transitions from state 1 to 2 to 3. Usual causality implies that you must first induce the transition from state 1 to 2 - by a suitably chosen pulse in the experiments: the energy of the photons in the pulse must correspond to the energy difference between 2 and 1. After that you can induce the transition from 2 to 3 by a second suitably chosen pulse. In the quantum world you can do this in a different order. First apply the pulse inducing the transition 2 to 3 (the state is 1 so that nothing happens). Then you apply the pulse inducing the transition 1 to 2, and the transition 2 to 3 takes place! Weird! This might have profound implications for quantum information processing.

A layman description for this loop in causality is the following. Suppose you must get out from a parking hall. In the classical world you first reverse the car and then drive away. In the quantum world you can first drive away and then reverse the car! A good choice if you are in a really big hurry! Maybe you should however not try this without the guidance of a quantum physicist.
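As a purely illustrative toy model (my own sketch, not a simulation of the actual Aalto experiment, which uses superadiabatic pulses), the counter-intuitive pulse ordering can be demonstrated for a resonantly driven three-level system: applying the 2→3 pulse before the 1→2 pulse still transfers the population from state 1 to state 3, as in STIRAP. All parameters below are arbitrary illustrative values.

    import numpy as np
    from scipy.linalg import expm

    Omega0, delay, width = 20.0, 0.7, 1.0      # peak Rabi frequency, pulse delay and width

    def H(t):
        Op = Omega0 * np.exp(-((t - delay) / width) ** 2)   # 1<->2 pulse, applied later
        Os = Omega0 * np.exp(-((t + delay) / width) ** 2)   # 2<->3 pulse, applied first
        return 0.5 * np.array([[0, Op, 0],
                               [Op, 0, Os],
                               [0, Os, 0]], dtype=complex)

    psi = np.array([1, 0, 0], dtype=complex)   # start in state 1
    dt = 0.001
    for t in np.arange(-5, 5, dt):
        psi = expm(-1j * H(t + dt / 2) * dt) @ psi          # stepwise evolution

    print("populations:", np.round(np.abs(psi) ** 2, 3))    # most of the weight is in state 3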

This is a really crazy looking idea, which can be understood only in a 4-D context. Zero Energy Ontology plus the fact that geometric time and subjective time (and therefore the corresponding causalities) are not one and the same thing explains this nicely. The "subjectively first" pulse represents a 4-D wave as a 4-D geometric entity (here time is geometric), which can induce the transition from state 2 to 3 if 2 is present in the 4-D domain - the causal diamond (CD) in TGD. Otherwise nothing happens: this is the case now. One kicks by the second pulse the state 1 to 2 "subjectively after" the first pulse. The state is 2 in the entire CD and now the "subjectively first" pulse in CD can induce the transition from 2 to 3 in the 4-D geometric space-time domain (CD)!

For a summary of earlier postings see Links to the latest progress in TGD.

Saturday, March 19, 2016

Tensor nets and S-matrices

The concrete construction of scattering amplitudes has been the toughest challenge of TGD and the slow progress has occurred by identification of general principles, with many side tracks. One of the key problems has been unitarity. The intuitive expectation is that unitarity should reduce to a local notion, somewhat like classical field equations reduce the time evolution to a local variational principle. The presence of propagators has however been the obstacle for locally realized unitarity in which each vertex would correspond to a unitary map in some sense.

TGD suggests two approaches to the construction of S-matrix.

  1. The first approach is a generalization of the twistor program (this). What is new is that one does not sum over diagrams but there is a large number of equivalent diagrams giving the same outcome. The complexity of the scattering amplitude is characterized by the minimal diagram. Diagrams correspond to space-time surfaces so that several space-time surfaces give rise to the same scattering amplitude. This would correspond to the fact that the dynamics breaks classical determinism. Also quantum criticality is expected to be accompanied by quantum critical fluctuations breaking classical determinism. The strong form of holography would not be unique: there would be several space-time surfaces assignable as preferred extremals to given string world sheets and partonic 2-surfaces defining "space-time genes".

  2. The second approach relies on the number theoretic vision and interprets scattering amplitudes as representations for computations, with each 3-vertex identifiable as a basic algebraic operation (this). There is an infinite number of equivalent computations connecting the set of initial algebraic objects to the set of final algebraic objects. There is a huge symmetry involved: one can eliminate all loops by moving the end of a line so that the loop transforms to a vacuum tadpole and can be snipped away. A braided tree diagram is left, with the braiding meaning that the fermion lines inside the line defined by the light-like orbit are braided. This kind of braiding can occur also for space-like fermion lines inside magnetic flux tubes, defining a correlate for entanglement. Braiding is the TGD counterpart for the problematic non-planarity in the twistor approach.

A third approach, involving local unitarity as an additional key element, is suggested by tensor networks relying on the notion of perfect entanglement discussed by Preskill et al (see this and this). A detailed representation can be found in the article of Preskill et al.
  1. Tensor networks provide an elegant representation of holography mapping interior states isometrically (in the Hilbert space sense) to boundary states or vice versa for selected subsets of states defining the code subspace of a holographic quantum error correcting code. Again the tensor net is highly non-unique but there is some minimal tensor net characterizing the complexity of the entangled boundary state.

  2. Tensor networks have two key properties, which might be abstracted and applied to the construction of the S-matrix in zero energy ontology (ZEO): perfect tensors define an isometry from any subspace defined by an index subset of the perfect tensor to its complement, and the graph representing the network is non-unique. As far as the construction of the Hilbert space isometry between local interior states and highly non-local entangled boundary states is considered, these properties are enough.
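A minimal numerical check of the perfect tensor property (my own sketch, using the standard four-qutrit AME state familiar from the tensor network literature rather than anything TGD specific): for a perfect tensor every two-index reduced density matrix is maximally mixed, which is equivalent to the map from any index pair to its complement being an isometry up to normalization.

    import numpy as np

    d = 3
    T = np.zeros((d, d, d, d), dtype=complex)
    for i in range(d):
        for j in range(d):
            T[i, j, (i + j) % d, (i + 2 * j) % d] = 1.0 / d     # normalized 4-party state

    # Perfect tensor <=> every two-index reduced density matrix equals Id/d^2,
    # i.e. d*M is unitary, where M maps an index pair to its complement.
    for pair in [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]:
        rest = tuple(k for k in range(4) if k not in pair)
        M = np.transpose(T, pair + rest).reshape(d * d, d * d)
        rho_pair = M @ M.conj().T
        print(pair, np.allclose(rho_pair, np.eye(d * d) / (d * d)))   # all True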

One cannot avoid the idea that these three constructions are different aspects of one and the same construction and that the tensor net construction with perfect tensors representing vertices could provide an additional strong constraint on the long sought for explicit recipe for the construction of scattering amplitudes. How could tensor networks allow one to generalize the notion of unitary S-matrix in the TGD framework?

Objections

It is certainly clear from the beginning that the possibly existing description of S-matrix in terms of tensor networks cannot correspond to the perturbative QFT description in terms of Feynman diagrams.

  1. The tensor network description relates interior and boundary degrees of freedom in holography by an isometry. Now however the unitary matrix has a quite different role. It could correspond to the U-matrix relating zero energy states to each other or to the S-matrix relating to each other the states at the boundary of CD and at the shifted boundary obtained by scaling. These scalings shifting the second boundary of CD and increasing the distance between the tips of CD define the analog of unitary time evolution in ZEO. The U-matrix for transitions associated with the state function reductions at a fixed boundary of CD effectively reduces to the S-matrix since the other boundary of CD is not affected.

    The only manner in which one could see this as a holography type description would be in terms of ZEO, in which zero energy states are at the boundaries of CD and the U-matrix is a representation for them in terms of holography involving the interior states representing the scattering diagram in a generalized sense.

  2. The appearance of a small gauge coupling constant tells that the entanglement between "states" in state spaces whose coordinates formally correspond to quantum fields is weak and just the opposite to that defined by a perfect tensor. Quite generally, the coupling constant might be the fatal aspect of the vertices preventing the formulation in terms of perfect entanglement.

    One should understand how the coupling constant emerges from this kind of description - or disappears from the standard QFT description. One can think of including the coupling constant in the definition of gauge potentials: in the TGD framework this is indeed true for induced gauge fields. There is no sensible manner to bring the classical coupling constants into the classical framework, and the inverse of the Kähler coupling strength appears only as a multiplier of the Kähler action, analogous to a critical temperature.

    More concretely, there are WCW spin degrees of freedom (fermionic degrees of freedom) and WCW orbital degrees of freedom involving a functional integral over WCW. The fermionic contribution would not involve coupling constants, whereas the functional integral over WCW involving the exponential of the vacuum functional could give rise to the coupling constants assignable to the vertices in the minimal tree diagram.

  3. The decomposition S = 1 + iT of the unitary S-matrix, giving unitarity as the condition -i(T† - T) + T†T = 0, reflects perturbative thinking. If one has only an isometry instead of a unitary transformation, this decomposition becomes problematic since T and T†, whose sum appears in the formula, act in different spaces. One should have a generalization of Id as a "trivial" isometry. Alternatively, one should be able to extend the state space H_in by adding a tensor factor mapped trivially in the isometry. (A small numerical illustration of the isometry option is given after this list.)

  4. There are 3- and 4-vertices rather than only, say, 3-vertices as in tensor networks. For a non-Abelian Chern-Simons term for a simple Lie group one would have besides the kinetic term only the 3-vertex Tr(A ∧ A ∧ A) defining the analog of perfect tensor entanglement when interpreted as a co-product involving the 3-D permutation symbol and the structure constants of the Lie algebra. Note also that in the twistor Grassmannian approach the fundamental vertices are 3-vertices. It must however be emphasized that the QFT description emerges from TGD only at the limit when one identifies gauge potentials as sums of the induced gauge potentials assignable to the space-time sheets, which are replaced with a single piece of Minkowski space.

  5. The tensor network description does not contain propagators since the contractions are between perfect tensors. For it to make sense, propagators must be eliminated. The twistorial factorization of the massless fermion propagator suggests that this might be possible by absorbing the twistors into the vertices.
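The isometry versus unitarity point of objection 3 can be illustrated with a toy example of my own: for an isometry S from a smaller "in" space to a larger "out" space one has S^dagger S = Id_in, while S S^dagger is only a projector, so a decomposition of the form S = 1 + iT loses its meaning.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(4, 2)) + 1j * rng.normal(size=(4, 2))
    S, _ = np.linalg.qr(A)           # 4x2 matrix with orthonormal columns: an isometry

    print(np.allclose(S.conj().T @ S, np.eye(2)))   # True:  S^dagger S = Id_in
    print(np.allclose(S @ S.conj().T, np.eye(4)))   # False: S S^dagger is a rank-2 projector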

These reasons make it clear that the proposed idea is just a speculative question. Perhaps the best strategy is to look at this crazy idea from different viewpoints: the overly optimistic view developing the big picture and the approach trying to debunk the idea.

The overly optimistic vision

With these prerequisites one can follow the optimistic strategy and ask how tensor networks could allow one to generalize the notion of unitary S-matrix in the TGD framework.

  1. Tensor networks suggest the replacement of the unitary correspondence with the more general notion of Hilbert space isometry. This generalization is very natural in TGD since one must allow phase transitions increasing the size of the state space, and it is quite possible that the S-matrix represents only an isometry: this would mean that S†S = Id_in holds true but SS† = Id_out does not even make sense. This conforms with the idea that state function reduction sequences at a fixed boundary of causal diamonds defining conscious entities give rise to evolution implying that the size of the state space increases gradually as the system becomes more complex. Note that this gives rise to irreversibility understandable in terms of NMP (this). It might even be impossible to formally restore unitarity by introducing a formal additional tensor factor to the space of incoming states if the isometric map of the incoming state space to the outgoing state space is an inclusion of hyperfinite factors.

  2. If the huge generalization of the duality of old fashioned string models makes sense, the minimal diagram representing the scattering is expected to be a tree diagram with braiding and should allow a representation as a tensor network. The generalization of the tensor network concept to include braiding is trivial in principle: assign to the legs connecting the nodes defined by perfect tensors unitary matrices representing the braiding - here topological QFT allows a realization of the unitary matrix. Besides fermionic degrees of freedom having interpretation as spin degrees of freedom at the level of the "World of Classical Worlds" (WCW) there are also WCW orbital degrees of freedom. These two kinds of degrees of freedom factorize in the generalized unitarity conditions and the description seems much simpler in WCW orbital degrees of freedom than in WCW spin degrees of freedom.

  3. Concerning the concrete construction there are two levels involved, which are analogous to the descriptions in terms of boundary and interior degrees of freedom in holography: the level of fundamental fermions assignable to string world sheets and their boundaries, and the level of physical particles with particles assigned to sets of partonic 2-surfaces connected by magnetic flux tubes and associated fermionic strings. One could also see the ends of causal diamonds as analogous to boundary degrees of freedom and the space-time surface as interior degrees of freedom.

The description at the level of fundamental fermions corresponds to conformal field theory at string world sheets.
  1. The construction of the analogs of boundary states reduces to the construction of N-point functions for fundamental fermions assignable to the boundaries of string world sheets. These boundaries reside at 3-surfaces at the space-like space-time ends of CDs and at light-like 3-surfaces at which the signature of the induced space-time metric changes.

  2. In accordance with holography, the fermionic N-point functions with points at partonic 2-surfaces at the ends of CD are those assignable to a conformal field theory associated with the union of the string world sheets involved. The perfect tensor is assignable to the fundamental 4-fermion scattering, which defines the microscopy for the geometric 3-particle vertices having a twistorial interpretation and also an interpretation as an algebraic operation.

    What is important is that fundamental fermion modes at string world sheets are labelled by conformal weights and standard model quantum numbers. No four-momenta nor color quantum numbers are involved at this level. Instead of propagator one has just unitary matrix describing the braiding.

  3. Note that four-momenta emerge in a somewhat mysterious manner in stringy scattering amplitudes and make it possible to interpret the amplitudes at the particle level.

Twistorial and number theoretic constructions should correspond to the particle level construction, and also now the tensor network description might work.
  1. The 3-surfaces are labelled by four-momenta besides other standard model quantum numbers, but the possibility of reducing the diagram to one involving only 3-vertices means that momentum degrees of freedom effectively disappear. In the ordinary twistor approach this would mean allowing only forward scattering unless one allows massless but complex virtual momenta in twistor diagrams. Also vertices with a larger number of legs are possible by organizing large blocks of vertices into a single effective vertex, and they would allow descriptions analogous to effective QFTs.

  2. It is highly non-trivial that the crucial factorization to perfect tensors at 3-vertices with unitary braiding matrices associated with the legs connecting them occurs also now. It allows one to split the inverses of the fermion propagators into sums of products of two parts and absorb the halves into the perfect tensors at the ends of the line. The reason is that the inverse of the massless fermion propagator (also when masslessness is understood in the 8-D sense allowing the M4 mass to be non-vanishing) can be expressed as a bilinear of the bi-spinors defining the twistor representing the four-momentum. It seems that this is an absolutely crucial property and fails for massive (in the 8-D sense) fermions.
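The bilinear factorization used in item 2 can be checked numerically in the ordinary M4 massless case (my own illustration; the momentum components are arbitrary): the 2x2 matrix p_mu sigma^mu has vanishing determinant and equals lambda lambda^dagger for a single two-spinor lambda.

    import numpy as np

    sigma = [np.eye(2),
             np.array([[0, 1], [1, 0]]),
             np.array([[0, -1j], [1j, 0]]),
             np.array([[1, 0], [0, -1]])]

    p_space = np.array([0.3, -0.4, 1.2])                      # spatial momentum
    p = np.concatenate([[np.linalg.norm(p_space)], p_space])  # massless: p_0 = |p_space|

    P = sum(p[mu] * sigma[mu] for mu in range(4))             # p_mu sigma^mu
    print(np.isclose(np.linalg.det(P), 0))                    # True: det P = p^2 = 0

    w, v = np.linalg.eigh(P)                                  # rank-1, positive semidefinite
    lam = np.sqrt(w[-1]) * v[:, -1]                           # the spinor lambda of the twistor
    print(np.allclose(np.outer(lam, lam.conj()), P))          # True: P = lambda lambda^dagger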

For the details see the new chapter Holography and Quantum Error Correcting Codes: TGD View of "Hyper-Finite Factors, p-Adic Length Scale Hypothesis, And Dark Matter Hierarchy" or the article with the same title.

For a summary of earlier postings see Links to the latest progress in TGD.

Thursday, March 17, 2016

Evidence for the eta meson of M89 hadron physics

Lubos has had two postings about evidence for bumps at LHC: see the recent post about the Moriond meeting and an earlier post about the ATLAS gluino workshop.

The post about the Moriond meeting mentions a rumor spread by Jester telling about 5 sigma evidence for a 750 GeV resonance from ATLAS. ATLAS refuses to comment. Remember that the 750 GeV bump would correspond to one of the mesons of M89 hadron physics, whose masses are obtained by scaling those of ordinary hadron physics by the factor 512 (see the earlier posting). There are many of them in the range 600-900 GeV with precisely predicted masses, and Lubos indeed mentions that this is the region still allowing the possibility for stop. I can only regret it if the decays of M89 mesons could be responsible for wrong hopes about standard SUSY;-). I can estimate their masses and do some other simple things, and I even confess that TGD predicted them, but I am not responsible for their existence!;-)

In the Moriond meeting the existence of the 750 GeV resonance - now christened as Cernette - was questioned. One might expect that it decays also via the Zγ channel. It doesn't. Could the meson property explain this? Ordinary neutral mesons decay to gamma pairs and these decays are exceptional, resulting from the axial anomaly term (instanton term for the electric field coupled to the pseudoscalar meson). This should be the case also for their scaled up M89 variants. The decay rate should be exceptionally high since the instanton term is proportional to the mass scale squared and the decay rate to the fourth power of the mass scale. This could make these decays of Cernette much faster than other decays and at the same time serve as a demonstration that the new hadron physics predicted by TGD (not me) is to be blamed for the anomaly.

Lubos mentions also indications for a 285 GeV bump decaying to a gamma pair. The mass of the eta meson of ordinary hadron physics is .547 GeV and the scaling of the eta mass by the factor 512 gives 280.5 GeV: the error is less than 2 per cent. I have already earlier demonstrated (see the earlier posting) that the mesons of ordinary hadron physics have bumps at the scaled up masses. After having worked with the idea for about two decades, I dare to make a bet that M89 is there.

The production of M89 protons with mass about 4.8 TeV would be a really dramatic verification of M89 hadron physics. If the M89 quarks are ordinary current quarks and the mass of the M89 proton is due to its magnetic body characterized by M89 instead of M107, the M89 proton could be created as the magnetic body of the ordinary proton makes a p-adic phase transition and contracts by a scale factor 1/512. A more plausible option is that the Planck constant increases by the factor 512 so that the size does not change but the resulting proton (like also the other M89 hadrons) would be dark.

The M89 proton should decay to the ordinary proton by transforming the energy of its magnetic body to particles: the same mechanism would produce ordinary matter in the TGD variant of inflaton decay. Does the dark proton transform to an ordinary M89 proton first and then decay to an ordinary proton plus mesons, or does it decay first to M89 hadrons, which eventually decay to ordinary hadrons? What is the lifetime of the dark proton: is it so long that it leaves the reactor volume so that the M89 dark proton would make itself visible as missing energy? I cannot answer these questions. In any case, this kind of phase transition is possible when the cm energy of the proton in the beam exceeds 4.8 TeV. The energy of 6.5 TeV per beam was reached last May so that the effect might have been observed if it is there.

There is evidence also for other pieces of the new physics predicted by TGD: first evidence for MG,79 hadron physics, which should also be there with mass scale 2^14 times that of ordinary hadron physics, and for the Higgs of the second generation weak bosons at the corresponding weak mass scale, having mass 4 TeV. There is evidence also for the Z boson of the second generation weak physics inducing the breaking of lepton universality (see this). I know that my colleagues are not so stupid as they pretend to be, and the breakthrough of TGD is unavoidable and doomed to occur within a few years.

For a summary of earlier postings see Links to the latest progress in TGD.

Tuesday, March 15, 2016

Cyclic cosmology from TGD perspective

The motivation for this piece of text came from a very inspiring interview of Neil Turok by Paul Kennedy on CBC radio. The themes were the extreme complexity of theories in contrast to the extreme simplicity of physics, the mysterious homogeneity and isotropy of cosmology, and the cyclic model of cosmology developed also by Turok himself. In the following I will consider these issues from the TGD viewpoint.

1. Extreme complexity of theories vs. extreme simplicity of physics

The theme was the incredible simplicity of physics at short and long scales vs. the equally incredible complexity of the fashionable theories, not even able to predict anything testable. More precisely, superstring theory makes predictions: the prediction is that every imaginable option is possible. Very safe but not very interesting. The outcome is the multiverse paradigm having its roots in the inflationary scenario and stating that our local Universe is just one particular randomly selected Universe in a collection of an infinite number of Universes. If so, then physics has reached its end. This unavoidably brings to my mind the saying of Einstein: "Any intelligent fool can make things bigger, more complex, and more violent. It takes a touch of genius – and a lot of courage – to move in the opposite direction.".

Turok is not so pessimistic and thinks that some deep principle has remained undiscovered. Turok's basic objection against the multiverse is that there is not the slightest shred of experimental evidence for it. In fact, I think that we can sigh with relief now: the multiverse is disappearing into the sands of time, and can be seen as the last desperate attempt to establish superstring theory as a respectable physical theory.

The emphasis is now on the applications of the AdS/CFT correspondence to other branches of physics such as condensed matter physics and quantum computation. The attempt is to reduce the complex strongly interacting dynamics of conformally invariant systems to gravitational interaction in a higher dimensional space-time called the bulk. Unfortunately this approach involves the effective field theory thinking, which led to the landscape catastrophe in superstring theory. Einstein's theory is assumed to describe low energy gravitation in AdS so that higher dimensional blackholes emerge and their interiors can be populated with all kinds of weird entities. For the TGD view about the situation see this.

One can of course criticize Turok's view about the simplicity of the Universe. What we know is that visible matter becomes simple both at short and long scales: we actually know very little about dark matter. Turok also mentions that in our scales - roughly the geometric mean of the shortest and longest scales of the known Universe - resides biology, which is extremely complex. In the TGD Universe this would be due to the fact that dark matter is the boss for living systems and the complexity of visible matter reflects that of dark matter. It could be that the dark matter levels corresponding to increasing values of heff/h become increasingly complex in long scales. We just do not see it!

2. Why is the cosmology so homogeneous and isotropic?

Turok sees as one of the deepest problems of cosmology the extreme homogeneity and isotropy of the cosmic microwave background, implying that two regions with no information exchange have been at the same temperature in the remote past. Classically this is extremely implausible and in the GRT framework there is no obvious reason for this. The inflationary scenario is one possible mechanism explaining this: the observed Universe would have been a very small region, which expanded during the inflationary period so that all temperature gradients were smoothed out. This paradigm has several shortcomings and there exists no generally accepted variant of this scenario.

In TGD framework one can also consider several explanations.

  1. One of my original arguments for H = M4 × CP2 was that the imbeddability of the cosmology to H forces long range correlations (see this, this and this). The theory is Lorentz invariant and standard cosmologies can be imbedded inside the future light-cone with its boundary representing the Big Bang. Only Robertson-Walker cosmologies with sub-critical or critical mass density are allowed by TGD. The sub-critical ones are Lorentz invariant and therefore a very natural option. One would have automatically constant temperature. Could the enormous reduction of degrees of freedom due to the 4-surface property force the long range correlations? Probably not. The 4-surface property is a necessary condition but very probably far from enough.

  2. The primordial TGD inspired cosmology is cosmic string dominated: one has a gas of string like objects, which in the ideal case are of the form X^2 × Y^2 ⊂ M4 × CP2, where X^2 is a minimal surface and Y^2 a complex surface of CP2. The strings can be arbitrarily long, unlike in GUTs. The conventional space-time as a surface representing the graph of some map M4 → CP2 does not exist during this period. The density goes like 1/a^2, where a is the light-cone proper time, and the mass of a co-moving volume vanishes at the limit of the Big Bang, which actually is reduced to a "Silent Whisper" amplified later to a Big Bang.

    The cosmic string dominated period is followed by a quantum critical period analogous to the inflationary period as cosmic strings start to topologically condense at space-time sheets, becoming magnetic flux tubes with gradually thickening M4 projections. The ordinary space-time is formed: the critical cosmology is universal and uniquely fixed apart from a single parameter determining the duration of this period.

    After that a phase transition to the radiation dominated phase takes place and ordinary matter emerges in the decay of the magnetic energy of cosmic strings to particles - Kähler magnetic energy corresponds to the vacuum energy of the inflaton field. This period would be analogous to the inflationary period. Negative pressure would be due to the magnetic tension of the flux tubes.

    Also the asymptotic cosmology is string dominated since the corresponding energy density goes like 1/a^2 as for the primordial phase, whereas for matter dominated cosmology it goes like 1/a^3. This brings to mind the ekpyrotic phase of the cyclic cosmology.

  3. This picture is perhaps over-simplified. Quite recently I proposed a lift of Kähler action to its 6-D twistorial counterpart (see this). The prediction is that a volume term with a positive coefficient representing cosmological constant emerges from the 6-D twistorial variant of Kähler action via dimensional reduction. It is associated with the S^2 fiber of the M4 twistor space and Planck length characterizes the radius of S^2. Volume density and magnetic energy density together could give rise to the cosmological constant behind the negative pressure term. Note that the cosmological term for cosmic strings reduces to a similar form as that from Kähler action, and depending on the value of the cosmological constant only either of them or both are important. TGD suggests strongly that the cosmological constant Λ has a spectrum determined by quantum criticality and is proportional to the inverse of the p-adic length scale squared so that both terms could be important. If the cosmological constant term is small, the original explanation for the negative pressure always applies.

    The vision about quantum criticality of the TGD Universe would suggest that the two terms have similar sizes. For cosmic strings the cosmological term does not give a pressure term since it comes from the string world sheet alone. Thus for cosmic strings Kähler action would define the negative pressure and for space-time sheets both terms would. If the contributions could have opposite signs, the acceleration of the cosmic expansion would be determined by competing control variables. To my best understanding the signs of the two contributions are the same (my best understanding does not however guarantee much since I am a numerical idiot and blundering with numerical factors and signs are my specialities). If the signs are opposite, one cannot avoid the question whether a quantum critical Universe could be able to control its expansion by cosmic homeostasis by varying the two cosmological constants. Otherwise the control of the difference of the accelerations of the expansion rates of cosmic strings and space-time sheets would be possible.

  4. A third argument explaining the mysterious temperature correlations relies on the hierarchy of Planck constants heff/h = n labelling the levels of the dark matter hierarchy with quantum scales proportional to n. Arbitrarily large scales would be present and their presence would imply a hierarchy of arbitrarily large space-time sheets with size characterized by n. The dynamics in a given scale would be homogeneous and isotropic below the scale of this space-time sheet.

    One could see the correlations of the cosmic temperature as a signature of quantum coherence in cosmological scales involving also entanglement in cosmic scales (see this). Kähler magnetic flux tubes carrying monopole flux, requiring no currents to generate the magnetic fields inside them, would serve as correlates for the entanglement just as wormholes serve as correlates of entanglement in the ER-EPR correspondence. This would conform with the fact that the analog of the inflationary phase preserves the flux tube network formed from cosmic strings. It would also explain the mysterious existence of magnetic fields in all scales.

3. The TGD analog of cyclic cosmology

Turok is a proponent of a cyclic cosmology combining so called ekpyrotic cosmology and inflationary cosmology. This cosmology offers a further solution candidate for the homogeneity/isotropy mystery. The contracting phase would differ from the expanding phase in that contraction would be much slower than expansion, and only during the last stage would there be a symmetry between the two half-periods. In concrete realizations an inflaton type field is introduced. Also scenarios in which branes near each other collide with each other cyclically and generate in this manner a big crunch followed by a big bang are considered. I find it difficult to see this picture as a solution of the homogeneity/isotropy problem.

I however realized that it is possible to imagine a TGD analog of cyclic cosmology in Zero Energy Ontology (ZEO). There is no need to assume that this picture solves the homogeneity/isotropy problem, and the cyclicity corresponds to a kind of biological cyclicity or rather a sequence of re-incarnations.

3.1 A small dose of TGD inspired theory of consciousness

  1. In ZEO the basic geometric object is the causal diamond (CD), whose M4 projection represents an expanding spherical light-front, which at some moment begins to contract - this defines an intersection of future and past directed light-cones. Zero energy states are pairs of positive and negative energy states at the opposite light-like boundaries of CD such that all conserved quantum numbers are opposite. This makes it possible to satisfy conservation laws.

  2. CD is identified as the 4-D perceptive field of a conscious entity in the sense that the contents of conscious experiences come from CD. Does CD represent only the perceptive field of an observer getting a sensory representation about a much larger space-time surface continuing beyond the boundaries of CD, or does the geometry of CD imply a cosmology which is a Big Bang followed by a Big Crunch? Or do the two boundaries of CD define also space-time boundaries so that space-time would end there?

    The conscious entity defined by CD cannot tell whether this is the case. Could a larger CD containing it perhaps answer the question? No! For the larger CD the smaller CD could represent the analog of a quantum fluctuation so that the space-time of the CD would not extend beyond it.

  3. The geometry of CD brings to mind Big Bang - Big Crunch cosmology. Could this be forced by boundary conditions at the future and past boundaries of CD meeting along the large 3-sphere, forcing a Big Bang at both ends of CD but in opposite directions? If CD is an independent geometric entity, one could see it as a Big Bang followed by a Big Crunch in some sense, but not as a return back to the primordial state: this would be boring and in conflict with the TGD view about cosmic evolution.

  4. To proceed, some TGD inspired theory of consciousness is needed. In ZEO quantum measurement theory extends to a theory of consciousness. State function reductions can occur to either boundary of CD and Negentropy Maximization Principle (NMP) dictates the dynamics of consciousness (see this).

    The Zeno effect generalizes to a sequence of state function reductions leaving the second boundary of CD and the members of zero energy states at it unchanged, but changing the states at the opposite boundary and also the location of CD so that the distance between the tips of CD increases reduction by reduction. This gives rise to the experienced flow of subjective time and its correlation with the flow of geometric time identified as the increase of this distance.

    The first reduction to the opposite boundary is eventually forced to occur by NMP and corresponds to state function reduction in the usual sense. It means the death of the conscious entity and its re-incarnation at the opposite boundary, which begins to shift in the opposite time direction reduction by reduction. Therefore the distance between the tips of CD continues to increase. The two lives of self are lived in opposite time directions (a toy numerical caricature of this bookkeeping is sketched after the list below).

  5. Could one test this picture? By fractality CDs appear in all scales and are relevant also for living matter and consciousness. For instance, mental images should have CDs as correlates in some scale. Can one identify some analog of the Big Bang - Big Crunch cosmology for them? I have indeed considered what time reversal for mental images could mean, and some individuals (including me) have experienced it concretely in some altered states of consciousness.
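
A toy numerical caricature (my own illustration, not from the text) of the bookkeeping described in item 4 above: ordinary reductions at the active boundary of CD increase the distance between the tips step by step, while an occasional reduction to the opposite boundary flips which boundary is active - death and re-incarnation with a reversed arrow of time - without resetting the distance. The Python sketch below assumes nothing beyond this verbal description.

    import random

    random.seed(0)
    distance = 1.0          # proper time distance between the tips of CD (arbitrary units)
    active = "upper"        # boundary at which ordinary reductions now happen

    for step in range(1, 21):
        distance += random.uniform(0.1, 1.0)    # each reduction shifts the active boundary
        if random.random() < 0.1:               # rare reduction to the opposite boundary
            active = "lower" if active == "upper" else "upper"
            print(f"step {step:2d}: re-incarnation, active boundary -> {active}")
        print(f"step {step:2d}: tip distance = {distance:.2f} ({active} boundary active)")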

3.2 Does cyclic cosmology correspond to a sequence of re-incarnations for a cosmic organism?

The question that I am ready to pose is easy for a smart reader to guess. Could this sequence of life cycles of self with opposite directions of time serve as the TGD analog of cyclic cosmology?

  1. If so, the Universe could be seen as a gigantic organism dying and re-incarnating, and quantum coherence even in the largest scales would explain the long range correlations of temperature in terms of entanglement - in fact negentropic entanglement, which is a basic new element of the TGD based generalization of quantum theory.

  2. A Big Crunch back to the primordial cosmology destroying all achievements of evolution should not occur at any level of the dark matter hierarchy. Rather, the process leading to biological death would involve the deaths of various subsystems with increasing scale and eventually the death in the largest scale involved.

  3. The system would continue its expansion and evolution from the state that it reached during the previous cycle but in the opposite time direction. What would remain from the previous life would be the negentropic entanglement at the formerly evolving boundary fixed by the first reduction to the opposite boundary, and this conscious information would correspond to the static permanent part of self for the new conscious entity, whose sensory input would come from the opposite boundary of CD after the re-incarnation. The birth of an organism should be analogous to the Big Bang - certainly the growth of an organism is something like this in a metaphorical sense. Is the decay of the organism analogous to the Big Crunch?

  4. What is remarkable is that both the primordial and the asymptotic cosmology are dominated by string like objects, only their scales are different. Therefore also for the reversed cycle the primordial cosmology would be dominated by thickened cosmic strings. Even more, the accelerated expansion could rip the space-time into pieces - this is one of the crazy looking predictions of accelerated expansion - and one would have free albeit thickened cosmic strings, and in rough enough resolution they would look like ideal cosmic strings.

    The cycling would not be a trivial and boring (dare I say stupid) repeated return to the same primordial state, which would be in conflict with NMP implying endless evolution. It would involve a scaling up at each step. The evolution would be like a repeated zooming up of a Mandelbrot fractal! Breathing is a good metaphor for this endless process of re-creation: God is breathing! Or Gods, since there is a fractal hierarchy of CDs within CDs.

  5. There is however a trivial problem that I did not notice at first. The light-cone proper times a+/- assignable to the two light-cones M4+/- defining CD are not the same. If the future directed light-cone M4+ corresponds to a+2 = t2-rM2 with the lower tip of CD at (t,rM)=(0,0), the light-cone proper time associated with M4- corresponds to a-2 = (t-T)2-rM2 = a+2-2tT+T2 = a+2-2(a+2+rM2)1/2T+T2. The energy density would behave near the upper tip like ρ ∝ 1/a+2 rather than ρ ∝ 1/a-2. Does this require that a Big Crunch occurs and leads to a phase in which one has a gas of cosmic strings in M4-? This does not seem plausible. Rather, the gas of presumably thickened cosmic strings in M4- is generated in the state function reduction to the opposite boundary. This state function reduction would be very much like the end of the world and the creation of a new Universe.
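
A minimal numerical check (my own sketch, assuming only the definitions in item 5 above) of the relation between the two light-cone proper times, with the lower tip of CD at (t,rM)=(0,0) and the upper tip at (t,rM)=(T,0):

    import math

    def a_minus_squared(t, rM, T):
        # direct definition: a-2 = (t-T)^2 - rM^2
        return (t - T) ** 2 - rM ** 2

    def a_minus_squared_from_a_plus(a_plus_sq, rM, T):
        # rewritten form: a-2 = a+2 - 2*(a+2 + rM^2)^(1/2)*T + T^2
        return a_plus_sq - 2.0 * math.sqrt(a_plus_sq + rM ** 2) * T + T ** 2

    t, rM, T = 4.0, 3.0, 10.0               # illustrative values with t^2 > rM^2
    a_plus_sq = t ** 2 - rM ** 2
    print(a_minus_squared(t, rM, T))                       # 27.0
    print(a_minus_squared_from_a_plus(a_plus_sq, rM, T))   # 27.0, as it should be
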
To sum up, a single observation - the constancy of the cosmic temperature - gives strong support for the extremely non-trivial and apparently completely crazy conclusion that quantum coherence is present in cosmological scales and also that the Universe is a living organism. This should demonstrate how incredibly important the interaction between experiment and theory is.

For details see the chapter TGD and Cosmology of "Physics in Many-Sheeted Space-time" or the article Cyclic Cosmology from TGD Perspective

For a summary of earlier postings see Links to the latest progress in TGD.

Monday, March 14, 2016

Holography and Quantum Error Correcting Codes: TGD View

Strong form of holography is one of the basic tenets of TGD, and I have been working with topological quantum computation in the TGD framework, with the braiding of magnetic flux tubes defining the space-time correlates for topological quantum computer programs. Flux tubes are accompanied by fermionic strings, which can become braided too and would actually represent the braiding at the fundamental level. Also the time like braiding of fermionic lines at light-like 3-surfaces and the braiding of the light-like 3-surfaces themselves is involved, and one can talk about space-like and time-like braidings. These two are not independent, being related by the dance metaphor (think of dancers on the parquet connected by threads to a wall, generating both time like and space-like braidings). I have proposed that DNA and the lipids of the cell membrane are connected by braided flux tubes such that the flow of lipids in the lipid layer forming a liquid crystal would induce a braiding storing neural events to memory realized as braiding.
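
As a side illustration (my own toy sketch, not from the text, assuming Python), a time-like braiding can be recorded as a braid word, a list of signed Artin generators; here only the induced strand permutation is tracked, ignoring over/under-crossing information, which already shows how a dance-like sequence of crossings stores a pattern:

    def apply_braid(n, word):
        """Return the strand permutation induced by a braid word on n strands."""
        strands = list(range(n))          # strand labels in their initial order
        for g in word:
            i = abs(g) - 1                # generator sigma_i acts on positions i, i+1
            strands[i], strands[i + 1] = strands[i + 1], strands[i]
        return strands

    # "Dance" of 4 strands: the braid word is the time-like record of crossings,
    # the final ordering is the space-like pattern it leaves behind.
    print(apply_braid(4, [1, 2, -1, 3, 2]))   # [2, 3, 1, 0]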

I have a rather limited understanding of error correcting codes. Therefore I was happy to learn that there is a conference in Stanford in which leading gurus of quantum gravity and quantum information science are talking about these topics. The first lecture that I listened to was about a possible connection between holography and quantum error correcting codes. The lecturer was Preskill and the title of the talk was "Holographic quantum error-correcting codes: Toy models for the bulk/boundary correspondence" (see this and this). A detailed presentation can be found in the article by Preskill et al.

The idea is that the time=constant section of AdS, which is a hyperbolic space allowing tessellations, can define tensor networks. So called perfect tensors are the building bricks of the tensor networks providing a representation for holography (a small numerical check of the perfect tensor property is sketched after the list below). There are three observations that set bells ringing and actually motivated this article.

  1. Perfect tensors define entanglement which in the TGD framework corresponds to negentropic entanglement playing a key role in the TGD inspired theory of consciousness and of living matter.

  2. In the TGD framework the hyperbolic tessellations are realized at hyperbolic spaces H3(a) defining the light-cone proper time hyperboloids of the M4 light-cone.

  3. TGD replaces AdS/CFT correspondence with strong form of holography.
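
A perfect tensor has maximally mixed reduced density matrices for every subset containing at most half of its legs. As a small numerical check (my own sketch, assuming numpy; the specific state is a standard textbook example and not taken from the talk), the 4-qutrit state with components delta(k, i+j) delta(l, i+2j) mod 3 is perfect - qutrits are used because no such 4-party state exists for qubits:

    import itertools
    import numpy as np

    d, n = 3, 4
    psi = np.zeros((d,) * n, dtype=complex)
    for i, j in itertools.product(range(d), repeat=2):
        psi[i, j, (i + j) % d, (i + 2 * j) % d] = 1.0
    psi /= np.linalg.norm(psi)

    for subset in itertools.combinations(range(n), 2):
        rest = [k for k in range(n) if k not in subset]
        # move the kept legs to the front and flatten to a (d^2) x (d^2) matrix
        m = psi.transpose(list(subset) + rest).reshape(d ** 2, d ** 2)
        rho = m @ m.conj().T              # reduced density matrix of the kept pair of legs
        assert np.allclose(rho, np.eye(d ** 2) / d ** 2), subset

    # single-leg marginals are then automatically maximally mixed as partial traces of I/9
    print("every two-qutrit marginal is maximally mixed: the tensor is perfect")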

Could one replace AdS/CFT correspondence with TGD version of holography?

One can criticize AdS/CFT based holography because it has Minkowski space only as a rather non-unique conformal boundary resulting from conformal compactification. The situation gets worse as one starts to modify AdS by populating it with blackholes. And even this is not enough: one can imagine anything inside blackhole interiors - wormholes connecting them to other blackholes, anything. An entire mythology of mystic creatures filling the white (or actually black) areas of the map. Post-modernistic sloppiness is the problem of present-day theoretical physics - everything goes - and this leads to inflationary story telling. Minimalism would be badly needed.

AdS/CFT is very probably mathematically correct. The question is whether the underlying conformal symmetry - certainly already huge - is large enough and whether its proper extension could allow one to get rid of the admittedly artificial features of AdS/CFT.

In the TGD framework conformal symmetries are generalized thanks to the metric 2-dimensionality of the light-cone boundary and of light-like 3-surfaces in general. The resulting generalization of the Kac-Moody group to the super-symplectic group replaces a finite-dimensional Lie group with the infinite-dimensional group of symplectic transformations and leads to what I call strong form of holography, in which AdS is replaced with the 4-D space-time surface and Minkowski space with 2-D partonic 2-surfaces and their light-like orbits defining the boundaries between Euclidean and Minkowskian space-time regions: this is very much like ordinary holography. Also the imbedding space M4× CP2, fixed uniquely by twistorial considerations, plays an important role in the holography.

AdS/CFT realization of holography is therefore not absolutely essential. Even better, its generalization to TGD involves no fictitious boundaries and is free of problems posed by closed time-like geodesics.

Perfect tensors and tensor networks realized in terms of magnetic body carrying negentropically entangled dark matter

Preskill et al suggest a representation of holography in terms of tensor networks associated with the tessellations of hyperbolic space and utilizing perfect tensors defining what I call negentropic entanglement. Also the Minkowski space light-cone has hyperbolic space as its proper time=constant section (light-cone proper time constant section in TGD) so that the model for the tensor network realization of holography cannot be distinguished from the TGD variant, which does not need AdS at all.

The interpretational problem is that one also obtains states in which the interior local states are non-trivial and are mapped by holography to boundary states: holography in the standard sense should exclude these states. In TGD this problem disappears since the macroscopic boundary surface is replaced with what I call a wormhole throat (something different from a GRT wormhole throat, whose TGD counterpart is a magnetic flux tube), which can also be microscopic.

Physics of living matter as physics of condensed dark matter at magnetic bodies?

A very attractive idea is that in living matter magnetic flux tube networks defining quantum computational networks provide a realization of tensor networks realizing also the holographic error correction mechanism: negentropic entanglement - perfect tensors - would be the key element! As I have proposed, these flux tube networks would define a kind of central nervous system making it possible for living matter to consciously experience its biological body using the magnetic body.

These networks would also give rise to the counterpart of condensed matter physics for dark matter at the level of the magnetic body: the replacement of lattices based on subgroups of the translation group with an infinite number of tessellations means that this analog of condensed matter physics describes quantum complexity.

I am just a novice in the field of quantum error correction (and will probably remain such), but from experience I know that the best manner to learn something new is to tell the story in your own words. Of course, I am not at all sure whether this story helps anyone to grasp the new ideas. In any case, if one has a new vision about the physical world, the situation becomes considerably easier since creative elements enter the re-telling of the story. Here I ask how these new ideas could be realized in the TGD Universe, bringing in new features related to the new views about space-time, quantum theory, and living matter and consciousness in relation to quantum physics.

For the details see the new chapter Holography and Quantum Error Correcting Codes: TGD View or the article with the same title

For a summary of earlier postings see Links to the latest progress in TGD.

Thursday, March 10, 2016

New evidence for second generation weak bosons predicted by TGD

Evidence for the breaking of lepton universality has already earlier been found in the decays of the beauty meson B consisting of a b quark and a d quark. The breaking of lepton universality means that the lepton generations (electron, muon, tau and the corresponding neutrinos) are not identical with respect to weak interactions. Indeed, there were indications that the decays do not occur with the same rate to electron, muon, and tau pairs (there are small corrections breaking the universality due to different lepton masses). A possible reason is that there exist new weak bosons whose couplings are not universal. What is known as the Z' boson would make itself visible in the decays of B.

Now additional evidence for the existence of this kind of weak boson has emerged. If I understood correctly, the average angle between the decay products of the B meson is not quite what it is predicted to be. This is interpreted as an indication that a Z' type boson appears as an intermediate state in the decay.

What does TGD say? TGD predicts three gauge boson families, and the new boson families have couplings to fermions which are not universal (see the earlier posting). There is indeed evidence for the Higgs of the second family as a bump predicted to have a mass 32 times higher than that of the ordinary Higgs, which makes rather precisely 4 TeV.

This coupling could explain the breaking of universality in the decays of the B meson. In TGD Z' would correspond to the second generation Z boson. The p-adic length scale hypothesis plus the assumption that the new Z boson corresponds to the Gaussian Mersenne MG,79=(1+i)79-1 predicts that its mass is by a factor 32 higher than the mass of the ordinary Z boson, making 2.9 TeV for the 91 GeV mass of Z (see the arithmetic sketched below). If I remember correctly, there are indications for a bump at this mass value. A leptoquark made of a right-handed neutrino and a quark is a less plausible explanation but is predicted by TGD as a squark.

The breaking of the universality is characterized by the charge matrices of the weak bosons for the dynamical SU(3) assignable to family replication. The first generation corresponds to the unit matrix whereas the higher generation charge matrices can be expressed as orthogonal combinations of the isospin and hypercharge matrices I3 and Y. I3 distinguishes between tau and the lower generations (third experiment) but not between the lowest two generations. There is however evidence for this (the first two experiments above). Therefore a mixing of I3 and Y should occur.
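
A minimal sketch (my own illustration, assuming numpy) of what such charge matrices look like: diagonal 3×3 matrices acting on the three generations, with the unit matrix for the first boson generation and two mutually trace-orthogonal mixtures of I3 and Y for the higher ones. The explicit diagonal entries and the ordering of the generations are illustrative assumptions on my part, not taken from the text:

    import numpy as np

    I3 = np.diag([1.0, -1.0, 0.0]) / 2.0      # illustrative diagonal representatives
    Y  = np.diag([1.0, 1.0, -2.0]) / 3.0

    # normalize so that rotations of the pair (I3, Y) stay trace-orthogonal
    I3n = I3 / np.sqrt(np.trace(I3 @ I3))
    Yn  = Y / np.sqrt(np.trace(Y @ Y))

    def charge_matrices(theta):
        """Second and third generation charge matrices as a mixing of I3 and Y by angle theta."""
        Q2 = np.cos(theta) * I3n + np.sin(theta) * Yn
        Q3 = -np.sin(theta) * I3n + np.cos(theta) * Yn
        return Q2, Q3

    Q2, Q3 = charge_matrices(theta=0.3)
    # The unit matrix, Q2 and Q3 are mutually trace-orthogonal, and Q2, Q3 have
    # generation dependent diagonal entries: non-universal couplings.
    for A, B in [(np.eye(3), Q2), (np.eye(3), Q3), (Q2, Q3)]:
        assert abs(np.trace(A @ B)) < 1e-12
    print(np.round(np.diag(Q2), 3), np.round(np.diag(Q3), 3))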

Does the breaking of universality occur also for color interactions? If so, the predicted M89 and MG,79 hadron physics would break universality in the sense that the couplings of their gluons to quark generations would not be universal. This also forces one to consider the possibility that there are no new quark families associated with these hadron physics but only new gluons with couplings breaking universality. This looks somewhat boring at first.

On the other hand, there exists evidence for bumps at the masses of M89 hadron physics predicted by scaling to be 512 times heavier than the mesons of the ordinary M107 hadron physics (see the earlier posting). According to the prevailing wisdom coming from QCD, the meson and hadron masses are however known to be mostly due to gluonic energy, and current quarks give only a minor contribution. In TGD one would say that the color magnetic body gives most of the meson mass. Thus the hypothesis would make sense. One can also talk about constituent quark masses if one includes the mass of the corresponding portion of the color magnetic body in the quark mass. These masses are much higher than current quark masses and it would make sense to speak about constituent quarks for M89 hadron physics.
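
The scalings quoted above follow from the p-adic length scale hypothesis: the mass scale is proportional to 2-k/2, so moving from label k1 to a smaller label k2 multiplies the mass by 2(k1-k2)/2. A minimal arithmetic check (my own sketch, using the 91 GeV Z mass quoted above and the standard 125 GeV Higgs mass):

    def padic_scaling(k_from, k_to):
        """Mass scaling factor when the p-adic length scale label changes from k_from to k_to."""
        return 2 ** ((k_from - k_to) / 2)

    print(padic_scaling(89, 79))             # 32.0
    print(0.091 * padic_scaling(89, 79))     # ~2.9 TeV: second generation Z from the 91 GeV Z
    print(0.125 * padic_scaling(89, 79))     # ~4.0 TeV: second generation Higgs from the 125 GeV Higgs
    print(padic_scaling(107, 89))            # 512.0: M89 mesons from the M107 mesons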

For background see the chapter New particle physics predicted by TGD: part I.

For a summary of earlier postings see Links to the latest progress in TGD.

Thursday, March 03, 2016

Twistor googly problem transforms from a curse to blessing in TGD framework

There was a nice story with the title "Michael Atiyah’s Imaginative State of Mind" about the mathematician Michael Atiyah in Quanta Magazine. The works of Atiyah have contributed a lot to the development of theoretical physics. What was pleasant to hear was that Atiyah belongs to those scientists who do not care what others think. As he tells, he can afford this since he has got all possible prizes. This is consoling and encouraging even for those who have not cared what others think and for this reason have not earned any prizes. Nor even a single coin from what they have been busily doing their whole lifetime!

In the beginning of the story the "twistor googly problem" was mentioned. I had to refresh my understanding of the googly problem. In the twistorial description the modes of massless fields (rather than the entire massless fields) in space-time are lifted to modes in the 6-D twistor space, and the dynamics reduces to holomorphy. The analog of this takes place also in string models by conformal invariance and in TGD by its extension.

One however encounters the googly problem: one can have a twistorial description for circular polarizations with well-defined helicity +1/-1 but not for general polarization states - say linear polarizations, which are superpositions of circular polarizations. This reflects itself in the construction of twistorial amplitudes in the twistor Grassmann program for gauge fields, but rather implicitly: the amplitudes are constructed only for fixed helicity states of the scattered particles. For gravitons the situation gets really bad because of non-linearity.

Mathematically the most elegant solution would be to have only +1 or -1 helicity but not their superpositions, implying very strong parity breaking and chirality selection. Parity breaking occurs in physics but is very small, and linear polarizations are certainly possible! The discussion of Penrose with Atiyah has inspired a possible solution to the problem known as "palatial twistor theory". Unfortunately, the article is behind a paywall too high for me so that I cannot say anything about it.

What happens to the googly problem in the TGD framework? There is twistorialization at both the space-time level and the imbedding space level.

  1. One replaces space-time with a 4-surface in H=M4×CP2 and lifts this 4-surface to its 6-D twistor space represented as a 6-surface in the 12-D twistor space T(H)=T(M4)×T(CP2). The twistor space has a Kähler structure only for M4 and CP2, so that TGD is unique. This Kähler structure is needed to lift the dynamics of Kähler action to the twistor context, and the lift leads to a dramatic increase in the understanding of TGD: in particular, Planck length and the cosmological constant with correct sign emerge automatically as dimensional constants besides the CP2 size.

  2. Twistorialization at the imbedding space level means that the spinor modes in H representing ground states of super-symplectic representations are lifted to spinor modes in T(H). M4 chirality is in the TGD framework replaced with H-chirality, and the two chiralities correspond to quarks and leptons. But one cannot superpose quarks and leptons! The "googly problem" is just what the superselection rule preventing the superposition of quarks and leptons requires in TGD!

One can look at this in more detail.
  1. Chiral invariance makes it possible for the modes of massless fields to have definite chirality: these modes correspond to holomorphic or antiholomorphic amplitudes in twistor space, and holomorphy (antiholomorphy is holomorphy with respect to the conjugates of complex coordinates) does not allow their superposition, so that massless bosons should have well-defined helicities, in conflict with experimental facts. A second basic problem of conformally invariant field theories and of the twistor approach relates to the fact that physical particles are massive in the 4-D sense. Masslessness in the 4-D sense also implies infrared divergences for the scattering amplitudes. A physically natural cutoff is required but would break conformal symmetry.

  2. The solution of these problems is masslessness in the 8-D sense allowing particles to be massive in the 4-D sense. Fermions have a well-defined 8-D chirality - they are either quarks or leptons depending on the sign of the chirality. 8-D spinors are constructible as superpositions of tensor products of M4 spinors and of CP2 spinors, both having well-defined chirality, so that the tensor product has chiralities (ε1, ε2), εi=+/- 1, i=1,2. H-chirality equals ε=ε1ε2. For quarks one has ε=1 (a convention) and for leptons ε=-1. For quark states massless in the M4 sense one has either (ε1, ε2) = (1,1) or (ε1, ε2) = (-1,-1), and for massive states a superposition of these. For leptons one has either (ε1, ε2) = (1,-1) or (ε1, ε2) = (-1,1) in the massless case and a superposition of these in the massive case (this bookkeeping is enumerated in a small sketch after the list below).


  3. The twistorial lift to T(M4)× T(CP2) of the ground states of super-symplectic representations, represented in terms of tensor products formed from H-spinor modes, involves only quark and lepton type spinor modes with well-defined H-chirality. Superpositions of amplitudes in which different M4 helicities appear are allowed, but M4 chirality is always paired with a completely correlated CP2 chirality to give either ε=1 or ε=-1. One never has a superposition of different chiralities in the M4 or CP2 tensor factor alone. I see no reason forbidding this kind of mixing of holomorphicities, and this is enough to avoid the googly problem. Linear polarizations and massive states represent states with entanglement between M4 and CP2 degrees of freedom. For massless and circularly polarized states the entanglement is absent.

  4. This has interesting implications for massivation. The Higgs field cannot be a scalar in the 8-D sense since this would make particles massive in the 8-D sense and the separate conservation of B and L would be lost. The theory would also contain a dimensional coupling. The TGD counterpart of the Higgs boson is actually a CP2 vector, and one can say that gauge bosons and Higgs combine to form an 8-D vector. This correctly predicts the quantum numbers of the Higgs. Ordinary massivation by a constant vacuum expectation value of the vector Higgs is not an attractive idea since no covariantly constant CP2 vector field exists, so that Higgsy massivation is not promising except at the QFT limit of TGD formulated in M4. p-Adic thermodynamics gives rise to 4-D massivation but keeps particles massless in the 8-D sense. It also leads to powerful and correct predictions in terms of the p-adic length scale hypothesis.
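
A minimal enumeration (my own sketch, assuming Python) of the H-chirality bookkeeping described in item 2 above: the M4 and CP2 chiralities ε1, ε2 = +/-1 combine to the H-chirality ε=ε1ε2, with ε=1 labelling quarks and ε=-1 leptons:

    from itertools import product

    for eps1, eps2 in product((+1, -1), repeat=2):
        eps = eps1 * eps2
        kind = "quark" if eps == +1 else "lepton"
        print(f"(eps1, eps2) = ({eps1:+d}, {eps2:+d})  ->  H-chirality {eps:+d}  ({kind})")

    # Massive 4-D states are superpositions within a fixed H-chirality sector:
    # quarks mix (+1, +1) with (-1, -1) and leptons mix (+1, -1) with (-1, +1);
    # superposing different H-chiralities (quarks with leptons) is forbidden
    # by the superselection rule, which is the TGD resolution of the googly problem.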

Addition: An anonymous reader gave me a link to the paper of Penrose, and this inspired further, more detailed considerations of the googly problem.
  1. After the first reading I must say that I could not understand how the proposed elimination of the conjugate twistor by quantization of twistors solves the googly problem, which is that both helicities are present (twistor Z and its conjugate) in linearly polarized classical modes so that holomorphy is broken classically.

  2. I am also very skeptical about quantizing either space-time coordinates or twistor space coordinates. To me quantization is natural only for linear objects like spinors. For bosonic objects one must go to a higher abstraction level and replace superpositions in space-time with superpositions in field space. The construction of the "World of Classical Worlds" (WCW) in TGD means just this.

  3. One could however think that circular polarizations are fundamental and that quantal linear combinations of the states carrying circularly polarized modes give rise to linear and elliptic polarizations. Linear combination would be possible only at the level of the field space (WCW in TGD), not for classical fields in space-time. If so, then the elimination of the conjugate Z by quantization suggested by Penrose would work.

  4. Unfortunately, Maxwell's equations allow linear polarizations classically! In order to achieve classical-quantum consistency, one should modify the classical Maxwell's equations somehow so that linear polarizations are not possible.
    The googly problem is still there!

What about TGD?
  1. Massless extremals representing massless modes are very "quantal": they cannot be superposed classically unless both the momentum and polarization directions for them (they can depend on the space-time point) are exactly parallel. An optimist would guess that the local classical polarizations are circular. No, they are linear! Superposition of classical linear polarizations at the level of WCW can give rise to local linear but not to local circular polarization! Something more is needed.

  2. The only sensible conclusion is that only gauge boson quanta (not classical modes), represented as pairs of a fundamental fermion and an antifermion in the TGD framework, can have circular polarization! And indeed, massless bosons - in fact, all elementary particles - are constructed from fundamental fermions, and they allow only two M4, CP2 and M4× CP2 helicities/chiralities analogous to circular polarizations. B and L conservation would transform the googly problem into a superselection rule as already described.

To sum up, the extreme non-linearity of Kähler action, the representability of all elementary particles in terms of fundamental fermions and antifermions, and the generalization of conserved M4 chirality to conservation of H-chirality would all be essential for solving the googly problem in the TGD framework.

For background see the chapter From Principles to Diagrams or the article From Principles to Diagrams.

For a summary of earlier postings see Links to the latest progress in TGD.

Wednesday, March 02, 2016

Chi Energy - master gets animals to sleep

In Thinking Allowed there was an interesting link from Jeff Hall to a video with the title Chi Energy - master gets animals to sleep. The video was very impressive and I recommend seeing it. Below I propose an explanation for the feats of the master.

I have constructed a theory of remote mental interactions but have always said that I do not believe in them - I just take their possibility very seriously. To be honest, the only reason for this attitude is that they emerge naturally from the TGD inspired theory of consciousness. This video made me a believer. I know that the skeptic "knows" that the video is a hoax and demands a 10 sigma statistical proof that every chi master in every corner of the Universe can put animals to sleep under controlled laboratory conditions by weaving his hands. It does not matter: we can laugh together at my gullibility if this helps the skeptic to avoid despair in his intellectual isolation.

We had a long discussion about the video, and Ulla noticed the similarity with hypnosis: even the word "hypnosis" originally means some kind of sleep-like state. In the TGD framework hypnosis could be seen as a particular example of remote mental interactions. Simplifying: the hypnotizer would in some sense hijack some part of the brain of the subject by quantum entangling with it so that it becomes part of the hypnotizer and obeys his commands. Note that the social explanation of hypnosis as the desire of the subject to please the hypnotizer does not explain what happens to the animals.

In the discussion consciousness was of course mentioned and consciousness was compared to a field. As a philosophically oriented physicist I get worried when one says "consciousness is a field" or something like that. I would prefer to speak about field patterns as correlates for the contents of consciousness. To me consciousness itself is an independent form of existence not reducing to a property of a physical system as the materialist believes. This looks like pedantry but becomes absolutely crucial if one really wants to understand consciousness. Real progress in science is mostly getting rid of sloppy language implying sloppy thinking.

I have explained the basic ideas of the TGD inspired theory of consciousness (call it TTC for short) so many times, and I am afraid that most readers have not got the message. I think that independently rediscovering TTC is the only manner to realize what I am trying to say. Therefore only a few paragraphs.

One needs a new ontology - a vision about what exists. This ontology is neither materialistic nor dualistic: consciousness is not a property of a physical state, as the "-ness" would suggest, but resides in the nowhere-nowhen-land between two quantum states, themselves replaced with analogs of quantum evolutions of the Schrödinger equation. I call the new ontology Zero Energy Ontology (ZEO), and it leads to a new view about quantum measurement theory and state function reduction giving a theory of consciousness as a by-product by transforming the observer from an outsider to the Universe into a part of quantum physics. A conscious entity is the outcome of the Zeno effect - a sequence of state function reductions which would not change the state at all in the standard ontology but gives rise to the experienced flow of time in ZEO.

A lot of unexpected predictions follow. To mention only a few: exotic phenomena such as time reversed consciousness, the re-incarnation of a conscious entity at a different time after biological death, and the predicted hierarchy of conscious entities with mental images identifiable as sub-selves - conscious entities. Also a detailed view about quantum biology and about remote mental interactions emerges.

Quantum biology involves a generalization of both classical physics and quantum physics.

  1. Classical physics is generalized by replacing space-time with space-time surfaces, bringing in notions like many-sheeted space-time, magnetic flux quanta/tubes, field body and topological light rays essential for understanding living matter. The magnetic body (MB) becomes what might be called an intentional agent. Our MB is the "real us" using our biological body (BB) as a motor instrument and sensory receptor. EEG and its scaled variants mediate sensory information from neuronal/cell membranes to the parts of the magnetic body having an onion-like structure, and control commands from MB to the genome initiating gene expressions and possibly other hitherto unknown genome related functions such as topological quantum computation and communications with dark photons which can decay to bio-photons.

    Magnetic flux tubes accompany and are space-time correlates of entanglement: note that also superstringers have ended up with this idea but talk about wormholes instead of flux tubes.

    Concerning remote mental interactions, the crucial difference from Maxwell's linear and relatively simple theory is that flux tubes make possible precisely targeted communications such that the signal does not weaken with distance. This is like replacing a radio station with something sending laser signals along a cable: replacing mass communication like radio broadcasts with email. The signals - I call them topological light rays - are analogous to laser light beams travelling along flux tubes: also their existence distinguishes TGD from Maxwell's theory, where light signals travel in all directions and weaken like 1/r2.

  2. The generalization of quantum physics involves the hierarchy of Planck constants coming as multiples of the ordinary Planck constant and identified in terms of dark matter, which becomes a key player in living systems. Scaling up the Planck constant scales up quantum lengths and gives rise to macroscopic quantum coherence, which is the key property of living matter. p-Adic physics and the fusion of real physics (correlates of sensory experience) and the various p-adic physics (correlates of cognition and imagination) are an essential element of the theory too.

Consider now what remote mental interactions might be.
  1. Attention is obviously an essential element. The master attends intensively. Magnetic flux tubes are correlates for attention. When I attend to something, flux tubes connecting some part of me to this something are formed. This something could be a mental image perhaps localizable to my brain, or an object of the external world - say my cat. Or the animals in the amazing video, which motivated the writing of this posting. Magnetic flux tubes are like tentacles studying the environment, and when they find a tentacle of another BB, reconnection to a bridge connecting the biological bodies can happen if the magnetic field strengths are nearly the same. This implies that the cyclotron frequencies are the same so that the reconnection involves resonance (a small numerical aside on cyclotron frequencies follows after this list).

    This is a good reason to identify the prerequisites/correlates for remote mental interactions as magnetic flux tubes, which are TGD counterparts of Maxwellian magnetic fields but differ from them since they are topologically quantized.

  2. Remote mental interactions are not anything exotic in this world view: the communications from my BB to my MB and the control of my BB by my MB rely on remote mental interactions. What we are used to calling remote mental interactions is the same phenomenon except that the target is not my BB but something else: say a patient in remote healing or a computer in experiments testing whether intention can affect a random number generator.
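
A small numerical aside (my own, assuming the textbook cyclotron frequency formula f = qB/(2πm)): for a given ion the cyclotron frequency is proportional to the field strength B, so two flux tubes with nearly equal B have nearly equal cyclotron frequencies and can resonate. The specific ion and field value below are purely illustrative:

    import math

    def cyclotron_frequency(charge, mass, B):
        """Cyclotron frequency in Hz for charge (C), mass (kg) and field strength B (T)."""
        return charge * B / (2 * math.pi * mass)

    e = 1.602e-19                     # elementary charge (C)
    m_Ca = 40 * 1.661e-27             # mass of a Ca(2+) ion, roughly 40 atomic mass units (kg)
    B = 2e-5                          # illustrative field strength, of the order of the Earth's field (T)
    print(cyclotron_frequency(2 * e, m_Ca, B))    # about 15 Hz for these illustrative values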

What might happen in the video?
  1. What could happen as the master in the video weaves his hands? The same as in hypnosis, which is also a remote mental interaction. The magnetic flux tubes of a part of the hypnotizer's MB reconnect with those of a part of the subject's MB, fusing the two conscious entities into a single one with the chi master serving as the boss of the unit formed in this manner. Both supra currents and analogs of laser light signals can propagate along the bridges thus formed. This is the same effect as the fusion of mental images - subselves - producing stereo vision. Fusion can occur also for mental images in different brains: our consciousness is not as private as we think - be cautious with your thoughts;-). Your brain children are not always only your brain children!

  2. What makes a fellow who just weaves his hands "superhuman" - as the video says? How can the movement of his hands have such a magic effect? It cannot. MB acting as an intentional agent is needed. The skills of the master in using his MB give him his magic looking powers - he is a master in magnetic gymnastics:-). Yoga trains your BB, meditation trains your MB. Using the tentacles emanating from his hands the master can get a contact even to the MBs of members of different species, make them part of his own MB, and give commands to them. As the master weaves his hands he helps the flux tubes to form reconnections with the MBs of the subject animals. I wonder whether the master can "see" the flux tubes of foreign magnetic bodies (not necessarily consciously at his level of the self hierarchy). This would make his task much easier.

For a summary of earlier postings see Links to the latest progress in TGD.