https://matpitka.blogspot.com/2008/04/topological-geometrodynamics-overall.html

Thursday, April 24, 2008

Topological Geometrodynamics: an Overall View

The participation in a conference inspired me to write a brief overall view about the recent situation in TGD. This was badly needed since the last year alone has brought many important steps of progress. I paste here only the abstract of the article.

A brief summary of various competing visions about the basic principles of quantum Topological Geometrodynamics (TGD) and about tensions between them is given with emphasis on the recent developments.

  1. Physics as the classical spinor field geometry of the world of classical worlds is the oldest and best developed approach to TGD. It means a generalization of Einstein's program of geometrizing classical physics so that it applies to the whole of quantum physics.

  2. The parton level formulation of quantum TGD as an almost topological quantum field theory using light-like 3-surfaces as fundamental objects allows a detailed understanding of super-conformal symmetries generalizing those of superstring models. A category theoretical interpretation of the M-matrix as a functor is possible. This picture has tight connections to the physics-as-configuration-space-geometry approach and implies it.

  3. Physics as generalized number theory represents a third vision about TGD. Number theoretic universality, meaning a fusion of real and p-adic physics into a single coherent whole, forces a formulation in terms of so-called number theoretic braids. The symmetries of the classical number fields strongly suggest an interpretation in terms of standard model symmetries and a number theoretic interpretation of the imbedding space M4× CP2. The associativity condition would define the laws of classical and quantum TGD. The notion of infinite prime defines a third thread in the braid of number theoretical ideas, and it is now possible to give a surprising realization of the number theoretic Brahman=Atman identity (algebraic holography) based on a generalization of the number concept allowing an infinite number of real units representable as ratios of infinite integers, which have an interpretation as representations of physical states of a supersymmetric arithmetic QFT. The infinitely rich number theoretic anatomy of the points of number theoretic braids makes it possible to represent the information about the quantum state of the Universe which remains below the measurement resolution.

  4. The idea about a hierarchy of Planck constants was inspired by certain empirical facts. The hierarchy leads to a generalization of the notion of imbedding space, which emerges naturally from the requirement that the choice of quantization axes has a geometric correlate also at the level of the imbedding space. The physical implication is the identification of dark matter in terms of a hierarchy of macroscopically quantum coherent phases with quantized values of Planck constant having arbitrarily large values and playing a key role not only in biology but also in the astrophysics and cosmology of the TGD Universe. The hierarchy of Planck constants can be seen as necessary for the realization of quantum criticality. The generalization of the imbedding space is also essential for the construction of the Kähler function of the configuration space.

  5. The fifth vision about quantum TGD is that the mere finiteness of measurement resolution fixes the scattering matrix of quantum TGD. In zero energy ontology the S-matrix must be generalized to an M-matrix identified as the time-like entanglement coefficients between the positive and negative energy parts of zero energy states. The M-matrix can be regarded as a "complex square root" of the density matrix, expressible as the product of a real square root of the density matrix and a unitary S-matrix: thermodynamics becomes part of quantum theory. Hyper-finite factors of type II1 (HFFs) emerge naturally through the Clifford algebra of the "world of classical worlds" and allow a formulation of quantum measurement theory with a finite measurement resolution. The notion of finite measurement resolution, expressed in terms of an inclusion of HFFs with the included algebra defining the measurement resolution, leads to an identification of the M-matrix in terms of the Connes tensor product, and a simple argument shows that the M-matrix is unique apart from the presence of the square root of the density matrix needed by thermodynamics. Coupling constant evolution corresponds to a hierarchy of measurement resolutions, and the p-adic coupling constant hypothesis follows as a consequence, with an additional prediction assigning to particles a time scale characterizing the temporal distance between the positive and negative energy parts of the corresponding zero energy state: for the electron this time scale is 0.1 seconds, the fundamental biorhythm. Thus zero energy ontology implies a direct connection between elementary particle physics and biology.

  6. Consciousness theory, interpreted as a generalization of quantum measurement theory, is the most plausible vision about quantum TGD and has already shown its power by bringing into the theory notions which cannot be imagined in the standard conceptual framework of quantum physics.

7 comments:

Anonymous said...

Hi Matti, I hope you are doing well. I pop in here only randomly.

From the first time I have sensed attractive elements in your thinking, but I have a hard time extracting what the first principles would be from my point of view, and your papers are massive.

Your note

"The hypothesis that the mere finiteness of measurement resolution could determine the laws of quantum physics [C3] completely belongs to the category
of not at all obvious first principles."

Then you combine that with something you call zero energy ontology. I don't understand what you are saying.

Is there a simple definition of this concept somewhere, without me having to read tons of pages? I mean a really first-principles definition, not assuming any of the standard notions of space, time or energy?

I am interested because this sounds somewhat similar to some ponderings I've made.

I consider a finite information capacity and a finite capacity of the communication channel (which means observation). However, I still picture these finite capacities as subject to change. Both the channel capacity and the storage can grow or shrink, so it somehow sets the bounds for a microstructure's size and its interface to the environment.

I like the association that this dictates the laws of physics, though. The information capacity limits the number of possible distinguishable interactions, and the communication channel must constrain the part of the transformations that are indeterministic?

I'm trying to understand what happens when the complexity increases. I suspect this relates one way or the other to the hierarchy thinking. But I wonder if it's just the words that are common. As the complexity increases, one would statistically expect some kind of spontaneous transformations or branchings to occur. This is exactly how I imagine it. But the exact expectation forms aren't clear. Do you somehow have this explicitly worked out?

/Fredrik

Matti Pitkänen said...

Hi,

thank you for interesting and challenging questions.

Concerning zero energy ontology, there is a lot of context involved (three decades of hard work!), and it is probably impossible to give a short comprehensive explanation without this context. I will however try.

a) The motivating problem behind ZEO is posed by the fact that for the Kahler action defining the classical theory and the Kahler function giving the Kahler geometry of the "world of classical worlds", Robertson-Walker cosmologies have vanishing currents of inertial 4-momentum but non-vanishing and non-conserved currents of gravitational 4-momentum. Does the Equivalence Principle really fail in TGD? This is the question.

[Already now you could ask "What do you mean by Kahler geometry, by the Kahler function, by the world of classical worlds, ...?": you begin to see ;-)]

b) The replacement of the Kahler action with the curvature scalar would of course realize the Equivalence Principle exactly, but gravitational four-momentum would be conserved and standard Robertson-Walker cosmologies would not solve the field equations. Also the vacuum degeneracy crucial for the quantum classical correspondence and quantum criticality would be lost (the failure of complete determinism of space-time dynamics serves as the space-time correlate for quantal non-determinism, as a generalization of Bohr orbitology).
[You can probably continue the list of questions "What ...?" ;-)]

c) I believe that zero energy ontology resolves the problem. ZEO means that quantum states are identified as physical *events*: pairs of positive and negative energy states identified as the initial and final states of a particle reaction, for instance. Every physical state in this sense is creatable from the vacuum: this is the deep principle which resolves troublesome philosophical questions such as "What was the initial state of the Universe?", "What are the total values of conserved quantities such as fermion number and em charge?", etc.

d) A zero energy state has a space-time correlate: a space-time sheet of finite time duration T whose ends lie at the boundaries of a causal diamond formed by the future and past directed light-cones of M^4 (× CP_2); a small geometric sketch follows after this list. T assigns a new macroscopic time scale to each elementary particle: 0.1 seconds in the case of the electron (10 Hz biorhythm).

[Also now several questions emerge: "What is M^4xCP_2?", etc.]

e) In time scales shorter than T a zero energy state looks like a positive or negative energy state. In longer scales it looks like a quantum fluctuation.

f) The vanishing of the inertial energy density for RW cosmologies would mean that one cannot perceive the positive energy part of a zero energy state in the extremely long time resolution used in cosmological observations: everything would look like a quantum fluctuation.

Gravitational 4-momentum could correspond to the four-momentum of the positive energy part of the state. In this sense one would have the Equivalence Principle at least for states created from elementary particles.
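[Editorial illustration of item d) above: a minimal geometric sketch of a causal diamond, assuming only that it is the intersection of the future light cone of a lower tip and the past light cone of an upper tip in M^4, with temporal separation T. Units with c = 1 and the sample points are my own assumptions, not taken from the post.]

import numpy as np

def in_future_cone(x, apex):
    # x lies in (or on) the future light cone of apex: dt >= 0 and dt^2 - |dx|^2 >= 0
    d = np.asarray(x, dtype=float) - np.asarray(apex, dtype=float)
    return d[0] >= 0 and d[0]**2 - np.dot(d[1:], d[1:]) >= 0

def in_causal_diamond(x, a, b):
    # x lies in the causal diamond with lower tip a and upper tip b
    return in_future_cone(x, a) and in_future_cone(b, x)

T = 0.1                               # duration of the diamond (e.g. the 0.1 s assigned to the electron)
a = [0.0, 0.0, 0.0, 0.0]              # lower tip (t, x, y, z)
b = [T,   0.0, 0.0, 0.0]              # upper tip
print(in_causal_diamond([0.05, 0.01, 0.0, 0.0], a, b))  # True: inside the diamond
print(in_causal_diamond([0.05, 0.20, 0.0, 0.0], a, b))  # False: outside the light cones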

The statement that M-matrix is dictated by finite measurement resolution deserves a couple of comments.

a) In ZEO one replaces the S-matrix with the M-matrix giving the time-like entanglement coefficients between the positive and negative energy parts of the state. The M-matrix is identified as a "square root" of the density matrix and is the product of the positive square root of the density matrix and a unitary matrix identifiable as the S-matrix (a small numerical sketch of this factorization follows after this list). Quantum physics becomes square root thermodynamics.

b) The M-matrix, or actually a hierarchy of them, one for each measurement resolution, is determined by the finite measurement resolution realized in terms of an inclusion of hyperfinite factors of type II_1: the included algebra N subset M defines the resolution, which means that its action generates a state which cannot be distinguished from the original one. Complex rays of the Hilbert space are replaced by N-rays.

c) The laws of physics are dictated in the sense that this requirement implies that the M-matrix is defined by the Connes tensor product, which is essentially unique for an HFF of type II_1. For a direct sum of HFFs one obtains a non-trivial density matrix and thermodynamics.
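[Editorial illustration of the factorization in a) above: a tiny numerical sketch of M = sqrt(rho) S with S unitary, so that M M^dagger = rho. The dimension and the random matrices are arbitrary assumptions of mine; this is not code from TGD itself.]

import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(0)
n = 4

# A random density matrix rho: Hermitian, positive, unit trace.
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
rho = A @ A.conj().T
rho /= np.trace(rho).real

# A random unitary S (via QR decomposition), standing in for the S-matrix.
S, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

# The "complex square root of the density matrix": M = sqrt(rho) S.
M = sqrtm(rho) @ S

# Since S is unitary, M M^dagger reproduces rho, as stated in the text above.
print(np.allclose(M @ M.conj().T, rho))  # True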

Anonymous said...

Thanks for your additional comments. That sounds like a lot of context indeed :) I need to read that a few times and see if I can figure out what it means. From my last posts here I remember that you start out differently, so I find it a little hard to locate the point in your thinking that would correspond to my starting points.

I'll be back later. Thanks for taking the time.

/Fredrik

Anonymous said...

Hello, some brief basic questions if you have the time to comment. You have a lot of mathematical constructs, and I wonder where the context is. I can find the definitions of the terms, but I guess the magic is in the context where you use them.

a1) Is this measure of the geometry of classical worlds objective?

a2) Can a real inside observer, by means of real processes, determine/guess this geometry? How? Or is this not observable? (By observable I don't mean it in the formal sense of QM; I mean that it can be guessed by a real observer, from subjective information available to her.)

(I.e., do you use a conjectured objectivity to explain subjectivity, and take consistency as indirect evidence for the correctness of the conjecture?
...
or are you trying to explain emergent collective objectivity from a given subjectivity by means of conjectured principles of interaction?
- I mean to probe what the starting point of your reasoning is. I think I need to understand your way of seeing the task before I can see your context.)

When you say you think the ZEO "solves the problem", what would remain to gain more confidence in it, or even prove it? Experimental testing, more theoretical work or both?

/Fredrik

Matti Pitkänen said...

Thank you for questions. I try to answer them.


Q: a1) Is this measure of the geometry of classical worlds objective?

A: Instead of objectivity (I am not quite sure what you mean by it) I would speak about the uniqueness forced by mere mathematical existence. The geometry of the world of classical worlds is Kahler geometry, as follows from the requirement that the spinor fields in this space provide a geometrization of quantum theory.

The existence of this geometry requires huge symmetries: otherwise the Riemann connection, etc., does not exist. In the case of loop spaces the Kahler geometry is unique and possesses Kac-Moody symmetries.

Now some 3-D generalization of this is involved. Assuming for a moment that the basic objects are 3-surfaces in a higher-dimensional space-time, one can conclude that by 4-D general coordinate invariance the definition of the Kahler geometry must assign to a given 3-surface a more or less unique space-time surface: a kind of Bohr orbit.

By general coordinate invariance one can restrict the consideration to light-like 3-surfaces, and this leads to a construction of the exponent of the Kahler function defining the configuration space measure as a Dirac determinant: light-likeness implies generalized super-conformal symmetries and allows only the Chern-Simons action and its fermionic counterpart.

The well-motivated guess is that the resulting exponent is just the exponent of the Kahler action for a preferred extremal: this by the completely unique properties of the Kahler action. The only parameter is the Kahler coupling strength, which is analogous to temperature: its values are determined by quantum criticality. Hence one can say that the geometrization program fixes the theory highly uniquely, and the special number theoretical and physical properties of M^4xCP_2 suggest that it is a unique choice. In particular, 4-D Minkowski space is necessary for the super-canonical conformal symmetries, and the 4-D nature of space-time for the superconformal invariances associated with light-like 3-surfaces.

One basic principle is finiteness: in an infinite-dimensional context everything tends to diverge. The cancellation of infinities has led to the discovery of supersymmetries and superstring models, and also of TGD.

Q: a2) Can a real inside observer, by means of real processes determine/ guess this geometry? how? Or is this not observable? (with observable I don't mean in the formal sense of QM, I mean that it can be guessed by a real observer, by subjective information available to
her)

A: It depends on what one means by these real processes. Those whose existence we know of and which we can carry out in the laboratory? Or the thinking process itself, which we do not understand very well because we do not possess a theory of consciousness?

I guess that what mathematicians are doing is a kind of higher level perception using the extremely powerful constraints of mathematical existence and internal consistency.

Guessing fundamental principles is certainly a very high level process, which I do not believe to be understandable in terms of what happens in some neural network. As a matter of fact, cognition - if based on p-adic physics - is literally a cosmic phenomenon, not localizable to any particular brain.

The hope is that general principles such as quantum classical correspondence, the uniqueness of infinite-dimensional Kahler geometric existence, the postulate that any quantum state is creatable from the vacuum (zero energy ontology), general coordinate invariance, dynamics from the associativity condition in the number theoretical approach, the M-matrix dictated by the finiteness of measurement resolution, quantum TGD as an almost topological quantum field theory, ... fix the theory uniquely.

The choice M^4xCP_2 is fixed uniquely also by the known quantum numbers characterizing elementary particles, so that in this sense the geometry of the world of classical worlds is observable. If one accepts the number theoretic Brahman=Atman identity, supported by the generalization of the number concept, the imbedding space with points endowed with an additional number theoretical anatomy not visible at the level of the real norm allows one to realize configuration space spinor fields in the space M^4xCP_2 with points possessing this number theoretical anatomy.

Q: (I.e., do you use a conjectured objectivity to explain subjectivity, and
take consistency as indirect evidence for the correctness of the conjecture? ...or are you trying to explain emergent collective objectivity from a given subjectivity by means of conjectured principles of interaction?
- I mean to probe what the starting point of your reasoning is. I think I need to understand your way of seeing the task before I can see your context.)


A: I of course have the geometrization of physics in terms of M^4xCP_2 as the starting point, since this is the only option consistent with the standard model, assuming one accepts the basic mathematical wisdom about the realization of symmetries in quantum theory.

The goal is to identify some simple fundamental principles and deduce from these the quantum theory, including M^4xCP_2 and the 4-D character of space-time. The article describes roughly five different first-principle approaches to quantum TGD. They are highly interdependent, and at this moment I cannot say whether some of them could imply the others.

Q: When you say you think the ZEO "solves the problem", what would remain
to gain more confidence in it, or even prove it? Experimental testing,
more theoretical work or both?

A: The best justification for ZEO is as a fundamental principle: every quantum state is creatable from the vacuum. ZEO combined with quantum non-determinism resolves very troublesome questions challenging the very existence of theoretical physics as a sensible discipline: "What are the conserved quantum numbers of the universe?", "Why were they just what they are?", and "What was the initial state of the Universe?" (in a deterministic framework this would mean that in principle the testing of the theory is impossible, since a single solution of the field equations is selected in an ad hoc manner). ZEO together with the finiteness of measurement resolution also allows one to identify the M-matrix highly uniquely and to interpret quantum theory as a square root of thermodynamics. Also evolution emerges naturally in ZEO.


Experimental testing, applications, and more theoretical work are of course needed. One application is to quantum biology: zero energy ontology assigns to the electron the time scale of 0.1 seconds, corresponding to the 10 Hz fundamental biorhythm. Other time scales are associated with other elementary particles. These time scales should reflect themselves as biorhythms.
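[Editorial back-of-the-envelope check of the 0.1 second figure, my own arithmetic rather than anything from the post: it assumes the time scale is sqrt(p) times the electron Compton time h/(m_e c^2), with p the Mersenne prime M_127 = 2^127 - 1 usually assigned to the electron in the TGD literature.]

import math

h = 6.62607015e-34                          # Planck constant, J*s
m_e_c2 = 0.51099895e6 * 1.602176634e-19     # electron rest energy, J
compton_time = h / m_e_c2                   # ~ 8.1e-21 s

p = 2**127 - 1                              # Mersenne prime M_127 (assumed p-adic prime of the electron)
T = math.sqrt(p) * compton_time

print(T)                                    # ~ 0.105 s, close to the 0.1 s quoted above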

Anonymous said...

Thanks for your follow up comments. I went back to your website and found more.

> Instead of objectivity (I am not quite sure what you mean with it) I would speak about the uniqueness forced by the mere mathematical existence.

I think I see. I guess I see this as a form of objectivity in the way I use the word: the objectivity of the deductive rules you think of as "mathematical uniqueness". So the structures are objective in the sense that they are the result of an objective process?

I try to think a bit differently, in that the rules of reasoning can never be objective in any meaningful sense. I prefer to think in terms of soft inductive reasoning, based on incomplete information, where even the rules of reasoning themselves are similarly soft in nature, more like qualified guesses.

> A: Depends on what one means with these real processes. Those whose existence we know and can do in laboratory? Or the thinking process itself which we do not understand very well because we do not possess theory of consciousness?

I mean any process that can take place in nature. As far as I know, processes in the human brain are very much real :) However, objectivity is a problem. I see objectivity as emergent, as a result of interactions between the subjective views. But unlike, say, GR, where different observers measure different things, objectivity is at the level of the connections. The relations are objective.

But that doesn't make sense to me as a starting point. I think even the relations are constrained by information capacity - for example, if we ask: what is the information content of the relation itself? And where is this relation encoded? I am doubtful about the strategy of inflating this further and imagining ever larger spaces where higher level relations restore objectivity.

Certainly this kind of inflation makes sense, but I think the inflation is constrained by a complexity bound - this is the information capacity of the observer.

I thought I sensed a connection here to your thinking, but it is clear that you think differently. You seem more guided by some uniqueness of mathematical reasoning than I am. But that would make the connection even more interesting.

> A: I of course have geometrization of physics in terms of M^4xCP_2 as starting point since this is the only option consistent with the standard model assuming one accepts the basic mathematical wisdom about realization of symmetries in quantum theory.

Ok, I think I see how you think. I also understood more from some of your other webpages.

> The goal is to identify some simple fundamental principles and deduce from these the quantum theory, including M^4xCP_2 and 4-D character of space-time.

If I understand you right, you somehow already have a strong hunch about what you want to deduce? And you are looking for, let's call it, "a set of rules of reasoning" that allows you to conclude that your intuitively chosen structures follow uniquely from this reasoning?

I consider the rules of reasoning in this case to be part of the problem: the rules of reasoning are connected to the laws of physics, and both are subject to evolution. Learning usually doesn't just mean storing more data; it also means changing the rules of reasoning. Once we are saturated with data, further evolution seems to take place at two levels - grow a larger memory, or optimise the rules of reasoning (here I make an association to the physical action). Growing a larger memory is where I like to make associations to gravity. It would also make it plausible why it's universal.

What does evolution mean in your sense?

The observer view is one of my starting points, so perhaps your "consciousness" view is more appropriate for me.

/Fredrik

Matti Pitkänen said...

From physical considerations the basic structure of TGD is fixed. One of my dreams is to deduce the same structure from the uniqueness of Kahler geometric existence in infinite-D context.

Rules of reasoning are part of the problem, and one must also have a vision about how logic and logical deductions are represented physically.

a) There is a strong self-referentiality involved. Each quantum jump recreates the Universe and provides a representation of the previous state. At the space-time level this means that space-time surfaces provide a symbolic representation for the contents of consciousness.

b) There is a quantum variant of Boolean logic, naturally realized in terms of fermionic Fock states representing configuration space spinors (a small sketch follows after this list).

c) Zero energy states can also be interpreted as quantum superpositions of rules A --> B: initial state goes to final state in a physical event. The fermionic states A and B represent quantum variants of Boolean statements.

d) The finite resolution of cognition implies quantum spinors and the impossibility of having strictly true or false statements.
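[Editorial illustration of items b) and c) above, my own sketch with arbitrary mode labels and an arbitrarily chosen pair of states: since fermionic occupation numbers are 0 or 1, each Fock basis state of n modes reads as an n-bit Boolean statement, and a pair of initial and final Fock states reads as a rule A --> B.]

from itertools import product

n_modes = 3

# Each fermionic Fock basis state is a tuple of occupation numbers (0 or 1 per mode),
# i.e. an n-bit Boolean statement.
fock_basis = list(product((0, 1), repeat=n_modes))

def as_statement(state):
    return " & ".join(f"mode{i}={'occupied' if occ else 'empty'}"
                      for i, occ in enumerate(state))

# A zero energy basis element pairs a positive energy (initial) and a negative energy
# (final) Fock state; such a pair reads as a rule A --> B.
initial, final = fock_basis[5], fock_basis[2]
print(as_statement(initial), "-->", as_statement(final))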

Some comments about how I see evolution.

a) Evolution means a sequence of quantum jumps recreating the Universe again and again.

b) The Negentropy Maximization Principle (NMP) states that the reduction of entanglement entropy is maximal in a quantum jump. Hence NMP realizes the evolutionary principle.

c) If number theoretic variants of Shannon entropy are allowed, it is possible to have entanglement with *negative* entanglement entropy, and quantum jumps tend to generate information carrying entanglement. This is new and makes sense if the entanglement probabilities are rational or at most algebraic numbers (a small sketch follows after this list).

d) The emergence of increasing values of Planck constant in the "personal" hierarchy of Planck constants characterizing the onion-like layers of the field/magnetic body is a concrete correlate for evolutionary leaps: something qualitatively new emerges. This reflects itself directly in the emergence of new, longer time scales of memory and planned action.
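[Editorial illustration of item c) above, my own sketch: it assumes the p-adic-norm variant of Shannon entropy, S_p = -sum_k P_k log |P_k|_p, used in the TGD literature for rational entanglement probabilities. For rational probabilities this quantity can be negative, so the entanglement carries information rather than entropy.]

from fractions import Fraction
from math import log

def padic_norm(q, p):
    # p-adic norm of a nonzero rational q: p**(-v), where v is the p-adic valuation of q.
    v = 0
    num, den = q.numerator, q.denominator
    while num % p == 0:
        num //= p
        v += 1
    while den % p == 0:
        den //= p
        v -= 1
    return Fraction(p) ** (-v)

def number_theoretic_entropy(probs, p):
    # S_p = -sum_k P_k * log(|P_k|_p); can be negative for rational probabilities.
    return -sum(float(q) * log(float(padic_norm(q, p))) for q in probs)

probs = [Fraction(1, 2), Fraction(1, 2)]                 # maximally entangled two-state system
print(number_theoretic_entropy(probs, p=2))              # -log(2) < 0: information, not entropy
print(-sum(float(q) * log(float(q)) for q in probs))     # ordinary Shannon entropy: +log(2)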