*Was von Neumann Right After All?*

The work with the TGD-inspired model for quantum computation led to the realization that von Neumann algebras, in particular hyper-finite factors of type II_1, could provide the mathematics needed to develop a more explicit view about the construction of the S-matrix. In this chapter I discuss various aspects of type II_1 factors and their physical interpretation in the TGD framework.

### 1. Philosophical ideas behind von Neumann algebras

The goal of von Neumann was to generalize the algebra of quantum mechanical observables. The basic ideas behind a von Neumann algebra are dictated by physics. The algebra elements allow Hermitian conjugation, and observables correspond to Hermitian operators. A measurable function of an operator belongs to the algebra. The predictions of quantum theory are expressible in terms of traces of observables. The highly non-trivial requirement of von Neumann was that identical a priori probabilities for the detection of the states of an infinite-state system must make sense. Since quantum mechanical expectation values are expressible in terms of operator traces, this requires that the unit operator has unit trace. In the finite-dimensional case it is easy to build observables out of minimal projections to 1-dimensional eigenspaces of observables. In the infinite-dimensional case the probability of a projection to a 1-dimensional subspace vanishes if each state is equally probable. The notion of observable must thus be modified by excluding 1-dimensional minimal projections and allowing only projections whose trace would be infinite under the straightforward generalization of the matrix algebra trace as the dimension of the projection. The definitions adopted by von Neumann allow more general algebras than the algebras of type II_1, for which traces are not larger than one. Type I_n algebras correspond to finite-dimensional matrix algebras with finite traces, whereas I_infty does not allow bounded traces. For algebras of type III traces are always infinite and the notion of trace becomes useless.

### 2. von Neumann, Dirac, and Feynman
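These trace conditions, which are standard facts about factor types rather than anything specific to TGD, can be summarized as follows:

```latex
% Normalized trace on a type II_1 factor M:
\mathrm{tr}(\mathbf{1}) = 1, \qquad
\mathrm{tr}(ab) = \mathrm{tr}(ba), \qquad
\mathrm{tr}(a^{\ast}a) \geq 0 .

% Range of the trace on projections p, distinguishing the factor types:
%   I_n     : \{0, 1/n, 2/n, \dots, 1\}   (finite matrix algebras)
%   I_inf   : \{0, 1, 2, \dots, \infty\}  (no bounded trace after normalization)
%   II_1    : [0, 1]                       (continuous "dimensions")
%   III     : \{0, \infty\}                (trace useless)
\mathrm{tr}(p) \in [0,1] \quad \text{for projections } p \in M \ \text{of type } II_1 .
```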

The association of algebras of type I with standard quantum mechanics made it possible to unify matrix mechanics with wave mechanics. Because of the finiteness of traces, von Neumann regarded the factors of type II_1 as fundamental and the factors of type III as pathological. The highly pragmatic and successful approach of Dirac based on the notion of the delta function, plus the emergence of Feynman graphs and the functional integral, meant that the von Neumann approach was largely forgotten. Algebras of type II_1 have emerged only much later in conformal and topological quantum field theories, allowing one to deduce invariants of knots, links and 3-manifolds. Also the algebraic structures known as bi-algebras, Hopf algebras, and ribbon algebras relate closely to type II_1 factors. In topological quantum computation based on braids and corresponding topological S-matrices they play an especially important role.

### 3. Factors of type II_1 and quantum TGD

There are good reasons to believe that hyper-finite (ideal for numerical approximations) von Neumann algebras of type II_1 are of direct relevance for TGD.

*3.1 Equivalence of generalized loop diagrams with tree diagrams*

The work with bi-algebras led to the proposal that the generalized Feynman diagrams of TGD at the space-time level satisfy a generalization of the duality of old-fashioned string models. Generalized Feynman diagrams containing loops are equivalent to tree diagrams, so that they could be interpreted as representing computations or analytic continuations. This symmetry can be formulated as a condition on algebraic structures generalizing bi-algebras. The new element is the possibility of vacuum lines having a natural counterpart at the level of bi-algebras and braid diagrams. At the space-time level they correspond to vacuum extremals.

*3.2 Inclusions of hyper-finite II_1 factors as a basic framework to formulate quantum TGD*

The basic facts about von Neumann factors of type II_1 suggest a more concrete view about the general mathematical framework needed.

- The effective 2-dimensionality of the construction of quantum states and configuration space geometry in the quantum TGD framework makes hyper-finite factors of type II_1 very natural as operator algebras of the state space. Indeed, the elements of the conformal algebras are labelled by discrete numbers, and also the modes of the induced spinor fields are labelled by a discrete label, which guarantees that the tangent space of the configuration space is a separable Hilbert space and its Clifford algebra is thus a hyper-finite type II_1 factor. The same holds true also at the level of configuration space degrees of freedom, so that bosonic degrees of freedom correspond to a factor of type I_{\infty} unless super-symmetry reduces it to a factor of type II_1.
- Four-momenta relate to the positions of the tips of future and past directed light cones appearing naturally in the construction of the S-matrix. In fact, the configuration space of 3-surfaces can be regarded as a union of big-bang/big-crunch type configuration spaces obtained as a union of light-cones parameterized by the positions of their tips. The algebras of observables associated with bounded regions of M^4 are hyper-finite and of type III_1. The algebras of observables in the space spanned by the tips of these light-cones are not needed in the construction of the S-matrix, so that there are good hopes of avoiding infinities coming from infinite traces.
- The many-sheeted space-time concept forces one to refine the notion of a sub-system. Jones inclusions N subset M for factors of type II_1 define in a generic manner the imbedding of interacting sub-systems into a universal II_1 factor, which now corresponds naturally to the infinite Clifford algebra of the tangent space of the configuration space of 3-surfaces and contains the interaction as an M:N-dimensional analog of a tensor factor. Topological condensation of a space-time sheet onto a larger space-time sheet, the formation of bound states by the generation of join along boundaries bonds, interaction vertices in which the space-time surface branches like a line of a Feynman diagram: all these situations could be described by a Jones inclusion characterized by the Jones index M:N, assigning to the inclusion a minimal conformal field theory for M:N<4 and a conformal theory with k=1 Kac-Moody symmetry for M:N=4. The M:N=4 option need not be realized physically as a quantum field theory but as a string-like theory, whereas the limit D=4-epsilon --> 4 could correspond to the M:N --> 4 limit. An entire hierarchy of conformal field theories is thus predicted besides quantum field theory.
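Jones's index theorem, the standard result behind the quantization of M:N invoked above, restricts the possible index values to a discrete series below 4 plus a continuum above it:

```latex
% Allowed values of the Jones index for an inclusion N \subset M
% of type II_1 factors:
\mathcal{M}:\mathcal{N} \;\in\;
\left\{\, 4\cos^{2}\!\left(\frac{\pi}{n}\right) \;\middle|\; n = 3,4,5,\dots \right\}
\;\cup\; [\,4,\infty) .
% The discrete series runs 1, 2, (golden ratio)^2 \approx 2.618, ...
% and converges to 4, the M:N = 4 case singled out in the text.
```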

*3.3 Generalized Feynman diagrams are realized at the level of M as quantum space-time surfaces*

The key idea is that generalized Feynman diagrams realized in terms of space-time sheets have counterparts at the level of M, identifiable as the Clifford algebra associated with the entire space-time surface X^4. The 4-dimensional Feynman diagram as a part of the space-time surface is mapped to its beta = M:N <= 4-dimensional quantum counterpart.

- von Neumann algebras allow a universal unitary automorphism A --> Delta^{it}A Delta^{-it}, fixed apart from inner automorphisms, and the time evolution of the partonic 2-surfaces defining a 3-D light-like causal determinant corresponds to the automorphism N_i --> Delta^{it}N_i Delta^{-it} performing a time dependent unitary rotation of N_i along the line. At the configuration space level, however, a sum over the allowed values of t appears and should give rise to the TGD counterpart of the propagator as the analog of the stringy propagator INT_0^infty exp(iL_0t)dt. Number theoretical constraints from p-adicization suggest a quantization of t as t = SUM_i n_i y_i > 0, where z_i = 1/2 + iy_i are the non-trivial zeros of Riemann Zeta.
- At the space-time level the "ends" of the orbits of partonic 2-surfaces coincide at vertices, so that also their images N_i subset M coincide. The condition N_i = N_j = ... = N, where the sub-factors N at different vertices differ only by an automorphism, poses stringent conditions on the values t_i, and Bohr quantization at the level of M results. Vertices can be obtained as vacuum expectations of the operators creating the states associated with the incoming lines (crossing symmetry is automatic).
- The equivalence of loop diagrams with tree diagrams would be due to the possibility of moving the ends of the internal lines along the lines of the diagram, so that only diagrams containing 3-vertices and self energy loops remain. Self energy loops are trivial if the product associated with the fusion vertex and the co-product associated with the annihilation vertex compensate each other. The possibility to assign a quantum group or Kac-Moody group to the diagram gives good hopes of realizing the product and co-product. Octonionic triality would be an essential prerequisite for transforming N-vertices to 3-vertices. The equivalence makes it possible to develop an argument proving the unitarity of the S-matrix.
- A formulation using category theoretical language suggests itself. The category of space-time sheets has as its most important arrow topological condensation via the formation of wormhole contacts. This category is mapped to the category of II_1 sub-factors of the configuration space Clifford algebra having inclusion as the basic arrow. Space-time sheets are mapped to the category of Feynman diagrams in M with lines defined by the unitary rotations of N_i induced by Delta^{it}.
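Collecting the formulas appearing in the items above (the automorphism is the standard Tomita-Takesaki modular automorphism; the propagator analog and the quantization of t are the TGD proposals of the text):

```latex
% Modular automorphism acting on the sub-factor along a line of the diagram:
\sigma_t : \; \mathcal{N}_i \;\longrightarrow\; \Delta^{it}\,\mathcal{N}_i\,\Delta^{-it} .

% Analog of the stringy propagator, obtained by summing over allowed values of t:
\int_{0}^{\infty} e^{iL_0 t}\, dt .

% Proposed number theoretical quantization of t in terms of the
% non-trivial zeros z_i = 1/2 + i y_i of the Riemann Zeta:
t \;=\; \sum_i n_i\, y_i \;>\; 0, \qquad n_i \in \mathbb{N} .
```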

*3.4 Is hbar dynamical?*

The work with topological quantum computation inspired the hypothesis that hbar might be dynamical, and that its values might relate in a simple manner to the logarithms of the Beraha numbers giving the Jones indices M:N. The model for the evolution of hbar implied that hbar is infinite for the minimal value M:N=1 of the Jones index. The construction of a model explaining the strange finding that planetary orbits seem to correspond to a gigantic value of "gravitational" Planck constant led to the hypothesis that when the system becomes non-perturbative, so that the perturbative expansion in terms of the parameter k = alpha Q_1Q_2 ceases to converge, a phase transition occurs increasing the value of hbar to hbar_s = k*hbar/v_0, where v_0 = 4.8*10^{-4} is the ratio of Planck length to CP_2 length. This involves also a transition to a macroscopic quantum phase, since Compton lengths and times increase dramatically. Dark matter would correspond to ordinary matter with a large value of hbar, which is conformally confined in the sense that the sum of the complex super-canonical conformal weights (related in a simple manner to the complex zeros of Riemann Zeta) is real for a many-particle state behaving like a single quantum coherent unit. The value of hbar for M:N=1 is large but not infinite, and thus in conflict with the original proposal. A more refined suggestion is that the evolution of hbar as a function of M:N = 4cos^2(pi/n) can be interpreted as a renormalization group evolution for the phase resolution. The earlier identification is replaced by a linear renormalization group equation for 1/hbar allowing as its solutions the earlier solution plus an arbitrary integration constant. Hence 1/hbar can approach a finite value 1/hbar(3) = (v_0/k) * 1/hbar(n-->infty) at the limit n --> 3.
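As a quick numeric sketch of the quantities appearing above (the function names and the sample coupling value k are illustrative, not from the original text), the Jones indices 4cos^2(pi/n) and the proposed scaling hbar_s/hbar = k/v_0 can be evaluated as follows:

```python
import math

def jones_index(n: int) -> float:
    """Jones index M:N = 4*cos^2(pi/n) for n >= 3: the Beraha numbers,
    forming a discrete series 1, 2, 2.618..., ... converging to 4."""
    if n < 3:
        raise ValueError("Jones inclusions require n >= 3")
    return 4.0 * math.cos(math.pi / n) ** 2

def hbar_scaling(k: float, v0: float = 4.8e-4) -> float:
    """Proposed scaling hbar_s/hbar = k/v0 in the non-perturbative phase,
    where k = alpha*Q1*Q2 (illustrative input) and v0 is the ratio of
    Planck length to CP_2 length quoted in the text."""
    return k / v0

# The index starts at 1 (n=3), passes through 2 (n=4) and the squared
# golden ratio ~ 2.618 (n=5), and approaches 4 as n grows.
series = [jones_index(n) for n in (3, 4, 5, 6, 100)]
```

For example, `jones_index(5)` equals the squared golden ratio, and a hypothetical coupling k = 1 would give an hbar amplification of roughly 2083.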
The evolution equation gives a concrete view about how various charges should be imbedded in a Jones inclusion into the larger algebra, so that the value of hbar appearing in commutators evolves in the required manner. The dependence of hbar on the parameters of the interacting systems means that it is associated with the interface of the interacting systems. Instead of being an absolute constant of nature, hbar becomes something characterizing the interaction between two systems, the "position" of the II_1 factor N inside M. The interface could correspond to wormhole contacts, join along boundaries bonds, a light-like causal determinant, etc. This property of hbar is consistent with the fact that the vacuum functional, expressible as an exponent of Kähler action, does not depend at all on hbar. For more details see the new chapter *Was von Neumann Right After All?*.

Matti Pitkanen
