Path integrals have caused a lot of suffering amongst theoretical physicists.
Lubos Motl gives a nice summary of Wick rotation, which is used quite generally as a trick to give some meaning to these poorly defined objects that have caused so much frustration.
The idea of Wick rotation is to define the path integrals of quantum field theory in Minkowski space M^4 by replacing M^4 temporarily with Euclidean space E^4, calculating the integral there as a Euclidean functional integral, which is better defined, and returning to M^4 by analytic continuation in various parameters, such as the momenta of the particles. The trick has also been applied in the hope of making sense of the path integral of General Relativity, as well as in string models.
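For the record, the trick itself is easy to state. Time is continued to imaginary values, so that the oscillatory weight of the Minkowskian path integral becomes a damped one:

$$ t \to -i\tau, \qquad e^{iS_M[\phi]} \to e^{-S_E[\phi]}, \qquad Z_E = \int \mathcal{D}\phi \; e^{-S_E[\phi]} , $$

after which Minkowskian amplitudes are recovered by continuing back in the external parameters.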
I have never liked the trick, not because it is a trick, but simply because the trick is needed at all. Something must be fatally wrong at the fundamental level. To see what this something might be, one must recall what Feynman did a long time ago.
How does one end up with the path integral?
The path integral approach was abstracted by Feynman from the unitary time evolution operator by decomposing the time evolution into a product of an infinite number of infinitesimally short time evolutions. After this, "obvious" generalizations of the formalism lacking real mathematical justification were made. Despite all the work done, it can be safely stated that the notion of the path integral does not exist mathematically. The tricky definition of the functional integral through Wick rotation, transforming it into a Euclidean functional integral (I will drop the attribute Euclidean in the sequel), is certainly not enough for a mathematician.
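To recall the textbook derivation: the evolution over time T is sliced into N short steps of duration ε = T/N, a complete set of position eigenstates is inserted at each step, and the path integral is the formal N → ∞ limit:

$$ \langle x_f| e^{-iHT} |x_i\rangle = \lim_{N\to\infty} \int \prod_{k=1}^{N-1} dx_k \prod_{k=0}^{N-1} \langle x_{k+1}| e^{-iH\epsilon} |x_k\rangle = \int \mathcal{D}[x(t)]\, e^{iS[x]} . $$

It is the last step, the limit, that has never been given a genuine mathematical meaning.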
I hasten to add that even the functional integrals are deeply problematic, since the introduction of local interactions automatically induces infinities, and only in the case of so-called renormalizable theories does a prescription exist for getting rid of these infinities.
What are the implicit philosophical ideas behind the path integral formulation?
When the best brains have been unable to give a real meaning to the notion of the path integral despite about six decades of work, it is time to ask what might be behind these difficulties, and whether it could relate to some cherished philosophical assumptions.
a) Feynman's approach starts from Hamiltonian quantization, and the notion of time is that of Newtonian mechanics. The representability of the unitary time evolution operator as a sum over paths is natural in this context. No absolute time exists in the world of Special Relativity, so there is reason to get worried: the representation might be neither necessary nor even sensible in the Minkowskian context.
b) The sexy idea of the sum over all histories, with classical physics identified as the history corresponding to the stationary phase, might simply be wrong. Even Feynman could sometimes be wrong, believe it or not! One can quite well consider some other, more sensible approach to defining S-matrix elements.
c) Infinities are the basic problem of modern physics and are present for both path and functional integrals. Local divergences appear practically always as one tries to make a free theory interacting by introducing local interactions consistent with classical field theory. The basic assumption behind locality is that fundamental particles are pointlike. In string models this assumption is given up, and there are indeed reasons to believe that superstrings of various kinds allow a perturbation theory free of infinities. The unfortunate fact is that this perturbation series very probably does not converge to anything well-defined and is only an asymptotic series. The now-disappearing hope was that M-theory could resolve this problem by providing a non-perturbative approach to strings.
d) In the perturbative approach the functional integrals give rise to Gaussian determinants, which are typically formally infinite. They can be eliminated, but they are aesthetically very awkward. A TOE should be maximally aesthetic!
These observations do not lead us very far, but they give some hints about what might go wrong. Perhaps the entire idea of a sum over all possible paths, with classical physics resulting via the stationary phase approximation, is utterly wrong. Perhaps the idea of space-time-local interactions is wrong, and perhaps higher-dimensional fundamental objects might allow one to overcome the problems.
Neither Hamiltonian formalism nor path integral works in TGD
When I started to develop a mathematical theory around the basic idea that space-times can be regarded as 4-dimensional surfaces in H = M^4 x CP_2, I soon learned that the perturbative approach fails completely. It would be natural to construct a perturbation theory around the canonically imbedded M^4, but for the only reasonable candidate for the action, Kähler action, the functional power series around M^4 vanishes up to third order, so that the kinetic terms defining the propagators vanish identically. For the same reason the Hamiltonian formalism also fails completely. This is the case much more generally, and the enormous vacuum degeneracy (any 4-surface whose CP_2 projection belongs to an at most 2-dimensional Lagrangian manifold is a non-deterministic vacuum extremal) kills all hopes of conventional quantization.
Geometrization of quantum physics as a solution to the problems
This puzzling state of affairs led to the idea that if quantization is not possible, one should not quantize! This idea grew gradually into the vision that quantum states correspond to the modes of completely classical spinor fields on an infinite-dimensional configuration space CH of 3-surfaces, the world of classical worlds. This also allows the geometrization of fermionic statistics and of super-conformal symmetries in terms of the gamma matrices associated with the Kähler metric.
The breakthrough came from the realization that general coordinate invariance in the 4-dimensional sense is the key requirement. The problem is that one has only 3-dimensional basic objects but wants 4-dimensional Diff invariance. The very definition of the configuration space geometry should therefore assign to a given 3-surface X^3 a unique four-surface X^4(X^3) for 4-D general coordinate transformations to act on.
What would be the physical interpretation of this? X^4(X^3) defines the classical physics associated with X^3. Actually something more: X^4(X^3) is an analog of a Bohr orbit, and since it is unique, one can expect a quantization of various classical observables. Classical physics in the sense of Bohr orbitology would become part of quantum physics and of configuration space geometry. This is certainly something totally new, and it would mean a partial return from the days of Feynman to the good old days of Bohr, when everything was still understandable and concrete.
There are also other implications. Oscillator operators are the essence of quantum theory and can be geometrized only if the configuration space has a Kähler metric defined by a so-called Kähler function. Since classical physics should be coded by this Kähler function, it should be defined by a preferred extremal X^4(X^3) of some physically meaningful action principle. The so-called Kähler action, which is formally the Maxwell action for the CP_2 Kähler form induced to the space-time surface, is the unique candidate.
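To make the formal Maxwell analogy explicit (a sketch only; normalization conventions vary), the Kähler form J_kl of CP_2 and the imbedding map induce on X^4 a 2-form and a metric, and the action density is that of Maxwell theory:

$$ S_K(X^4) = \frac{1}{4 g_K^2} \int_{X^4} J_{\mu\nu} J^{\mu\nu} \sqrt{g}\, d^4x , \qquad J_{\mu\nu} = \partial_\mu s^k \, \partial_\nu s^l \, J_{kl} , $$

where s^k are CP_2 coordinates, indices are raised with the induced metric, and g_K defines the Kähler coupling strength discussed below.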
The first guess is that X^4(X^3) could be identified as an absolute minimum of Kähler action. This is however a somewhat questionable option, since there is no lower bound for the value of Kähler action, and if it becomes negative and infinite, the vacuum functional defined as the exponent of the Kähler function vanishes. Indeed, it took 15 years to learn that this need not be quite the correct definition. A candidate for a more realistic identification came from a proposal for a general solution of the field equations in terms of the so-called Kähler calibration.
The magnitude of the Kähler action would be minimized separately in the regions where the Lagrangian density L_K has a definite sign. This means that X^4(X^3) is as near as possible to a vacuum extremal: the Universe is a maximally lazy energy saver! By minimizing the energy of the solution it might be possible to fix the time derivatives of the imbedding space coordinates at X^3, so that X^4(X^3) could be found by solving the partial differential equations as an initial value problem at X^3. This would be a considerable reduction of computational labor. It is of extreme importance, all the more so because Kähler action does not define a fully deterministic variational principle. There are indeed hopes of understanding the theory even numerically!
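One way to write the condition sketched above (my shorthand, not a formulation from the papers): decompose X^4 into regions X^4_i in which L_K has a fixed sign, and demand

$$ \left| \int_{X^4_i} L_K \sqrt{g}\, d^4x \right| = \mathrm{minimum} \quad \text{separately for each region } X^4_i , $$

so that each region contributes as little action as it can, keeping the surface as close to a vacuum extremal as possible.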
Generalized Feynman diagrams as computations/analytic continuations
A generalization of the notion of a Feynman diagram emerging in the TGD framework replaces the sum over classical paths with what might be regarded as a computation or an analytic continuation.
The first observation is that the path integral over all space-time surfaces with a fixed collection of 3-surfaces as a boundary does not make sense in the TGD framework. The sum reduces to a single 4-surface X^4(X^3), since classical physics in the sense of Bohr's orbitology is a quintessential part of configuration space geometry and quantum theory. The classical world is no longer identified as a path with a stationary phase.
This suggests a completely different approach to the notion of a Feynman diagram. It took, however, quite a long time before I realized how to formulate this approach more precisely. The idea came when I constructed a TGD inspired model of topological quantum computation. In topological quantum computation braids are the basic structures, and the quantum computation is coded into the knotting and linking of the threads of the braid.
This leads to the view that generalized Feynman diagrams do not represent a sum over all classical paths but rather something analogous to computations, with vertices representing fundamental algebraic operations. A given computation can be carried out in very many equivalent manners, and there always exists a minimal computation. In the language of generalized Feynman diagrams this would mean that diagrams with loops are always equivalent to tree diagrams: summation over loops would obviously be multiple counting in this framework. This would be nothing but a far-reaching generalization of the duality symmetry which originally led to string models. I have formulated this generalization in terms of Hopf (ribbon) algebras here and in a different manner here.
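A toy illustration of the minimal-computation idea (my own analogy in Python, not anything from the TGD papers): the same result can be computed by expression trees with different numbers of vertices, and a rewrite rule such as distributivity connects the longer computation to the minimal one, in the same spirit as loop diagrams reducing to tree diagrams.

```python
# Toy analogy: two equivalent computations of the same quantity.
# a*c + b*c uses three operations ("diagram with a loop"),
# (a + b)*c uses two ("minimal tree diagram"); distributivity
# is the rewrite rule identifying them.

def long_form(a, b, c):
    return a * c + b * c      # three vertices: *, *, +

def minimal_form(a, b, c):
    return (a + b) * c        # two vertices: +, *

for args in [(1, 2, 3), (2.5, -1.0, 4.0)]:
    assert long_form(*args) == minimal_form(*args)
print("equivalent computations, identical results")
```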
That there are several equivalent diagrams would conform with the non-determinism of Kähler action, which implies several equivalent space-time surfaces having given 3-surfaces as boundaries. This of course correlates directly with the fact that the functional integral and canonical quantization fail completely.
The generalized Feynman diagrams could also be interpreted as space-time counterparts of the different analytic continuations of configuration space spinor fields (classical spinor fields in the world of classical worlds) from a sector of the configuration space with a given 3-topology to another sector with a different topology (the initial and final states of a particle reaction, in the language of an elementary particle physicist). This continuation can be performed in very many manners, but the final result is always the same, just as in the case of equivalent computations.
Getting rid of standard divergences
It is possible to get rid of path integrals in the TGD framework, but not of the functional integral over the infinite-dimensional world of classical worlds. This integration means performing an average over the well-defined generalized Feynman diagrams, one might say over the predictions of finite quantum field theories. The functional integral in question could bring back the basic difficulties, but it does not.
a) The vacuum functional in the quantum fluctuating degrees of freedom defining the functional integral is completely analogous to a thermal partition function, defined as the exponent of the Hamiltonian, in thermodynamics at a critical temperature. The Kähler coupling strength is analogous to the critical temperature, which means that the values of the only free parameter of the theory are predicted, as they should be in any respectable TOE.
The good news is that the Kähler function is a non-local functional of the 3-surface X^3. Hence the local divergences unavoidable in any local QFT are absent. If one tried to integrate over all X^4, one would have Kähler action and locality, and all the problems of the standard approach would be magnified, since the action is extremely non-linear.
b) The vacuum functional is the exponent of the Kähler function, and in perturbation theory the contravariant configuration space metric becomes the propagator. The Gaussian determinant is the inverse of the metric determinant, so these two ill-defined determinants neatly cancel each other, and the aesthetics is also perfect! Note that the coefficient of the exponent of the Kähler function is also fixed.
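The finite-dimensional analog of this cancellation is elementary: with the metric g as the quadratic form of the Gaussian and \sqrt{det g} as the volume element,

$$ \int \sqrt{\det g}\; \prod_{k=1}^{n} dx^k \; e^{-\frac{1}{2} g_{ij} x^i x^j} = \sqrt{\det g}\; \frac{(2\pi)^{n/2}}{\sqrt{\det g}} = (2\pi)^{n/2} , $$

so neither determinant ever needs to be defined separately.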
Further good news is that there are hopes that the functional integral might be carried out exactly by performing perturbation theory around the maxima of the Kähler function. These hopes are stimulated by the fact that the world of classical worlds is a union of symmetric spaces, and for a symmetric space all points are metrically equivalent. In the finite-dimensional case there are many examples of this phenomenon.
The conclusion is that the standard divergences are not present and that this result is basically due to a new philosophy rather than some delicate cancellation mechanism.
What about zero modes?
Is there something that could still go wrong? Yes. The existence of the configuration space metric requires that CH is a union of infinite-dimensional symmetric spaces labelled by zero modes, whose contribution to the CH line element vanishes. An infinite union is indeed in question: if CH reduced to a single symmetric space, a 3-surface with the size of a galaxy would be equivalent to a 3-surface associated with an electron. The zero modes characterize classical degrees of freedom: the shape, the size, and the induced Kähler form defining a classical Maxwell field on X^4(X^3).
In the zero modes there is no proper definition of the functional integral. Here, however, quantum measurement theory comes to the rescue. Zero modes are non-quantum-fluctuating degrees of freedom and thus behave like genuine classical macroscopic degrees of freedom. Therefore a localization in these degrees of freedom is expected to occur in each quantum jump as a counterpart of a quantum measurement. These degrees of freedom should also be correlated one-to-one with the quantum fluctuating degrees of freedom, like the pointer of a measurement apparatus with the direction of the electron spin. A kind of duality between the quantum fluctuating degrees of freedom and the zero modes is required.
We would experience the macroworld as completely classical because each moment of consciousness, identifiable as a quantum jump, makes it classical. It is made non-classical again during the unitary U-process stage of the next quantum jump. Dispersion in zero modes, localization in zero modes, dispersion in zero modes,... Like a Djinn getting out of the bottle and representing a very long list of classical wishes, of which just one is realized.
With this complete localization, or localization to a discrete union of points in the zero mode degrees of freedom, the S-matrix elements become well defined. Note however that the most general option would be a localization into finite-dimensional symplectic subspaces of the zero modes in each quantum jump. The reason is that the zero modes allow a symplectic structure, so that all finite-dimensional integrals are well defined using the exterior powers of the symplectic form as the integration measure.
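The measure meant here is the standard Liouville measure of symplectic geometry: on a 2n-dimensional symplectic subspace with symplectic form ω the volume element

$$ d\mu = \frac{\omega^{\wedge n}}{n!} $$

requires no metric at all, which is exactly what the zero modes, with their vanishing contribution to the line element, call for.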
What about renormalization?
The elimination of infinities relates closely to the renormalization of coupling constants, and one could argue that the proposed beautiful scenario is in conflict with basic experimental facts. This is not the case if the Kähler coupling strength has an entire spectrum of values, labelled by primes or by a subset of primes labelling the p-adic length scales. In this picture the p-adic primes label the p-adic effective topologies of the non-deterministic extremals of Kähler action. p-Adic field equations possess an inherent non-determinism, and the hypothesis is that in an appropriate length scale this non-determinism gives rise to a fractality characterized by an effective p-adic topology, such that the prime p is fixed by the constraint that the non-determinism of Kähler action corresponds to the inherent p-adic non-determinism in this length scale range.
The highly non-trivial prediction is that quantum non-determinism is not just randomness, since p-adic non-determinism involves long range correlations due to the fact that in the p-adic topology the evolution is continuous. The proposal is that the long range correlations of locally random-looking intentional/purposeful behavior could correspond to p-adic non-determinism, with p characterizing the "intelligence quotient" of the system. This is a testable prediction.
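To see concretely why p-adic continuity means long range correlations in the real sense, here is a minimal sketch of the p-adic norm in Python (standard number theory, nothing TGD-specific): integers differing by a high power of p are p-adically very close even when they are far apart on the real line, so a p-adically continuous evolution ties together behavior at widely separated real values.

```python
def p_adic_norm(x: int, p: int) -> float:
    """|x|_p = p**(-k), with p**k the largest power of p dividing x."""
    if x == 0:
        return 0.0
    k = 0
    while x % p == 0:
        x //= p
        k += 1
    return float(p) ** (-k)

p = 7
a, b = 3, 3 + 7**5
print(b - a)                   # 16807: a large distance in the real sense
print(p_adic_norm(b - a, p))   # 7**(-5) ~ 6e-5: p-adically very close
```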
The coupling constant evolution is replaced by the p-adic length scale dependence of the Kähler coupling strength and of the other coupling constants determined by it. The emergence of the analogs of loop corrections might be understood if there is a natural but non-orthogonal state basis labelled by quantum numbers which allow a natural grading. The orthogonalized state basis would be obtained by a Gram-Schmidt orthonormalization procedure respecting the grading. The orthonormalization introduces a cloud of virtual particles, and the dependence of this Gram-Schmidt cloud on the prime p would induce the TGD counterpart of the renormalization group evolution.
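A minimal numerical sketch of the orthonormalization step (plain Gram-Schmidt in numpy; the grading and its physical reading are of course specific to the proposal above): each orthonormalized state picks up admixtures of the states below it in the grading, and this admixture is what plays the role of the cloud of virtual particles.

```python
import numpy as np

def gram_schmidt(basis: np.ndarray) -> np.ndarray:
    """Orthonormalize the rows of `basis` in their given (graded) order."""
    ortho = []
    for v in basis.astype(float):
        for u in ortho:
            v = v - np.dot(u, v) * u   # subtract projection on lower states
        ortho.append(v / np.linalg.norm(v))
    return np.array(ortho)

# A non-orthogonal basis; row order stands in for the grading.
B = np.array([[1.0, 0.0, 0.0],
              [0.9, 1.0, 0.0],
              [0.5, 0.4, 1.0]])
Q = gram_schmidt(B)
assert np.allclose(Q @ Q.T, np.eye(3))   # rows are now orthonormal
```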
It is clear that the classical non-determinism of Kähler action is the red thread of the proposed vision. By quantum classical correspondence the space-time surface is a representation not only of a quantum state but even of the final states of a sequence of quantum jumps. Classical non-determinism would thus be the space-time correlate of quantum non-determinism.
Matti Pitkänen