1. What breaks electroweak symmetry?
Lubos gives the textbook answer: the electroweak symmetry is broken by the Higgs field's vacuum expectation value. TGD allows a Higgs but reduces the description of the symmetry breaking to a much deeper level. CP2 geometry breaks the electroweak symmetry: for instance, the color partial waves for different weak isospin states of imbedding space spinors have hugely different masses. The point is that the electroweak gauge group is the holonomy group of the spinor connection and not a symmetry group, unlike the color group, which acts as isometries.
In the TGD framework physical states are massless before p-adic thermal massivation due to the compensation of the conformal weights of various operators. The most plausible option is that both the non-half-integer part of the vacuum conformal weight of a particle and the Higgs expectation are expressible in terms of the same parameter, which corresponds to a generalized eigenvalue of the modified Dirac operator. The relation between Higgs expectation and massivation is thus transformed from causation to correlation.
2. What is the ultraviolet extrapolation of the Standard Model?
Lubos violently explains that "UV extrapolation" in the above statement should be replaced with "UV completion". I would replace it with "the unified theory of fundamental interactions". Lubos of course answers as a proponent of string theory. The problem is that there is a practically infinite number of completions, so that predictivity is lost.
TGD geometrizes the symmetries of the standard model and reduces them to the symmetries of classical number fields. Also octonionic infinite primes, one of the most exotic notions inspired by TGD, code the standard model symmetries. The most general formulation of the World of Classical Worlds is as the space of hyper-quaternionic or co-hyper-quaternionic subalgebras of the local hyper-octonionic Clifford algebra of M8 or, equivalently, of M4 × CP2.
The answers by both Lubos and me involve supersymmetry, but in different senses. In the TGD framework the oscillator operators of the induced spinor fields define the analog of space-time SUSY, so that the algebra of second quantization is replaced with N=∞ SUSY. This requires a modification of the SUSY formalism, but the SUSY associated with the right-handed covariantly constant neutrinos emerges as a preferred sub-SUSY and a counterpart of the standard N=1 SUSY. The construction of infinite primes also involves supersymmetry.
3. Why is there a large hierarchy between the Planck scale, the weak scale, and the vacuum energy?
These are the two most famous hierarchy problems of current physics, as Lubos notices. In the TGD framework the Planck scale is replaced with the CP2 length scale, which is roughly a factor 10^4 longer than the Planck length. Instead of the Planck length it might be more appropriate to talk about the gravitational constant, which follows as a prediction in the TGD framework.
The p-adic length scale hierarchy is needed to understand the hierarchy of mass scales. The inverse of the mass squared scale comes as a prime, and the favored primes are very near to octaves (powers of two) of a fundamental scale. Primes near powers of two corresponding to Mersenne primes or Gaussian Mersennes are especially favored, and this predicts a scaled-up copy of hadron physics, which should become visible at LHC. Quite generally, an unlimited number of scaled versions of standard model physics is in principle possible.
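The arithmetic behind these scales is easy to check. The following is a sketch under the usual TGD assignments (not spelled out above: M_127 for the electron, M_107 for ordinary hadron physics, M_89 for the conjectured scaled-up copy) and the assumption that the p-adic mass scale goes as p^(-1/2) with p = 2^k - 1: the Lucas-Lehmer test verifies that these exponents really give Mersenne primes, and the mass scale ratio between the two hadron physics follows as a power of two.

```python
def lucas_lehmer(p):
    """Deterministic primality test for the Mersenne number M_p = 2**p - 1."""
    if p == 2:
        return True
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# Exponents appearing in p-adic mass calculations (assignments are TGD's):
# k = 127 (electron), 107 (hadron physics), 89 (scaled-up hadron physics)
for k in (89, 107, 127):
    print(k, lucas_lehmer(k))  # all three give Mersenne primes

# Mass scale ratio m(89)/m(107) ~ sqrt(M_107 / M_89) ~ 2**((107 - 89)/2)
print(2 ** ((107 - 89) / 2))  # 512.0
```

If the M_89 assignment is right, the mass scales of the scaled-up hadron physics would be roughly 512 times those of ordinary hadrons, which is what would put it within LHC reach.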
The vacuum energy density is the basic problem of the superstring approach. How desperate the situation is becomes clear from the fact that rhetorical tricks such as the anthropic principle are considered seriously. Empirical findings, for some reason neglected by colleagues, suggest that the cosmological constant depends on time. In the TGD framework the cosmological constant is predicted to depend on the p-adic length scale of the space-time sheet and behaves roughly like 1/a^2, where a is cosmic time identified as light-cone proper time. Actually the time parameter a is replaced by the corresponding p-adic length scale. The recent value is predicted correctly under natural assumptions.
What dark energy is is a second question. TGD suggests its identification as matter at space-time sheets mediating gravitational interaction and having gigantic values of Planck constant, implying extremely long Compton lengths for elementary particles. This guarantees that the energy density is constant in excellent approximation. If gravitational space-time sheets correspond to dark magnetic flux tubes (expanded cosmic strings), the mysterious negative pressure can be identified classically in terms of magnetic tension. If one takes seriously the correlation of the intelligence of conscious entities with the value of Planck constant, these gravitational space-time sheets can be God-like entities.
4. How do strongly-interacting degrees of freedom resolve into weakly-interacting ones?
Lubos regards this question as strange and expresses this using colorful rhetoric. Maybe Carroll refers to QCD and hadronization. M8-M4 × CP2 duality relates low energy and high energy hadron physics to each other in the TGD framework and corresponds group theoretically to SU(3)-SO(4) duality, where SO(4) is the well-known strong isospin symmetry of low energy hadron physics. Or maybe Carroll talks about the technical problem of calculating the behavior of strongly interacting systems. Nature might have solved the latter problem by a phase transition increasing Planck constant, so that perturbation theory based on the larger value of Planck constant works. The particle spectrum however changes, and the system becomes anyonic in general.
5. Is there a pattern/explanation behind the family structure and parameters of the Standard Model?
I can only echo Lubos: of course there is. In superstring models the large number of explanations tells that the real explanation is lacking. In the TGD framework fermion families correspond to various genera of partonic 2-surfaces (the genus tells the number of handles attached to a sphere to get the 2-dimensional topology). There is an infinite number of genera, but the 3 lowest genera are mathematically very special (hyper-ellipticity as a universal property), which makes them excellent candidates for the light fermion families. The successful predictions for masses, using p-adic thermodynamics and relying strongly on the genus dependent contribution from conformal moduli, support this explanation.
Bosons correspond to wormhole contacts and are labeled by pairs of genera, implying a dynamical SU(3) symmetry with ordinary bosons identified as SU(3) singlets. SU(3) octet bosons, perhaps making themselves visible at LHC, are predicted and serve as a killer test.
The symmetries of the standard model reduce to the geometry of CP2, which has a purely number theoretical interpretation in terms of the hyper-octonionic structure. Number theory fixes the dynamics of space-time surfaces completely through the associativity condition (hyper-quaternionicity or its co-property in the appropriate sense).
6. What is the phenomenology of the dark sector?
Lubos sees dark matter as something relatively uninteresting: just some exotic weakly interacting particles. How incredibly blind a theorist accepting 11-D space-time and a landscape with absolutely no empirical support can be when it comes to actual experimental facts!
In the TGD framework dark matter means a revolution in the world view. Its description relies on the hierarchy of Planck constants, which requires a generalization of the 8-D imbedding space M4 × CP2 to a book-like structure with pages partially characterized by the value of Planck constant. The most fascinating implications are in biology. Also the implications for our view about the nature of consciousness and our position in the World Order are profound.
7. What symmetries appear in useful descriptions of nature?
As Lubos says, one must be careful about what types of symmetries we are talking about: "Only global unbroken symmetries are 'really objective' features of the reality. It's very likely that we have found the full list and it includes the CPT-symmetry, Poincare symmetry (including Lorentz, translational, and rotational symmetries), and the U(1) from the conservation of the electric charge." By adding color symmetry and separate baryon and lepton number conservation one obtains the symmetries of quantum TGD: this prediction follows from the number theoretical vision alone.
Lubos mentions dualities relating descriptions based on different symmetries. In TGD, M8-M4 × CP2 duality manifests itself as the dual descriptions of hadrons using low energy hadron phenomenology (SO(4)) and the parton picture at high energies (color SU(3)).
There are good reasons to believe that the TGD Universe is able to emulate almost any gauge theory for which the gauge group is a simply laced Lie group, as well as stringy systems (McKay correspondence, inclusions of hyper-finite factors, and the book-like structure of the generalized imbedding space). These symmetries would however be engineered rather than fundamental symmetries.
8. Are there surprises at low masses/energies?
Lubos believes that there are no surprises, without realizing that we ourselves are the most surprising surprise. The eye is not able to see itself without a mirror. The fact is that standard physics cannot say anything really interesting about life and consciousness. p-Adic physics, the hierarchy of Planck constants, zero energy ontology...: I believe that all of this is necessary if one really wants to understand living matter.
9. How does the observable universe evolve?
Lubos believes in standard cosmology described by General Relativity as such. TGD predicts a quantum version of standard cosmology. Smooth cosmological evolution is replaced by a sequence of rapid expansion periods serving as space-time correlates for quantum jumps increasing the Planck constant of the appropriate space-time sheets. This applies in all length scales, and one especially fascinating application is to the evolution of Earth itself. The Expanding Earth hypothesis finds a physical justification, and one ends up with an amazingly simple and predictive vision about the pre-Cambrian and Cambrian periods, covering meteorology, geology, and biology alike.
Zero energy ontology strongly suggests that the proper quantum description is in terms of the moduli space for causal diamonds (CDs, identified as intersections of future and past directed light-cones). The entire future light-cone labeling the "upper" tips of CDs and analogous to Robertson-Walker cosmology is replaced with a discrete set of points. In particular, the values of cosmic time come as octaves of a basic scale for a given value of Planck constant. The spectrum of Planck constants means that all rational multiples of the CP2 time scale are in principle possible. Cosmic evolution, as an endless re-creation of the Universe, can be seen as the emergence of CDs with larger and larger size.
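The claimed spectrum is easy to tabulate in toy form. A sketch, with the CP2 time scale normalized to 1 and a rational Planck constant ratio 3/2 chosen purely for illustration:

```python
from fractions import Fraction

T_CP2 = Fraction(1)  # fundamental CP2 time scale, normalized to 1

# For a fixed Planck constant, allowed cosmic-time values come as octaves:
octaves = [T_CP2 * 2**n for n in range(6)]
print([int(t) for t in octaves])  # [1, 2, 4, 8, 16, 32]

# The spectrum of Planck constants (rational multiples of the standard one)
# extends this to rational multiples of the basic scale, e.g. hbar/hbar_0 = 3/2:
print([str(Fraction(3, 2) * t) for t in octaves])  # ['3/2', '3', '6', '12', '24', '48']
```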
10. How does gravity work on macroscopic scales?
General Relativity is part of the description, but zero energy ontology and the hierarchy of Planck constants bring in new elements. The gigantic values of the gravitational Planck constant make possible astroscopic quantum coherence for the dark matter at magnetic flux tubes mediating gravitational interaction, and they explain dark energy. Quantum classical correspondence suggests that the exchange of virtual particles has a classical description in terms of Einstein's equations. In the case of the planetary system a possible manifestation is the observation of Grusenick that a Michelson interferometer rotating in a horizontal plane produces a constant interference pattern, but in a vertical plane the interference pattern varies during rotation. If real, this finding is revolutionary. It might also relate directly to the finding that the measured values of the gravitational constant vary within 1 per cent. There has been no reaction from academic circles.
The assumption that gravitation in long length scales has been understood more or less completely is a basic dogma of string theorists, despite the fact that the list of anomalies and intriguing regularities is really long. It is much more rewarding to impress colleagues with long and complex calculations than to use one's professional lifetime on a risky attempt to solve a real problem.
11. What is the topology and geometry of space-time and dynamical degrees of freedom on small scales?
In the TGD framework "on small scales" can be dropped from the question. Many-sheeted space-time, the hierarchy of Planck constants, p-adic space-time sheets serving as correlates of cognition and intentionality, zero energy ontology... All this means a dramatic generalization of the view about space-time in all length scales and a profoundly new way to interpret what we observe. If TGD is correct, we really "see" dark matter in biology, and we really "see" p-adic physics via its interaction giving rise to the effective p-adic topology of real space-time sheets, leading to extremely successful predictions for elementary particle masses.
Quantum group enthusiasts believe that space-time becomes non-commutative at Planckian length scales. Some theoreticians believe that some kind of Planckian discreteness emerges. In the TGD framework quantum groups emerge as a natural part of the description in terms of a finite measurement resolution, and in all length scales. Discretization appears as a space-time correlate for a finite measurement resolution but not as an actual discreteness. The finite resolution of cognition and sensory perception also implies an apparent discreteness. Also the hierarchy of infinite primes suggests a description in terms of a hierarchy of discrete structures.
At the fundamental level everything is however continuous, in the real or in the p-adic sense, in accordance with the generalization of the number concept involving the fusion of the real and p-adic number fields to a larger super-structure and providing a single space-time point with an infinitely rich number theoretic anatomy. The talk about infinite primes (infinite only in the real sense) sounds very unpractical, but to my great surprise infinite primes lead to highly detailed predictions for the spectrum of states and quantum numbers.
12. How does quantum gravity work in the real world?
Lubos restates the basic belief of string theorists that Einstein's equations follow at the long length scale QFT limit of superstring models. In the TGD framework Einstein's equations hold true at this limit too, but quantal aspects are also present. The hierarchy of Planck constants, in particular the gigantic values of the gravitational Planck constant at the dark magnetic flux tubes mediating gravitational interaction, is essential for the gravitational physics of dark matter.
There are also several delicate effects, such as the Allais effect, suggesting that the ultraconservative view of Lubos is wrong. With all respect, the builders of quantum gravity theories should really consider returning to the roots and taking experimental data seriously. Otherwise they will continue to produce useless formalism without any connection to observed reality.
13. Why was the early universe hot, dense, and very smooth but not perfectly smooth?
The standard answer echoed by Lubos is in terms of inflationary cosmology. In the TGD framework very early cosmology is cosmic string dominated. Space-time sheets appear later (at a certain proper time distance from the light-cone boundary). Inflationary cosmology is replaced with a sequence of expansion periods during which the cosmology is quantum critical at the appropriate space-time sheets. No scales are present, and 3-space is flat. The critical cosmology, which is unique apart from a parameter telling its duration, describes the situation. This is an extremely powerful prediction following from the imbeddability to M4 × CP2 alone. Quantum criticality implies the universality of the dynamics during the expansion periods.
The Big Bang is replaced by a "silent whisper amplified to a Bang", since the energy density of cosmic strings behaves as 1/a^2, where a denotes the proper time of the light-cone. The moduli space of CDs suggests a cartesian product of M4 × CP2, labeling the lower tips of CDs, with its discrete version labeling the upper tips. One must ask whether a CD corresponds to a counterpart of a Big Bang followed eventually by a Big Crunch.
14. What is beyond the observable universe?
"What is beyond the universe observable to us" would be a more precise formulation. The hierarchies of Planck constants and p-adic length scales, the hierarchy of conscious entities in which we correspond to one particular, relatively low lying level, the hierarchy of infinite primes mathematically similar to an infinite hierarchy of second quantizations, the infinitely complex structure of a single space-time point realizing algebraic holography... I find myself standing at the shore of an infinitely vast sea. The fundamental symmetries and the basic elementary particle quantum numbers are universal. This follows from the simple requirement that the geometry of the World of Classical Worlds exists mathematically and has a number theoretic interpretation.
15. Why is there a low-entropy boundary condition in the past but not the future?
The form of the question reflects the erroneous identification of the experienced time appearing in the second law with the geometric time appearing as one space-time coordinate. After these 32 years this identification looks incredibly stupid to me, but it is made by most colleagues despite the fact that these times are completely different: irreversibility versus reversibility, only the recent moment and past versus entire eternity, and so on. Here only a theory of consciousness could help, but the patient stubbornly refuses to receive the medication.
Lubos however intuitively realizes that future and past are not in a symmetric position in the second law, but he is unable to ask what this means. He really believes that Boltzmann's equations are all that is needed and never considers the possibility that these wonderful equations might make sense only under certain conditions.
In the TGD framework the geometric correlate for the arrow of subjective time, which by definition is always the same (consciousness as a sequence of quantum jumps, with the past identified as the quantum jumps that have already occurred and contribute to conscious experience), can in principle have both directions. Phase conjugate laser beams provide a basic example of a situation in which the second law applies in the "wrong" direction of geometric time. Also the self-assembly of biological molecules can be interpreted in this manner. The hierarchy of Planck constants implies that for a given CD Boltzmann's equations make sense only for the smaller CDs inside it. In living matter the Boltzmannian description fails.
In the TGD framework the concept of a low entropy boundary condition does not make sense. Subjective evolution applies to the entire CD of cosmological size, quantum jump by quantum jump. Boltzmann's equations apply only in scales considerably shorter than the cosmological time scale. What is clear is that one can speak only of initial conditions rather than boundary conditions.
It is however not clear whether one can speak about the evolution of entropy as a function of cosmic time if it is identified as a coordinate of the imbedding space. Quantum classical correspondence might allow also the mapping of the subjective time evolution to a geometric time evolution with respect to cosmic time. The low entropy of the very early universe could correspond to that assignable to cosmic strings. The energy density of cosmic strings goes down as 1/a^2 and the entropy density as 1/a, so that for a given comoving volume the entropy approaches zero at small a. The structure of the moduli space of CDs suggests that the position of the upper tip of a CD relative to the lower one defines a discretized cosmic time, and that the space-time correlate for entropy growth is the growth of the entropy of a CD as a function of this time in an ensemble of CDs. The asymmetry between the tips could be seen as a correlate for the arrow of time.
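The scaling argument can be checked with one-line arithmetic (a sketch assuming only the scalings stated above: entropy density ~ 1/a and physical volume of a comoving region ~ a^3):

```python
# Entropy in a fixed comoving volume: (entropy density ~ 1/a) * (volume ~ a^3)
def comoving_entropy(a, s0=1.0):
    """Entropy of a comoving region, arbitrary units, given s ~ s0/a."""
    return (s0 / a) * a**3   # net scaling ~ a^2

for a in (1e-3, 1e-2, 1e-1, 1.0):
    print(a, comoving_entropy(a))
```

The net a^2 scaling makes the comoving entropy vanish in the a → 0 limit, which is the sense in which a cosmic string dominated very early cosmology is a low entropy state.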
Carroll's idea about boundary conditions in the future might make sense in the following sense. In zero energy ontology one has pairs of positive and negative energy states, and there is a large temptation to think that there are two choices for the tip which corresponds to the discrete version of the future light-cone.
16. Why aren't we fluctuations in de Sitter space?
If I have understood correctly the emotional rhetoric of Lubos, the idea of Carroll seems to be that intelligent life is just a random fluctuation rather than a long lasting evolution. For some reason he locates this fluctuation in de Sitter space. In the standard physics framework this view is however more or less unavoidable. The colleagues should really use some of their time to learn what we understand, and what we do not understand, about consciousness and the brain, in order to realize that physics as they understand it really fails to describe the physics of life.
Also Lubos is so fixated on his materialistic and reductionistic dogmas that he is unable to propose anything constructive. For instance, he does not ask how this undeniable evolution is possible in the framework of standard physics.
In the TGD framework the hierarchy of Planck constants, meaning a hierarchy of macroscopic quantum phases and a hierarchy of time scales of memory and intentional action, leads to a coherent overall view about what life is. Zero energy ontology provides a concrete realization of how volition is realized in accordance with the laws of physics and makes possible a continual re-creation of the Universe.
17. How do we compare probabilities for different classes of observers?
I do not repeat the violent reaction of Lubos to this question. I am myself not at all sure whether I catch the meaning of this question. Maybe I could interpret it in terms of finite measurement resolution. Different measurement resolutions give rise to different M-matrices and probabilities, and the comparison would require rules allowing one to compare these probabilities. This comparison requires a relationship between the M-matrices at the quantum level: probabilities are not enough. Renormalization group evolution as a function of measurement resolution could provide the answer to how to compare the probabilities.
18. What rules govern the evolution of complex structures?
The textbook answer of Lubos is: "The detailed evolution of all complex structures is governed by the microscopic laws that govern the elementary building blocks, applied to a large number of ingredients".
The TGD inspired answer is based on the acceptance of fractal hierarchies: the reductionistic dogma is replaced with fractality. The laws at various levels are essentially similar, but every level brings in something new: the Mandelbrot set does not look exactly the same in a new zoom. It is not possible to reduce the behavior at the higher levels to that at the lowest level.
The hierarchy of infinite primes characterizes this idea number theoretically and, as there are reasons to believe, also physically. The construction of hyper-octonionic infinite primes is structurally similar to a second quantization of an arithmetic quantum field theory with states labeled by primes (rational, quaternionic, or octonionic). There is an infinite hierarchy of second quantizations, with many-particle states of the previous level becoming single-particle states of the new level. At each level one has infinite primes analogous to free many-particle states plus primes analogous to bound states.
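The analogy with second quantization can be made concrete with a standard toy model (my illustration of the arithmetic-QFT analogy, not the hyper-octonionic construction itself): read the prime factorization of an integer as a Fock state, with the occupation number of "mode" p given by the exponent of p. Multiplying integers then composes free many-particle states by adding occupation numbers:

```python
from collections import Counter

def fock_state(n):
    """Prime factorization of n read as {mode p: occupation number}."""
    occ, p = Counter(), 2
    while p * p <= n:
        while n % p == 0:
            occ[p] += 1
            n //= p
        p += 1
    if n > 1:               # leftover prime factor
        occ[n] += 1
    return occ

# A prime is a single-particle state; a general integer is a free
# many-particle state, and multiplication adds occupation numbers:
print(fock_state(12))       # occupations {2: 2, 3: 1}
assert fock_state(12 * 35) == fock_state(12) + fock_state(35)
```

Iterating this step, so that the many-particle states of one level become the single-particle labels of the next, gives the flavor of the infinite hierarchy of second quantizations mentioned above.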
One new element of emergence is association statistics. Permutations and associations are basic stuff of number theory and algebra. Quantum commutativity, the invariance of the physical state under permutations in the quantum sense, leads to Fermi, Bose, and quantum group statistics in the effectively 2-D situation. Quantum associativity requires association statistics with respect to the different associations of particles (replacing A(BC) with (AB)C can induce multiplication by +1, -1, or a more complex phase).
At the space-time level the hierarchy of space-time sheets is the counterpart of this hierarchy. p-Adic length scales define one hierarchy. Also space-time sheets characterized by a large value of Planck constant emerge as systems migrate to the pages of the Big Book partially characterized by an increasing value of Planck constant, at which matter is dark relative to an observer with the standard value of Planck constant, corresponding to the rational number 1.
There is also a hierarchy of cognitive descriptions of the physical system. The higher the level in the hierarchy, the more abstract the description and the fewer details it has. This is like the view of the boss of a big company as compared to that of a person doing some very concrete job.
p-Adic physics turns the reductionistic hierarchy, proceeding from short to long scales, upside down. What is infinitesimal p-adically is infinitely large in the real sense. This p-adic aspect is necessary if we want to understand intentional systems able to plan their own behavior. Effective p-adic topology means precise long range correlations and short range chaos, which indeed characterize the behavior of living matter. One can also say that p-adic physics provides the IR completion of physics.
19. Is quantum mechanics correct?
Quantum mechanics is not wrong. It however requires a profound generalization if we want to understand life. Also the gravitational anomalies and unexpected regularities at the level of the planetary system suggest a generalization. Planck constant must be replaced with a hierarchy of Planck constants realized in terms of the "Big Book". Positive energy ontology must be replaced with zero energy ontology, for which states correspond to physical events of the standard positive energy ontology. The S-matrix is replaced with its "complex square root", the M-matrix, having an interpretation as a square root of the density matrix and making thermodynamics part of quantum theory. This generalization answers several frustrating questions raised in the standard ontology. A further important modification is the introduction of the notion of finite measurement resolution, realized in terms of inclusions of hyper-finite factors and having discretization as its space-time correlate.
20. What happens when wave functions collapse?
The answer of Lubos comes from the few pages of a standard quantum mechanics textbook devoted to the measurement problem: "A wave function is nothing else than a tool to predict probabilities; it is no real wave. When such an object "collapses", the only thing that it means is that we learned something about the random outcomes of some measurements, so we may eliminate the possibilities that - as we know - can no longer happen. For our further predictions, we only keep the probabilities of the possibilities that can still happen."
This answer brings in "we" but says nothing about what this "we" might be. This "we" remains an outsider to the physical world. Here we encounter the amazing inability of even admittedly intelligent persons to see the problem although it is staring them directly in the face.
In the TGD framework wave function collapse is involved with the quantum jump re-creating the quantum universe. Speaking in terms of space-time correlates, this means that the entire space-time surface (or rather a quantum superposition of them) is replaced with a new one. Both the geometric past and the geometric future are replaced with new ones in the quantum jump. There is no conflict with the deterministic field equations (in the generalized sense of the TGD framework), since the non-determinism relates to subjective time, identified as a sequence of quantum jumps, rather than to the geometric time appearing in the classical field equations and the Schrödinger equation.
Negentropy Maximization Principle, stating that the reduction of entanglement entropy in the quantum jump is maximal, implies standard quantum measurement theory. There are fascinating possibilities opened by the fact that for rational and even algebraic entanglement probabilities number theoretic analogs of Shannon entropy make sense and allow negentropic entanglement (the emergence of information carrying stable quantum entangled states).
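The number theoretic entropy can be computed explicitly for rational probabilities. A sketch, assuming the definition S_p = -sum_k P_k log_p |P_k|_p with |.|_p the p-adic norm (my reading of the p-adic variant of Shannon entropy; the only point is that S_p can be negative):

```python
from fractions import Fraction

def padic_valuation(q, p):
    """Largest k with p**k dividing the rational q (negative if p divides
    the denominator); the p-adic norm is |q|_p = p**(-k)."""
    num, den, k = q.numerator, q.denominator, 0
    while num % p == 0:
        num //= p; k += 1
    while den % p == 0:
        den //= p; k -= 1
    return k

def np_entropy(probs, p):
    """S_p = -sum P_k log_p |P_k|_p = sum P_k * v_p(P_k); exact for rationals."""
    return sum(P * padic_valuation(P, p) for P in probs)

probs = [Fraction(1, 4)] * 4          # maximally entangled 4-dimensional case
print(np_entropy(probs, 2))           # -2: negative, i.e. negentropic
```

For the same distribution the ordinary Shannon entropy is +2 bits, so the number theoretic variant can indeed assign negative entropy, i.e. information, to an entangled state.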
21. How do we go from the quantum Hamiltonian to a quasiclassical configuration space?
A more appropriate question would be "How does one go from a quantum description to a classical description?". Hamiltonian formalism relies on Newtonian time and is given up already in Special Relativity. In General Relativity, General Coordinate Invariance makes Hamiltonian formalism even more unnatural.
In zero energy ontology the basic mathematical object coding for the predictions of the theory is the M-matrix characterizing the physics inside a given CD. It decomposes into a product of a positive square root of a diagonal density matrix and a unitary S-matrix. The latter characterizes the given CD and need not have any natural representation as an exponentiation of an infinitesimal Hermitian operator, the Hamiltonian. Such a picture would also be in conflict with General Coordinate Invariance. In the p-adic context unitary evolution becomes highly questionable also for number theoretical reasons. The counterpart of the exponential function in the p-adic context does not have the properties it has in the real context, and the natural unitary operators involve roots of unity, typically requiring an algebraic extension of p-adic numbers, and therefore have no description as unitary time evolutions.
In a formalism without a Hamiltonian, observables are replaced with the algebras of various symmetries. Various super-conformal symmetries make these algebras infinite-dimensional. The modified Dirac equation brings in second quantization, which reduces to an infinite-dimensional analog of a space-time SUSY algebra.
How classical physics emerges from quantum theory is of course an extremely important unanswered question, although Lubos claims the opposite. This emergence has two meanings, corresponding to geometric time and subjective time.
- Consider first geometric time. In the TGD framework a space-time surface is a preferred extremal of Kähler action and analogous to a Bohr orbit. Classical physics in the geometric sense becomes an exact part of quantum physics and of the geometry of the World of Classical Worlds. This is forced by General Coordinate Invariance alone. Even more preferred space-time surfaces correspond to maxima of the Kähler function, identified as the value of Kähler action for a preferred space-time surface. In the mathematically non-existing path integral formalism the stationary phase approximation gives something believed to be enough for classical physics in this sense.
- Lubos talks also about decoherence as a mechanism leading to classicality. This notion applies when one speaks about subjective time. When the time scale of the observer is long as compared to the time scale of the observed events (the CD of the observer is much larger than those of the observed systems, so that quantum statistical determinism applies), decoherence taking place in sub-quantum jumps guarantees that all phase information is lost and quantum mechanical interference effects are masked out. The world looks classical in the Boltzmannian sense, but only for an observer looking at the situation from a longer time scale.
22. Is physics deterministic?
Determinism is not valid in the quantum universe, as Lubos states. Determinism is valid at the level of field equations. These statements are contradictory unless one realizes that there are two different times. To understand these two times and their relationship one is forced to make the observer a part of the Universe instead of an outsider, that is, to develop a quantum theory of consciousness. Amusingly, Lubos admits that the non-determinism is a fact but denies that the Schrödinger amplitudes, which must behave non-deterministically in the standard ontology, are real.
23. How many bits are required to describe the universe?
Currently around 10^100, says Lubos. For me both the question and its answer are nonsense, for the same reason as some other questions above. That people waste their time with this kind of question shows how desperately physics needs an extension to a theory of consciousness. This is required also by neuroscience and biology. Lubos identifies this number as the entropy of the observed Universe. The notion in principle makes sense, but not the identification. In the TGD framework the entropy depends also on the resolution used: the better the measurement resolution, the larger the number of degrees of freedom, and the larger the entropy.
24. Will elementary physics ultimately be finished?
The answer depends on what one means by "elementary particle" and what one means by "finished"! TGD predicts in principle an infinite hierarchy of scaled versions of what we have used to call elementary particle physics, corresponding to the hierarchies of p-adic length scales and Planck constants. The hierarchy of infinite primes suggests a generalization of the notion of elementary particle in which many-particle states of a given hierarchy level (space-time sheets) become single-particle states of the new level (space-time sheets topologically condensed at larger space-time sheets). The same universal mathematical description applies at all levels, but always something new emerges. Therefore my answer is a realistic "No".