Is this picture consistent with the picture based on n-furcations? This seems to be the case. The conservation of energy in an n-furcation, in which several sheets are realized simultaneously, is consistent with the conservation of classical conserved quantities only if the space-time sheet before the n-furcation involves n identical copies of the original space-time sheet, or if the Planck constant is h_{eff}=nh. This kind of degenerate many-sheetedness is encountered also in the case of branes. The first option means an n-fold covering of the imbedding space, and h_{eff} is indeed an effective Planck constant. The second option means a genuine quantization of Planck constant, due to the fact that the value of the Kähler coupling strength α_{K}=g_{K}^{2}/4πhbar_{eff} is scaled down by a factor 1/n. The scaling of Planck constant is consistent with the classical field equations, since they involve α_{K} as an overall multiplicative factor only.
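The scaling argument can be written out explicitly. Substituting hbar_{eff} = n·hbar into the definition of the Kähler coupling strength given above (this is just a restatement of the post's formula, not a new result):

```latex
% Kähler coupling strength with effective Planck constant \hbar_{eff} = n\hbar
\alpha_K \;=\; \frac{g_K^2}{4\pi\hbar_{\mathrm{eff}}}
         \;=\; \frac{g_K^2}{4\pi\, n\hbar}
         \;=\; \frac{1}{n}\cdot\frac{g_K^2}{4\pi\hbar}
```

so α_K is indeed scaled down by 1/n. Since α_K enters the classical field equations only as an overall multiplicative factor, multiplying it by 1/n leaves the set of their solutions unchanged, which is why the second option is classically consistent.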

For background see the chapter "Does TGD predict a spectrum of Planck constants" of "Towards M-matrix".

**Note**: The new (temporary) address of my homepage is http://www.tgdtheory.fi. The only change is the replacement of "com" with "fi", and one can get from any old link to the new address just by replacing "com" with "fi".

## 9 comments:

This seems interesting. Information as fundamental. Then it is not bound to matter or spinors? http://www.nature.com/news/physics-quantum-quest-1.13711

This sounds very good to my ears. Here is a generalized probability theory. Free paper. http://www-bisc.cs.berkeley.edu/zadeh/papers/GTU--Principal%20Concepts%20and%20Ideas-2006.pdf

The foundations of quantum theory deserve serious study. Unfortunately, most attempts to understand quantum mechanics start from the wave mechanics of the thirties, and that does not lead anywhere.

It is a pity that the recent "foundationalists" are usually not generalists. Either they try to reduce quantum to classical physics, or they try to get rid of probability amplitudes and end up with artificial constructions.

TGD has forced me to ask about foundations, because otherwise there would be no hope of quantitative predictions. In the "world of classical worlds" framework quantum states are *classical* spinor fields in WCW, so that everything except quantum measurement reduces to classical field theory, albeit in the infinite-dimensional context resulting when the pointlike particle is replaced with a 3-D surface in 8-D imbedding space. What is also beautiful is that the notion of spinor is purely geometric.

Quantum measurement theory proper in turn extends to a theory of consciousness. The nature of time is one of the unavoidable questions here.

To Ulla:

I associate Zadeh (the author of the article you linked to) with fuzzy sets (by the way, I have listened to a lecture by Zadeh in Liege). In TGD the notion of finite measurement resolution is akin to that of fuzziness.

Matti,

your metaphor of such a hierarchy as the pages of a book, including the idea that these are brane-like, is a good one. The actual possibility is ahead of the pack and closer to the next big idea in physics troubling most everybody in one view or another.

But consider the properties of a book, especially the older ones with some ink illustrations between the pages. After some years the images can bleed through a page and make a ghostly copy on other pages, sometimes even skipping one.

Such ideas (anything that does not treat Planck's constant as a sort of minimum that miraculously holds up a total theory, such as where it comes up in string theory) are not well accepted by standard physicists across many diverse model interpretations.

So the idea, while true in its own context yet not explained in depth, seems to be about forces that propagate as an inverse of n, not n^2, nor any that involve some power of 2^n in a real continuum, much like the image on the Shroud of Turin confirmed authentic of the time but hardly proven from a unique higher being in the flesh. Can we know the deeper theory or nature in these still or forever hidden realms? What is the substance of the universe is the usual question, along with where the information goes.

A narrow or reduced theory of action, as multiply or not over a focused region, is the same problem of one or many worlds, which in a more general case may be one idea. Riemann did not say that in these matters his geometry was complete.

So too, as Ulla points out (though I have not read the link), it is also about probabilities when it comes to the uniqueness of "the class of things as an irreducible number that is a prime, including Gaussian and so on." What makes this aspect of primacy possible and generally unique in the first place? What forbids the pages from not being dissimilar, as say a page on the internet that in its unfolding flow has turned into an almost unmanageable mess of complexity?

This would suggest that the pages are each p-adic in fractal complexity, and that the entanglement is only observable on one shell or layer of a higher particle, atom, or group of galaxies, exploding but closed and finite over the shell.

So where the story has a flow we follow it in our awakening minds and developing theories as best we can, restrained perhaps by unique genetic scaffolding and endowment, but certainly in the creative cortex of our dreams and imagination, which at first blush does seem a seat of consciousness or memory of some type, even predictability.

Is the action in a reduced linear sense, or over the M-like field, different in informational and physical content? Lubos posts that string theory explains quantum gravity and that other theories, being incomplete, are just wrong. What is a trivial reduction to one camp is actuality to the other. Rotating Feynman diagrams does not show the deep nature of asymmetry as part of the picture, nor the short range of entanglement, nor the coupling-strength hierarchy that acts like an overall multiplicative factor of possible things outside it.

Philosophically, locally for example, we debate that something can only come from something and that nothing leads to nothing; but as dark matter concepts hint, and as many try to relate to them conceptually, nothing can lead to something and something to nothing. Ones and zeros, and so on. But do we not ask such questions scientifically and mathematically of the universe?

http://dreamikins.blogspot.com/2013/09/gravity-and-dark-flow-pushing-against.html

Some ideas put into a more poetic form.

Vitiello also uses fuzziness. This probability is not the usual one, and I suggest you study it :)

Now that I have studied these things, it seems to me that things went wrong already in the thirties. Can it really be the war and the bomb that forced physicists to 'hide' the truth, as some claim?

I recently went back to how Feynman got going with his diagrams, working on the Lamb shift in spectral lines. Bethe had the problem as a Jacobi Last Multiplier K, which is more general than Hamilton-Jacobi theory, assuming no potential theory. All 1930s atomic models had point potentials for the nucleus, which is actually lame.

Feynman's corrections then cover just permittivity and permeability, defining the actual speed of radio waves inducing the shift, or frequency-dependent resistance for a current. This gives relativistic sheeting, and Kähler theory was then developed for X-ray crystallography.

Current JLM theory:

arxiv.org/abs/0809.0022v1

How to restart string theory from Nambu:

arxiv.org/abs/math/990e128v3

"Jacobi brackets are what we gat in the binary case" and algebraic treatment is possible. This could open up von Neumann Type I/II/III theory at last....

Hi Ulla!

The Victorians chased Grassmann into action-angle variables, which were nowhere in QM, and "scientific materialism" triumphed at last. Until the gauge phase took Weyl by surprise and tracked the SM. Vitiello et al. run it down in mixing angles, which is very interesting.

When Dirac spoke of an infinite "sea" of negative-energy states, the community went blank due to finitism in logic, until constructive or intuitionist logic caught up, around the mid-90s. But Newton already intuited a dual aether, a concession to the magnetic vortices of Descartes.

And Matti seems to have another angle on this very deep problem, related to the cost of measurement and cost-register automata: http://arxiv.org/abs/1111.0670
