Wednesday, July 26, 2006

My countryman in Wikipedia

I learned from an anonymous comment on a previous posting that Lubos Motl has written something in Wikipedia about the Finnish physicist Matti Pitkanen, whom I happen to know quite well. Knowing Lubos, the text could have been much nastier, and I am proud that Lubos takes the trouble to write something about my countryman, who is probably not perceived as any threat to string hegemony. The text below is the stub by Lubos.
Matti Pitkanen is a Finnish alternative theoretical physicist who has attempted to prove the Riemann hypothesis, worked with p-adic numbers, and proposes an unusual theory called TGD that no other physicist understands.
I would like to suggest a couple of corrections. Pitkanen proposed a "strategy for proving the Riemann hypothesis" (as a matter of fact, a proposal for a sketchy proof based on the identification of the zeros as the spectrum of conformal weights of a certain conformally invariant physical system: I understand why he chose the cautious formulation). I happen to know that Pitkanen is still working intensively with p-adic numbers and has some strange ideas about how to generalize the notion of number by fusing reals and p-adics into a larger structure. He also seems to believe that p-adic physics could provide the physics of cognition and intentionality. I would like to complete the stub, but I had better not. I still remember the bloodthirsty furor stimulated by my attempt to fill in the stub about the TGD inspired theory of consciousness, which is also one of the great passions of my countryman but is not mentioned in the stub.


At 3:09 AM, Anonymous Philippe VIOLA said...

Hi, Matti.

I quickly realized you're someone with wide scientific knowledge. So, may I give a critical analysis of your work that is intended to be constructive and will probably help you understand the scientific community's position towards your work?
1) In the texts you put online, whatever the chapter, there is a legion of suppositions, hypotheses, and intuitive deductions. The conditional is the usual form. I would much prefer equations and step-by-step rigorous mathematical proofs. Otherwise, nobody can take what you propose for granted, even if it's correct.
2) I had a look at your paper on the quantization of Planck's constant. You base your reasoning on a fundamental paper by Da Rocha and Nottale. I downloaded the article in question. In there, there is no such quantization. The quantization formula does indeed involve the scattering coefficient D, but: a) Da Rocha & Nottale DO NOT identify their D with ihbar/2m as such; they only say their gravitational equation is SCHRODINGER-LIKE, not SCHRODINGER'S!!! b) Their D remains real-valued; c) the introduction of the integer n in their formula DOES NOT apply to D at all: the eq. (22) they give is
E_n/m = -G²m²/(8D²n²) = -ω_0²/(2n²).
Consequently, only the energy is quantized, which is perfectly normal... If there had been some quantization process of the scattering coefficient, as you suggest, we would have had a D(n) instead of D in the above formula, and D itself would have been n-dependent. Of course, you can always read the product Dn as D(n), but I don't think that is the spirit of the work, basing myself on the reasoning that precedes this formula.
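The algebra behind eq. (22) is easy to check: the equality of the two forms forces ω_0 = Gm/(2D). A quick numerical sanity check (the values of G, m, and D below are arbitrary placeholders, not physical inputs from the paper):

```python
# Check that -G^2 m^2 / (8 D^2 n^2) equals -omega_0^2 / (2 n^2)
# when omega_0 = G*m/(2*D), as the equality in eq. (22) requires.
# G, m, D are arbitrary placeholder values, not fitted to anything.
G, m, D = 6.674e-11, 1.0e30, 2.5e15
omega0 = G * m / (2 * D)

for n in range(1, 6):
    lhs = -G**2 * m**2 / (8 * D**2 * n**2)
    rhs = -omega0**2 / (2 * n**2)
    assert abs(lhs - rhs) <= 1e-12 * abs(lhs)

print("identity holds for n = 1..5")
```

Note that n enters only through the overall 1/n² factor; D itself never acquires an n-dependence in this identity.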
3) As I already pointed out to you in a former email (and you were, besides, not convinced by a complex velocity either), the Lagrangian Nottale proposes, and that leads him to his derivation of the Schrödinger equation, is NO LONGER invariant. By Noether's theorem, this means that physical quantities are NO LONGER CONSERVED, or, to put it only slightly differently, that his Lagrangian represents NOTHING PHYSICALLY OBSERVABLE ANYMORE...
Maybe he will argue that Noether was wrong, after all?...

I face similar difficulties to yours, but my own situation is completely different: I'm a SELF-TRAINED mathematical physicist... Yet, apart from conceptual mistakes on my part in trying to find alternative models to existing ones too, my reasonings are exclusively based on CALCULUS and RIGOROUS MATHEMATICAL PROOFS. Nowhere in my work will you find hypotheses, suppositions, and even less speculations.
This needlessly squanders your own capacities. And I honestly regret it.
Stop theoretical suppositions, do 100% rigorous work based on calculus alone, and try your chance again. Maybe you'll find an ear this way. I hope so for you, mate. :-)

At 6:35 AM, Blogger Matti Pitkanen said...

I think that we have different views about what the construction of a physical theory means. It would be very easy to postulate a simple Lagrangian, deduce Feynman rules, and calculate cross sections. Unfortunately, this simple linear approach has failed to produce insights about the really interesting problems during the last thirty years. We have the standard model but no understanding of why it has the symmetries it has, no understanding of the particle mass spectrum, etc. Superstring theory tried to continue this hyper-technical approach, and I think that its proponents are ready to admit the failure of this approach and are realizing that nothing less than a paradigm shift is needed.

The fundamental questions are simply such that the standard Lagrangian approach is useless: the questions of why the standard model has its symmetries, what the origin of conformal invariance is, what gives rise to the quantization of masses, how Poincare invariance could be consistent with gravitation, etc., require a completely new level of thinking. Calculus provides no help with this kind of problem, since one of the many challenges is to build the calculus!

My own approach to these problems is essentially pattern completion/bootstrap based on the new paradigm defined by the fundamental assumption that space-times are surfaces in M^4xCP_2. I wholeheartedly, even proudly, admit that this approach is almost the diametrical opposite of the process in which results are deduced by formal symbol manipulations from an action principle. After all, I am doing my best to test what the fundamental postulates could be! The strategy is to develop the new ontology in detail, in close interaction with the information provided by the rich spectrum of anomalies.

That this approach is not completely futile is demonstrated by the fact that Lubos Motl takes the trouble to comment on it without dismissing it as crackpottery. He does not treat loop quantum gravity in the same manner. And we must remember that Lubos Motl must choose his wording carefully: a certain purposeful ambivalence is the only strategy in a situation when you are regarded as a soldier in the troops of the dominating theory. The claim that no one understands TGD is part of this policy. Of course, there are many people who can understand what I am saying. The barriers are basically psychological.

Concerning Nottale's approach and mine: the only input in my approach is the empirical evidence for the quantization of planetary orbital radii according to Bohr rules, and my interpretation is completely different from that of Nottale. I am happy that I learned from Nottale's work the formula for a gravitational Planck constant consistent with the Equivalence Principle. The rest is definitely something totally different.

With Best Regards,


At 11:13 AM, Anonymous Philippe VIOLA said...

"After all, I am doing my best to test what the fundamental postulates could be!"

I can see that, and that's why I'm giving you my personal point of view on your approach. :-)
But you're a bit hard on theoretical physics: high-energy physics is not limited to superstring models, and you know it.
On the mass hierarchy, the Gell-Mann-Okubo formula is rather good, isn't it? As are QED, GSW, and QCD, no? Great advances in nuclear physics have been made these last 30 years thanks to these theories.
In the late 1960s, string models were already bad. Adding supersymmetry to them could only give something worse.
Physics is one thing; physical sectarianism is something else with no connection at all.
You wanna do something different? I can only strongly encourage you in this way!
You wanna explore a new kind of calculus? Excellent!
But you're not technical enough for that: you are a THEORETICIAN. As such, you have to FORMALLY PROVE EVERYTHING YOU INVESTIGATE, EVERY SINGLE ASSERTION YOU MAKE. You have to give people the MEANS TO CHECK YOUR DEDUCTIONS.
Look at the paper preceding this one on your blog, where you speak of rotating magnetism: I didn't even read more than a few lines. Why? Because there is no means of checking what you say. It may be correct, it may be wrong. How can we know? Do you have a technical paper, with a FORMAL THEORY followed by CALCULATED FORECASTS?
See what I mean? :-)

At 1:21 PM, Anonymous Philippe VIOLA said...

Let me finish my reply, interrupted by dinner. :-)
My mistake: there's a technical ref at the end of your paper on the magnetic anomaly. I'll have a look at it.
About the quantization of Planck's constant:
1) there's no need for such a quantization, since one has long existed already: the famous Sommerfeld-Wilson conditions J_k = h.n_k, where the n_k's are integers. And it's not a quantization of h, but of Delaunay's phase-action variables. A further quantization of h would give h(n) and lead to what? If there were such a nonlinear dependence of the J_k's on the n_k's, it would already have been experimentally detected, especially at large scales. Do not forget, indeed, that Delaunay's treatment of separable systems was originally made for planetary motions...
2) A quantization of h is neither physically nor mathematically consistent with the foundations of quantum theory, the historical development being: Delaunay -> Sommerfeld-Wilson -> Planck -> Einstein on Sommerfeld-Wilson -> De Broglie for the wave-mechanical approach -> Schrödinger for the statistical wave interpretation.
There is neither place nor need for a further quantization of h in all these developments.
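The point in 1) can be illustrated with the textbook circular Kepler orbit: quantizing the action variable (∮ p_φ dφ = 2πL = n·h, i.e. L = nħ) quantizes the orbital energy as E_n ∝ -1/n², while h itself stays a fixed constant. A minimal sketch, with arbitrary placeholder masses (not a model of any real system):

```python
import math

# Sommerfeld-Wilson quantization of the circular Kepler orbit:
# oint p_phi dphi = 2*pi*L = n*h, so L = n*hbar; h itself is never quantized.
# G, M, m below are arbitrary placeholder values, purely illustrative.
G, M, m = 6.674e-11, 2.0e30, 6.0e24
hbar = 1.055e-34

for n in (1, 2, 3):
    L = n * hbar                       # quantized action variable
    r_n = L**2 / (G * M * m**2)        # circular orbit: m v^2 / r = G M m / r^2
    E_n = -G * M * m / (2 * r_n)       # total energy of the circular orbit
    # the energy comes out proportional to -1/n^2, with h held fixed
    assert math.isclose(E_n, -(G * M * m)**2 * m / (2 * (n * hbar)**2))

print("E_n scales as -1/n^2 with h fixed")
```

Only the action J (and hence the energy) carries the quantum number n; h enters every orbit with the same value.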
I may be wrong, as usual, but if dark matter were, as you suggest, a large-scale quantum state of matter, I think its effects would have been detected long ago too, at least through coherence phases. Look at neutron stars, dwarfs, and even black holes. Surely a cosmos-wide superfluid would not remain undetected for long...
That's what makes me seriously dubious about the existence of that so-called "dark matter", plus additional theoretical arguments on our fundamental models, including classical GR, for the description and understanding of the observable universe.

At 8:55 PM, Blogger Matti Pitkanen said...

"But you're a bit hard on theoretical physics: high-energy physics is not limited to superstring models, and you know it.
On the mass hierarchy, the Gell-Mann-Okubo formula is rather good, isn't it? As are QED, GSW, and QCD, no?"

What I was talking about was the theorizing done after the standard model: GUTs, supersymmetric theories, string models. And I believe that some of the mathematical structures discovered during these years will remain a part of the future theory.

By the way, the Gell-Mann-Okubo formula might be an accident, since flavour SU(3) itself can be seen as a merely accidental symmetry. p-Adic mass calculations provide a different origin for the formula and predict the low-lying hadron masses with better accuracy. At high energies new exotic states are predicted, and there are anomalies, swept under the carpet over the years, giving support for them (the Aleph anomaly, the bumps in the mass distribution of the top quark).

About rotating magnetic systems: these blog postings are simply short summaries. There is also a detailed article of about 100 pages at my homepage, providing a detailed model for the loss of weight based on catastrophe theory and making some predictions.

But this kind of accuracy is only formal accuracy. The really challenging part of the work is the qualitative analysis: constructing a coherent view of the classical and possible new physics of this very complex system, comparing various options, and, most importantly, tying the explanation of this anomaly to other anomalies. An explanation of just a single anomaly cannot be taken seriously. An anomaly whose very existence is questionable is for a scientist able to tolerate uncertainties and to work holistically, not for a blind mathematician!

Best Regards,

At 10:45 PM, Blogger Matti Pitkanen said...

A comment related to the quantization of Planck constant.

a) The quantization of Planck constant while staying in the framework of wave mechanics would of course be nonsense. The mathematical framework involved is much, much more general: von Neumann algebras known as hyperfinite factors of type II_1 is the technical term. Quite a profound generalization of the notion of the 8-D imbedding space H=M^4xCP_2 is needed in order to fuse the physics with different Planck constants into a single one. This relates closely to quantum groups, non-commutative spaces, conformal field theories,...

b) The quantization of Planck constant implies fractionization of the integer m characterizing the angular momentum projection. Anyons and fractional statistics are established physics, and a possible alternative interpretation would be in terms of the quantization of h. The fractional quantum Hall effect could be the integer quantum Hall effect with an increased Planck constant. But this is something that should be worked out.

c) Phases with a large value of the quantized Planck constant can be identified as dark matter. Quite many physicists agree about its existence, and there are detailed maps of its distribution, but we know practically nothing about its physics that would make it "visible" by experimental means. Planetary Bohr orbits and other similar effects claimed by Nottale and others could be seen as effects of quantum coherent dark matter on the distribution of visible matter.

For instance, the model for the Bohr quantization of planetary orbits predicts strong number theoretical constraints on the ratios of planetary masses (ratios of the integers n defining the n-polygons constructible using only ruler and compass), satisfied with 10 per cent accuracy, and correctly predicts the ratio of the densities of visible matter and of dark matter+energy. More generally, the prediction that the preferred values of n correspond to these polygons amounts to very strong testable predictions. The hypothesis that quantum coherent dark matter controls biomatter is a testable one. A good example is the model of EEG predicting a fractal hierarchy of scaled EEGs and thus a hierarchy of biorhythms.
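For readers wanting to check which n qualify: by the Gauss-Wantzel theorem, a regular n-gon is ruler-and-compass constructible exactly when n is a power of two times a product of distinct Fermat primes (3, 5, 17, 257, and 65537 are the only Fermat primes known). A small sketch enumerating these preferred values of n:

```python
# Gauss-Wantzel criterion: a regular n-gon is constructible with ruler and
# compass iff n = 2^k * (product of distinct Fermat primes).
FERMAT_PRIMES = [3, 5, 17, 257, 65537]  # the only Fermat primes known

def is_constructible(n: int) -> bool:
    if n < 3:
        return False
    while n % 2 == 0:          # strip the power-of-two factor
        n //= 2
    for p in FERMAT_PRIMES:
        if n % p == 0:
            n //= p
            if n % p == 0:     # each Fermat prime may appear at most once
                return False
    return n == 1

constructible = [n for n in range(3, 30) if is_constructible(n)]
print(constructible)  # -> [3, 4, 5, 6, 8, 10, 12, 15, 16, 17, 20, 24]
```

Whether planetary mass ratios actually prefer these n is of course the empirical claim of the text; the snippet only makes the number-theoretic condition explicit.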

With Best Regards, Matti

At 1:40 AM, Anonymous Philippe VIOLA said...

Quite many physicists will always prefer to introduce some kind of "exotic" speculation rather than accept that their models may be wrong. This is true for everything and remains true for the dark matter hypothesis. Instead of telling themselves "we may be wrong", they say "there's something wrong with the Universe"...
I'm 100% for a Bohr-like model of planetary orbits. But I remain deeply convinced that some dark matter, if quantum, would necessarily induce visible perturbations in ordinary matter. Moreover, it would produce energy.
So, what do we have? We have 95% of the matter missing, and this 95% is attributed to a new kind of matter that is soooo exotic that it cannot be detected and produces no detectable energy...
What the heck can such matter then be, if not darker than darkness itself???
I prefer telling myself that the darkness is definitely in the physicists' minds, not in the cosmos...

At 3:58 AM, Blogger Matti Pitkanen said...

The question is about what interpretation one adopts for the anomalies, and the situation here is very similar to that for quarks. All the evidence for quarks is indirect, but very few of us question the reality of quarks anymore.

Dark matter is visible via its gravitational interaction if one accepts the standard view of gravitation. If one accepts the TGD based view, it is detectable via very many other effects, and one ends up with a quantum biology in which dark matter implies a huge number of testable effects, many of them essential for the model of, say, EEG.

Detectability is possible only if we know how to detect: solitons are a second excellent example here. All this is about the belief system that one is ready to adopt to interpret the empirical facts, and the most effective belief system wins in the long run.


At 11:55 PM, Anonymous Philippe VIOLA said...

"Detectability is possible only if we know how to detect: solitons are a second excellent example here."

In which model? Not in classical GR: a mathematical theorem of Lichnerowicz from the early 1950s showed that there are no gravitational solitons in GR.

"All this is about the belief system that one is ready to adopt to interpret the empirical facts, and the most effective belief system wins in the long run."

For centuries, not to say millennia, physical theories were built to model observational facts. Since GR, we have started to build theories first, then confront them with observation. Today, as we investigate energies that are far beyond our facilities, we can no longer test our models. So the procedure has become to build speculations first, then confront them with theories, and adapt the theoretical models so that they suit their creators.
Where is physics today?
What kind of "science" are we now talking about, other than permanent self-satisfaction, mutual congratulations, and, above all, unique thinking? :-(

At 2:02 AM, Blogger Matti Pitkanen said...

I mentioned solitons as a general example: only a few decades ago no one knew about solitons. When they were discovered (or rediscovered), they were suddenly seen everywhere.

Particle physics is about testing theories, since the signal-to-noise ratio is so high. It is, however, the stubborn belief in reductionism which has led to the catastrophic situation in string models. If you believe that the problem is to extend physics from the electroweak length scale to the Planck scale, then experimental testing is out of the question, and theorizing reduces to reckless speculation, with the only possible hope relating to possible large extra dimensions.

If you are ready, or forced, to give up the reductionistic dogma, you suddenly have an immense spectrum of anomalies covering length scales from particle physics to cosmology. Theorists could not hope for anything better. If you are ready to take consciousness seriously as well, the situation improves further.

I would like to mention the latest anomaly that I learned of. In the most recent New Scientist there is an article about evidence that the objects believed to be black holes might not actually be black holes: they possess a magnetic field. This is what TGD predicts for the asymptotic state of a star as a rotating, dynamo-like object. The model is based on a very simple assumption: the gravitational four-momentum current (related simply to the Einstein tensor) is conserved, as one can expect in a stationary situation. This gives field equations analogous to minimal surface equations, with the metric replaced by the Einstein tensor.


At 4:19 PM, Anonymous Philippe VIOLA said...

Sorry, Matti, maybe it's because it's late (or rather early in the morning ;-) ), but I don't see the connection between having a magnetic field and not being a black hole. Where's the contradiction? In the no hair theorem? It's usually assumed that a black hole only keeps its mass, electric charge, and angular momentum. With a charge, you create an electric field. With the electric field, you create a magnetic one (Rot E = -dB/dt). So, where's the point? 8-(

At 9:28 PM, Blogger Matti Pitkanen said...

I blindly repeated the statement in the New Scientist article. If a magnetic dipole moment can be counted as hair, then "black holes have no hair" excludes a magnetic field. A static (as opposed to stationary, as in the TGD framework) situation does not allow a rotational E, since one must have dB/dt=0.

I also looked at whether it might be possible to have vacuum extremals allowing the imbedding of a piece of a magnetic dipole field in the TGD framework. This does not seem plausible. In principle, a generic 4-surface allows the imbedding of a piece of an arbitrary gauge potential. Vacuum extremals have, however, a 2-D CP_2 projection, so that only two CP_2 coordinates are available, and this leads to too strong integrability conditions.

The construction of dynamo-like solutions as vacuum extremals is however rather easy by using axial symmetry and stationarity, and the TGD variant of the field equations, given by D_a(G^{ab}\partial_b h^k)=0 and saying that gravitational four-momentum is locally conserved in a stationary situation, leads to the TGD counterpart of black hole solutions. Its properties resemble those of "magnetars". What is unexpected is that the density of gravitational mass is concentrated on a spherical shell.
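Written side by side, the analogy stated above is (g the induced metric, G the Einstein tensor, h^k the imbedding space coordinates; this merely restates the replacement described in the text, it is not a new derivation):

```latex
% minimal surface equations vs. their TGD variant (metric -> Einstein tensor)
D_a\!\left(g^{ab}\,\partial_b h^k\right) = 0
\qquad\longrightarrow\qquad
D_a\!\left(G^{ab}\,\partial_b h^k\right) = 0
```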


At 7:50 AM, Blogger Lumo said...

Dear Matti,

just to be sure: I wrote it because I was going through the list of "articles demanded in the physics category", for which, at that time, I was writing stubs for every entry I could.

Hope that you found the stub to be a good and objective starting point. ;-)

All the best

