Tuesday, May 03, 2011

The puzzling situation in dark matter searches

Sean Carroll has explained in Cosmic Variance the latest rather puzzling situation in dark matter searches. Some experiments support the existence of dark matter particles with a mass of about 7 GeV, while other experiments exclude them. The following arguments show that the TGD based explanation allows one to understand the discrepancy.

How to detect dark matter and what's the problem?

Consider first the general idea behind the attempts to detect dark matter particles and how one ends up with the puzzling situation.

  1. The galactic nucleus serves as a source of dark matter particles, and these one should be able to detect. There is also an intense cosmic ray flux of ordinary particles from the galactic center which must be eliminated so that only dark matter particles, interacting very weakly with matter, remain in the flux. The elimination is achieved by going sufficiently deep underground: ordinary cosmic rays are shielded away while the extremely weakly interacting dark matter particles remain. After this one can, in the ideal situation, record only the events in which dark matter particles scatter from nuclei, provided one eliminates events such as neutrino scatterings.
  2. The DAMA experiment does not detect dark matter events as such but an annual variation in the rate of events, which can include, besides dark matter events, also other kinds of events. DAMA finds an annual variation interpreted as a dark matter signal, since other sources of events are not expected to show this kind of variation. Quite recently also CoGeNT reported an annual variation at 2.8 sigma confidence level (the official report was presented on Thursday at the STSI Symposium). The mass of the dark matter particle should be around 7 GeV rather than hundreds of GeV as required by many models. An unidentified noise with annual variation having nothing to do with dark matter could of course be present, and this is the weakness of this approach.
  3. A few weeks ago we learned that the XENON100 experiment detects no dark matter. Also the CDMS results exclude dark matter particles with masses around 7 GeV. According to Sean Carroll, the detection strategy used by XENON100 is different from that of DAMA: individual dark matter scatterings on nuclei are detected. This is a very significant difference which might explain the discrepancy, since theory-laden prejudices about what a dark matter particle scattering can look like could eliminate the particles causing the annual variations. For instance, these prejudices are quite different for the inhabitants of the mainstream Universe and of the TGD Universe.
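The annual-modulation analysis behind the DAMA and CoGeNT claims can be illustrated with a small sketch: the event rate is fitted to R(t) = S0 + Sm cos(2π(t − t0)/T) with T one year, and the signal claim rests on the modulation amplitude Sm differing from zero. The numbers below are synthetic, chosen only for illustration; they are not actual DAMA or CoGeNT data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Annual-modulation template used in DAMA-style analyses:
# R(t) = S0 + Sm * cos(2*pi*(t - t0)/T), with T = 1 year.
def rate(t, S0, Sm, t0):
    return S0 + Sm * np.cos(2 * np.pi * (t - t0) / 365.25)

rng = np.random.default_rng(0)
t = np.arange(0, 4 * 365.25, 7.0)        # four years of weekly bins
true = (1.0, 0.02, 152.5)                # illustrative values; peak in early June
data = rate(t, *true) + rng.normal(0, 0.005, t.size)

popt, pcov = curve_fit(rate, t, data, p0=(1.0, 0.01, 100.0))
S0, Sm, t0 = popt
print(f"S0={S0:.3f}, Sm={Sm:.4f}, t0={t0:.1f} days")
```

A nonzero fitted Sm with the expected phase is what both experiments report; the worry voiced in the text is precisely that some unidentified background could produce the same fit.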

TGD based explanation of the DAMA events and related anomalies

I have earlier commented on the possible interpretation of the DAMA events in terms of tau-pions. The spirit is highly speculative.

  1. Tau-pions would be identifiable as the particles claimed by the Fermi Gamma-ray Telescope, with mass around 7 GeV and decaying into tau pairs, so that one could cope with several independent observations instead of only a single one.
  2. Recall that the CDF anomaly gave support for tau-pions two and a half years ago, whereas earlier anomalies dating back to the seventies give support for electro-pions and mu-pions. The existence of these particles is a purely TGD based phenomenon due to a different view about the origin of color quantum numbers. In TGD colored states would correspond to partial waves in CP2, whereas in standard theories color is a spin-like quantum number, so that leptons would not have colored excitations.
  3. Tau-pions are of course highly unstable and would not come from the galactic center. Instead, they would be created in cosmic ray events at the surface of the Earth, and if they can penetrate the shielding eliminating ordinary cosmic rays, they could produce the events responsible for the annual variation, which would reflect the annual variation of the cosmic ray flux from the galactic center.

Can one regard tau-pion as dark matter in some sense? Or must one do so? The answer is affirmative to both questions on both theoretical and experimental grounds.

  1. The existence of colored variants of leptons is excluded in standard physics by the decay widths of intermediate gauge bosons. They could however appear as states with a non-standard value of Planck constant, not appearing in the same vertices with ordinary gauge bosons, so that they would not contribute to the decay widths of weak bosons. In this minimal sense they would be dark, and this is what is required in order to understand what we know about dark matter.

    Of course, all particles can in principle appear in states with a non-standard value of Planck constant, so that the tau-pion would be just one special instance of dark matter. For instance, in living matter the role of dark variants of electrons and possibly also of other stable particles would be decisive. To put it bluntly: in the mainstream approach dark matter is identified as some exotic particle with ad hoc properties, whereas in the TGD framework dark matter is the outcome of a generalization of quantum theory itself.

  2. The DAMA experiment requires that the tau-pions behave like dark matter: otherwise they would never reach the strongly shielded detector. The interaction with the nuclei of the detector would be preceded by a transformation to a particle (a tau-pion or something else) with the ordinary value of Planck constant.

TGD based explanation for the dark matter puzzle

The criteria used in experiments to eliminate events which definitely are not dark matter events (according to the prevailing wisdom, of course) dictate to a high degree which interactions of tau-pions with the solid matter detector are used as a signature of a dark matter event. It could well be that the criteria used in XENON100 do not allow the scatterings of tau-pions with nuclei. This is indeed the case. The clue comes from the comments of Jester in Resonaances, from which one learns that CoGeNT (and also DAMA, utilizing the same detection strategy) "does not cut on ionization fraction". Therefore, if dark matter mimics electron recoils (as Jester says), or if the dark matter produced in the collisions of cosmic rays with the nuclei of the atmosphere decays to charged particles, one can understand the discrepancy.

[By the way, also Lubos has commented on the puzzling situation.]

The TGD based model explaining the more than two years old CDF anomaly indeed explains also the discrepancy between XENON100 and CDMS on one hand and DAMA and CoGeNT on the other hand. Recall that the anomaly was soon swept under the rug (presumably because it is so badly in conflict with the standard model and its extensions), and I was finally expelled from Helsinki University as a consequence of my troublesome blog activities related to the anomaly. The TGD based model for the CDF anomaly can be found here. See also this blog posting and earlier blog postings and also the old What's News at my homepage.

  1. To explain the observations of CDF one had to assume that tau-pions, and therefore also the color excited tau-leptons inside them, appear as several p-adically scaled up variants, so that one would have several octaves of the ground state of the tau-pion with masses in good approximation equal to 3.6 GeV (two times the tau-lepton mass), 7.2 GeV, and 14.4 GeV. The 14.4 GeV tau-pion was assumed to decay in a cascade-like manner via lepto-strong interactions to lighter tau-pions, both charged and neutral, which eventually decayed to ordinary charged leptons and neutrinos.
  2. Also other decay modes, say the decay of neutral tau-pions to a gamma pair or to a pair of ordinary leptons, are possible, but the corresponding rates are much lower than those for the cascade-like decay via multi-tau-pion states proceeding via lepto-strong interactions.
  3. Just this cascade would take place also now, after the collision of the incoming cosmic ray with a nucleus of the atmosphere. The neutral tau-pions (perhaps a coherent state of them) would be generated in the collision of a charged cosmic ray with a nucleus, which creates strong non-orthogonal electric and magnetic fields; the production amplitude would be essentially the Fourier transform of the "instanton density" E⋅B. The decays of 14 GeV neutral tau-pions would produce 7 GeV charged tau-pions, which would scatter from the protons of nuclei and generate the events excluded by XENON100 but not by DAMA and CoGeNT.
  4. In principle the model predicts the rate of the events quantitatively to a high degree. The scattering rates are proportional to an unknown parameter characterizing the transformation probability of the tau-pion to a particle with the ordinary value of Planck constant, and this allows one to perform some parameter tuning. This parameter would correspond to a mass insertion in the tau-pion line changing the value of Planck constant and would have dimensions of mass squared.
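The octave structure assumed in point 1 is simple enough to tabulate directly. A sketch, taking the tau-lepton mass as 1.777 GeV, also shows why the quoted masses are only approximately 3.6, 7.2 and 14.4 GeV:

```python
# Octaves of the tau-pion ground state, assuming (as in the text)
# m(ground) ~ 2*m_tau and p-adic scaling by successive powers of 2.
m_tau = 1.777  # GeV, tau-lepton mass
masses = [2 * m_tau * 2**n for n in range(3)]
print([round(m, 2) for m in masses])
```

The exact values come out as roughly 3.55, 7.11 and 14.22 GeV, i.e. slightly below the rounded figures used in the text.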

The overall conclusion is that the discrepancy between DAMA and XENON100 might be interpreted as favoring the TGD view about dark matter, and it is fascinating to see how the situation develops. This confusion is not the only one in present-day particle physics: all believed-to-be almost-certainties are being challenged. To be honest (I am unable to be anything else): I have the feeling that the TGD vision is what survives after we wait for a decade or two;-).

For the TGD based model for the CDF anomaly in terms of colored excitations of the tau lepton, and for the earlier anomalies in terms of colored electrons and muons, as well as a slightly more detailed version of this text, see the chapter The recent status of leptohadron hypothesis of "p-Adic Length Scale Hypothesis and Dark Matter Hierarchy".


At 12:40 PM, Blogger ThePeSla said...


I am not so sure we can say that a deep idea of dark matter is the issue here. But I get a grasp of just how the TGD framework can shed light on some of these issues. That is, in reading some of your core ideas as foundational, it may be but a small step to a wider view of things if you yourself can apply or interpret them.

Of course I am lost as to how we may compute the energies as you and Kea do, but in my last post I again came to her 256/81 value, though I believe at a different level of the structures of such spaces.

We touch base on so many levels. But one thing I thought of as a major intuitive mistake on my part was my assertion that there were only five quarks, and they announced the discovery of the top soon after. Yet in reading your post I begin to think that in some sense it was not a mistake after all, just the descent into enshrining the limitations of current theories, which of course would put your ideas out of their mainstream.

Nevertheless, after a couple of decades there appears something very strange about this top quark. It does seem somewhat unobservable in the Higgs-like sense, as if it lived in a wider physics, and it could indeed be this idea of yours, even if it is not the grounding of our ideas on dark matter, especially if we insist on these being in a sense a reductionist dust.

I have some general disagreements with some of your stuff but on philosophic grounds, again in how we interpret the hard issue of the finite and the infinite in equations.

I note here you have suggested ideas as to what is to be done in the development of the mathematics and theory. But otherwise you have stuck to the intuition and from my view your notions are much less speculative than the current state of our particle and cosmic physics.

The PeSla

At 4:18 AM, Anonymous Ervin Goldfain said...


It is my belief that understanding the structure of non-baryonic dark matter will require giving up traditional methods and interpretations based on QFT in its current form. Working with Feynman diagrams to compute cross-sections and lifetimes is not going to help much in dark matter searches.

In my opinion, what is essentially missing here is that the "hidden sector" lying beyond SM is no longer in dynamic equilibrium. One needs the tools of fractional dynamics and non-equilibrium critical phenomena for proper model building above the electroweak scale. Following this line of inquiry sets the stage for a natural account of the many puzzles and anomalies reported in recent years (CDF, Pamela, Fermi telescope, muon anomaly, top-antitop asymmetry, CP violation in the heavy quark sector and so on):





At 6:02 AM, Anonymous Matti Pitkanen said...

Dear Ervin,

it seems that we agree about fractality, about some kind of non-equilibrium thermodynamics, and about the insufficiency of the ordinary description based on Feynman diagrams.

In my own framework fractional dynamics would emerge from two sources having actually a common origin. The p-adic fractality, which might relate to period doubling as you proposed in your earlier comment, is the first mechanism implying fractality.

The hierarchy of Planck constants means geometrically a hierarchy of covering spaces due to the many-valuedness of the time derivatives of imbedding space coordinates as functions of canonical momentum densities. Space-time sheets can locally be seen as surfaces in the covering space of M^4xCP_2: the number of sheets of the covering characterizes the effective value of Planck constant. This would be behind charge fractionization and similar effects; phases would be fractionized. In p-adic fractality one deals with length scales, and the hierarchy of Planck constants indeed means a hierarchy of algebraic extensions of p-adic numbers defined by roots of unity.

The multi-valuedness implying the covering space description is caused by the vacuum degeneracy of Kahler action, which implies also four-dimensional spin glass degeneracy analogous to the gauge degeneracy of the Maxwell field. This would correspond to non-equilibrium thermodynamics, since spin glasses break ergodicity.

Ultrametricity characterizes the spin glass energy landscape, and p-adic topologies are ultrametric. This also suggests that there is a deeper connection between the (effective) hierarchy of Planck constants (in the recent interpretation) and p-adicity.
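The ultrametricity just mentioned is the strong triangle inequality |x+y|_p ≤ max(|x|_p, |y|_p), which the p-adic norm obeys. A minimal sketch, checking it on a few sampled rationals:

```python
from fractions import Fraction

def p_order(n: int, p: int) -> int:
    """Largest k with p**k dividing n (n != 0)."""
    k = 0
    while n % p == 0:
        n //= p
        k += 1
    return k

def p_norm(x: Fraction, p: int) -> Fraction:
    """p-adic norm |x|_p = p**(-ord_p(x)), with |0|_p = 0."""
    if x == 0:
        return Fraction(0)
    ord_p = p_order(x.numerator, p) - p_order(x.denominator, p)
    return Fraction(1, p**ord_p) if ord_p >= 0 else Fraction(p**(-ord_p))

# Strong triangle (ultrametric) inequality: |x+y|_p <= max(|x|_p, |y|_p)
p = 3
pairs = [(Fraction(9), Fraction(6)), (Fraction(1, 3), Fraction(2, 9)), (Fraction(5), Fraction(4))]
for a, b in pairs:
    assert p_norm(a + b, p) <= max(p_norm(a, p), p_norm(b, p))
print("strong triangle inequality holds for the sampled pairs")
```

Note the last pair: |5|_3 = |4|_3 = 1 but |9|_3 = 1/9, strictly smaller than the maximum, which is impossible for the ordinary absolute value.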

At 6:39 AM, Anonymous Ervin Goldfain said...

Dear Matti,

Thanks for your comments.

Among other attractive features, what is really appealing about using fractal topology beyond SM is its ability to provide natural solutions to the following puzzles:
a) Emergence of exotic phases of matter with unexpected properties (dark matter candidates),
b) Charge quantization (without monopoles and other non-physical objects),
c) Symmetry breaking scenarios and anomalous behavior in SM,
d) Flavor multiplicity in SM (particle masses and couplings via Feigenbaum scaling)
e) Resolution of both gauge and cosmological constant hierarchy problems,
f) Unexpected duality between fractional dynamics in Minkowski space-time and classical gravity of General Relativity.



At 7:06 AM, Anonymous Matti Pitkanen said...

Dear Ervin,

it is fascinating to see that physics is in full swing again.

Just half an hour ago I realized that if the reported 4 GeV mass difference between top and antitop is real, it becomes much easier to accept that the two states observed by D0, with mass around 325 GeV and a .2 GeV mass difference, decaying to muon pairs, represent the long-lived kaon and short-lived antikaon of a scaled up hadron physics. If so, the pions and kaons of a new hadron physics would have been observed!

Concerning monopoles I disagree: I regard Dirac monopoles as unphysical, but the magnetic monopoles in the TGD framework are homological, reflecting the non-trivial second homology of CP_2. All elementary particles can be said to consist of magnetically charged wormhole throats. This hypothesis is testable, and LHC will certainly say something about it.

The duality between fractional dynamics in M^4 and the classical gravity of GRT: could you explain?

At 8:12 AM, Blogger ThePeSla said...


I think he means exactly what you are saying CP_2 but from M^4 view.

It is simply a matter of choice whether to treat the charges as fractional or to assume them to be unit charges and relate them to the space and its mirrors.

While you understand the sequences in these spaces one has to see where these resonances default across all spaces and dimensions that seems to limit the possibilities of structures.

The generation problem is much more difficult to explain than even your stance on TGD and its multisheets and analogs of Planck's constant, which too, if set to unity, may not be seen as a fundamental constant unless it is set so.

The PeSla

At 8:44 AM, Blogger ThePeSla said...


Between us (and I suppose everyone else), I may not correctly interpret your term "non-trivial second homology of CP_2".

But from your description of monopoles I quite imagine it more as a dihedral group thing. But would the wormholes themselves be physical, and would they be observable? In some systems monopoles are not needed, although I could agree with the mouths or throats idea with only implied connections. As to how this may relate to dark matter ideas, our notions are too vague and need to be sorted out.

If we scale up physical particles we can generalize further to scaled up and thus in our familiar space limited, atom like structures.

The PeSla

At 9:22 AM, Anonymous Ervin Goldfain said...

"The duality of fractional dynamics in M^4 and classical gravity of GRT: could you explain?"

Dynamics on fractal spaces (spaces endowed with a non-integer metric dimension) is governed by fractional derivatives and integrals. Use of these operators in field theory leads to a tantalizing connection between fractional dynamics in Minkowski space-time and GR; see below:

Section 5 in http://vixra.org/abs/1005.0112 (a sequel to a paper that I published in CSF 28, (2006), 913-922)

Section 13 in http://vixra.org/abs/1011.0061 (will be published later on this year)



At 10:17 AM, Anonymous Jason said...

This comment has been removed by a blog administrator.

At 11:08 AM, Blogger ThePeSla said...

Jason, that was a very good read toward the New Physics.

I especially liked the essential distinction to what is within and without objects in the direction of their dynamics.

I am not sure what to make of the idea of gluon gluon merging.

Fractals, as if discrete strings, are only part of the picture; the world is holographic too. I am not sure if on this basis you can derive classical gravity explained in itself, but it is a fresh idea. Nor why we should not consider some things as not having continuous values.

Clearly the physics up to muon and chirality is a good call.

This is of course relevant to my fractal or quasic space since 1968, without the complex terminology or the relation to fractals.

www.pesla.blogspot.com covers the philosophy under such issues and a peek into the first of these higher abstract structures.

Integration in such spaces is not simply done as addition.

The PeSla (Thanks for the dialog Matti)

At 11:22 AM, Anonymous Jason said...

Hi The PeSla, I think the bulk of your comments must be meant for Ervin, not me. But I will have a look at www.pesla.blogspot.com. -Jason

At 8:43 PM, Anonymous Matti Pitkanen said...

A question to Ervin. It seems that you introduce fractality using fractional derivatives, which are however not purely local but involve a smoothing out, and you get an effective metric in Minkowski space.

Do you get fractional general coordinate invariance? I would guess that you need fractional tensors transforming under general coordinate transformations so that fractional derivatives multiply the tensor quantities. And what about the fractional volume element?

At 5:05 AM, Anonymous Ervin Goldfain said...


Fractional derivatives and integrals are non-local operators. This is the main reason why they cannot be applied to the low energy regime of QFT and Relativity by naive extrapolation ("brute force").

For example, one has to recall that fractal manifolds are not differentiable in the usual sense but are fractionally differentiable. As a result, the concept of speed in vacuum becomes ill-defined on these manifolds since the usual derivative of space with respect to time fails to exist. The "brute force" analogy between fractional dynamics and Relativity is simply impossible.

Fractional operators are also non-unitary which is at odds with both QFT and Relativity. But there are ways to recast these operators in a unitary form and then to take advantage of manifest self-similarity/scale invariance of fractals to bring them to the usual form of unitary and local operators. Among these methods there are the so-called "embedding theorems" and "q-deformed Lie algebras". It is only in the sense of "asymptotic limit" that one can meaningfully talk about merging fractional dynamics with low-energy theories such as QFT and Relativity.
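The nonlocality discussed above is visible already in the Grünwald-Letnikov discretization of the fractional derivative, whose value at x involves the whole history of f rather than a local limit. The sketch below uses that standard definition (one of several in the literature, not necessarily the one used in the cited papers) and checks it against the known result D^{1/2} x = 2·sqrt(x/π):

```python
import math

def gl_weights(alpha: float, n: int):
    """Grunwald-Letnikov weights w_k = (-1)**k * C(alpha, k), via the standard recursion."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return w

def gl_fracderiv(f, x: float, alpha: float, h: float = 1e-3):
    """Left GL fractional derivative on [0, x]: a nonlocal sum over the whole history of f."""
    n = int(x / h)
    w = gl_weights(alpha, n)
    return sum(w[k] * f(x - k * h) for k in range(n + 1)) / h**alpha

# Check against the exact Riemann-Liouville result D^{1/2} x = 2*sqrt(x/pi)
x, alpha = 1.0, 0.5
approx = gl_fracderiv(lambda t: t, x, alpha)
exact = 2 * math.sqrt(x / math.pi)
print(approx, exact)
```

Every past sample f(x − kh) enters the sum with nonzero weight, which is exactly the nonlocality that blocks the "brute force" extrapolation described above.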

I am no mathematician but, if you are interested in more details, a quick Google search reveals names such as Tarasov, Zaslavsky, Laskin, Korabel, Podlubny and many others who wrote extensively about these topics.

It is instructive to note that fractional operators are closely related to non-extensive statistical physics, non-equilibrium critical behavior, and dimensional regularization in Renormalization Group theory. They have found a wide array of applications in science and engineering.



At 11:50 AM, Blogger ThePeSla said...


Was not the idea of quantum theory in effect that of fractional dimensions? In a sense then fractals are an outgrowth from that.

If we cannot find analogies between Relativity and fractional dynamics, the fault is in our methods of differentiation, as powerful as they are, and not in the more comprehensive theories, of which many seem headed in the same direction.

All these words and notions seem to be making a mess of our thoughts.

In what sense can something vanish, say the descending value of TGD sheets and the Planck values? If these vanish in all these very complicated but seemingly simple visions, so must, without deeper mathematics, the very concepts of things like gravity and mass. What is left?

Where does space and time go at such seemingly limitations of low dimensions? Why the certainty in our statements of impossibility? What stance of vision reassures us?
Here we finally come to a point where something like speed in a vacuum, or better some sort of Higgs-like relation to acceleration as not felt, is something we begin to more deeply define.

As, of course, what we mean by coordinate invariance. It is no wonder the coordinates of a dodecahedron took so long to find when, on the face of it, such simple solids in three-space seem so simple. Do your visions outpace the details of what is and of the dynamics, you who obviously are asking things from knowledge of very deep complexity?

The PeSla

At 7:44 PM, Anonymous Matti Pitkanen said...

Thank you, Ervin. I wrote a long response but it was lost due to an error. I try again with a shorter response!

It would be nice to have a fractal generalization of differential calculus including all the basic rules, but this is probably impossible. A fractal Lie algebra is probably a similar idea with no place in Platonia. I think it depends on what one means by fractal. It could mean fractional derivatives, or it could mean fractal structure in the sense that zooming reveals endlessly new details and an apparent invariance.

Quantum groups associated with a root of unity and a p-adic prime appear in a deformation of probability distributions, obtained by replacing their rational parameters with quantum rationals, which I proposed to explain the findings of Shnoll, which seem to be rather universal. The quantum integer must be defined as the product of the quantum primes into which it decomposes, in order to avoid infinities.

A function algebra based on series of quantum-integer powers of the argument x is one example of an endless variety of fractal function algebras: one has smoothness everywhere except at the origin. Quantum integers would now be analogs of critical exponents.
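The distinction between the standard q-integer and the "product of quantum primes" definition advocated above can be made concrete. The sketch below assumes the symmetric quantum integer [n]_q = (q^n - q^-n)/(q - q^-1) as the building block (an assumption for illustration) and shows that the standard q-integer is not multiplicative, while the product-over-primes form is multiplicative by construction:

```python
import cmath
from functools import reduce

def qint(n: int, q: complex) -> complex:
    """Standard symmetric quantum integer [n]_q = (q**n - q**-n)/(q - q**-1)."""
    return (q**n - q**-n) / (q - q**-1)

def prime_factors(n: int):
    """Prime factorization with multiplicity, by trial division."""
    d, out = 2, []
    while d * d <= n:
        while n % d == 0:
            out.append(d)
            n //= d
        d += 1
    if n > 1:
        out.append(n)
    return out

def quantum_integer(n: int, q: complex) -> complex:
    """The text's definition (as I read it): the product of the quantum primes of n."""
    return reduce(lambda a, b: a * b, (qint(p, q) for p in prime_factors(n)), 1 + 0j)

q = cmath.exp(2j * cmath.pi / 7)  # a root of unity, as in the text
# The standard q-integer is not multiplicative: [6]_q != [2]_q * [3]_q ...
print(abs(qint(6, q) - qint(2, q) * qint(3, q)))
# ... whereas the product-of-quantum-primes definition is, by construction:
print(abs(quantum_integer(6, q) - qint(2, q) * qint(3, q)))
```

At a root of unity [n]_q also vanishes when n is a multiple of the order (here [7]_q = 0), which is presumably the infinity/zero problem the redefinition is meant to avoid.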

p-Adically analytic functions are smooth and allow general coordinate invariance and Lie algebras and groups. Real field equations of geometric origin, such as those for the preferred extremals of Kahler action, make sense also p-adically. The common rationals and algebraics provide one manner to relate real and p-adic physics: the simplest idea is that at the space-time level the common rational and algebraic points of p-adic and real surfaces, in preferred coordinates dictated by symmetries, allow one to relate the two physics. At the level of the world of classical worlds, the surfaces whose representation allows interpretation in both the real and the p-adic sense can be regarded as being in the intersection of the real and p-adic worlds. Preferred coordinates again, and rational functions.

One can also map p-adic functions to real ones by a variant of what I call canonical identification: it is a continuous map and produces real fractals.
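A minimal sketch of canonical identification, restricted to non-negative integers (interpreting the definition x = Σ x_k p^k ↦ Σ x_k p^-k; the extension to general p-adics is not attempted here):

```python
def canonical_identification(n: int, p: int) -> float:
    """Map sum(x_k * p**k) -> sum(x_k * p**-k), digit by digit.
    Illustrated on non-negative integers, whose pinary expansions are finite."""
    x, k = 0.0, 0
    while n > 0:
        n, digit = divmod(n, p)
        x += digit * p**(-k)
        k += 1
    return x

# Continuity in the p-adic sense: 2-adically, 7 = 111_2 and 7 + 2**5
# are close (distance 2**-5), and their images are close as well.
p = 2
print(canonical_identification(7, p))         # 1 + 1/2 + 1/4
print(canonical_identification(7 + 2**5, p))  # differs by 2**-5
```

The high powers of p, which are p-adically small, are mapped to small real contributions, which is why the map is continuous from the p-adic side.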

At 8:27 PM, Anonymous Matti Pitkanen said...

Dear Pesla,

various polygons emerge as natural representations of abstract mathematical structures. For instance, in the representations of Lie groups various polygons with vertices representing quantum numbers appear. The dimension of the object is the dimension of the Cartan algebra and thus the number of quantum numbers. Also the structures of category theory have similar representations.

Polygons appear as graphical representations in the theory of Jones inclusions of hyper-finite factors of type II_1, which in the TGD framework have an interpretation in terms of a finite measurement resolution defined by the included algebra (the action of the included algebra creates states which are not distinguishable from each other).

The tetrahedral group is one of the three finite discrete subgroups of the rotation group whose action is genuinely three-dimensional (tetrahedral, octahedral, icosahedral). In the theory of Jones inclusions they correspond to the exceptional Lie groups E6, E7, E8.

My first and unsuccessful adventure in theoretical physics made me skeptical about the idea that polygons as geometric objects of physical space could have a fundamental meaning. My Odysseia began with the observation that hadrons could be grouped into irreducible representations of the tetrahedral group and its covering group with respect to isospin (I do not remember for sure whether it was only the tetrahedral group or whether also the octahedral and icosahedral groups were involved). This is not surprising since isospin multiplets are in question. I think that the group algebra of the tetrahedral group seemed to decompose into representations identifiable in terms of isospin multiplets.

If I remember correctly, the representations of the rotation group remain irreducible under the tetrahedral group (at least under it) up to J=2 (Delta resonance). Whether this was true for all three groups I do not remember. If it holds true for all of them, one can ask whether it could have something to do with the fact that J=2 is the highest spin possible for elementary particles. One could imagine that the discretization of a system by braid strands replaces the rotation group with its discrete subgroup, so that only those irreducible representations which remain irreducible under the reduction to representations of the finite subgroup represent elementary particles in the ordinary sense.
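The recollection above can be tested with characters: the inner product ⟨χ_J, χ_J⟩ over a finite subgroup equals the sum of squared multiplicities in the reduction, so the restriction of spin J stays irreducible exactly when it equals 1. A sketch for the proper tetrahedral and icosahedral rotation groups (the binary covers needed for half-integer spin are omitted):

```python
import math

def chi_so3(J: int, theta: float) -> float:
    """Character of the spin-J irrep of SO(3) at rotation angle theta."""
    if theta == 0.0:
        return 2 * J + 1
    return math.sin((J + 0.5) * theta) / math.sin(theta / 2)

# (group order, [(conjugacy class size, rotation angle), ...])
groups = {
    "tetrahedral": (12, [(1, 0.0), (8, 2 * math.pi / 3), (3, math.pi)]),
    "icosahedral": (60, [(1, 0.0), (12, 2 * math.pi / 5), (12, 4 * math.pi / 5),
                         (20, 2 * math.pi / 3), (15, math.pi)]),
}

def norm_sq(J: int, name: str) -> int:
    """<chi_J, chi_J> over the finite group = sum of squared multiplicities;
    it equals 1 exactly when the restriction of spin J stays irreducible."""
    order, classes = groups[name]
    return round(sum(size * chi_so3(J, th) ** 2 for size, th in classes) / order)

for name in groups:
    print(name, [norm_sq(J, name) for J in range(4)])
```

On this count the tetrahedral restriction stays irreducible only up to J=1 (spin 2 splits as E plus T), while the icosahedral restriction stays irreducible up to J=2, so the remembered J=2 bound may refer to the icosahedral case.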

In TGD one also has higher spin states for partonic 2-surfaces carrying arbitrarily high fermion numbers and spin, but their propagators behave as 1/p^n, where n is the total number of fermions and antifermions composing the state. The values n=1 or 2 allowed for ordinary particles do not allow spins larger than J=2.

At 11:28 PM, Blogger Ulla said...

I once asked about the asymptotic degrees of freedom and the GUT-scale unification in TGD. Gravity is not at that scale, but magnetism could act as a unifying force, in the sense that it 'feels' the other forces. With higher energies also magnetism (spin?) changes, maybe polarization too, inducing phase shifts.

I have understood that you don't believe in the GUT scale as a unifying point. How would a possible SUSY come in at that scale? And assuming asymmetry?

Virtual bosons must be a new force? If it is strong it cannot be the unifying one giving mass, because it is short-ranged? Gravity as very weak can be unifying?

How are the forces changing at the high energies used at LHC? I have not seen this discussed anywhere.

Bee also has an article about evolving dimensions, which could fairly well be analogous to the hierarchy of Planck constants. The dark hierarchy is not sensed? Could this explain the flatness of the Universe?

Sorry for my low-qualified platter, but I would appreciate an answer.

At 12:17 PM, Anonymous Jason said...

This comment has been removed by a blog administrator.

At 2:44 PM, Anonymous Jason said...

This comment has been removed by a blog administrator.

At 9:13 PM, Anonymous Matti Pitkanen said...

I removed Jason's comments as malevolent and because they contain nothing relevant for the discussion.

At 9:38 PM, Anonymous Matti Pitkanen said...

Dear Ulla,

I answer to those questions that I understand.

To my view, GUTs materialize the fundamental erroneous assumption of theoretical particle physics: this applies both to GUTs and to string models and M-theory, and it explains why no progress has taken place for more than three decades. This erroneous assumption is that both quarks and leptons belong to representations of the same gauge group. One implication is proton decay, which has not been observed, but this message has not been taken in since it is too unpleasant.

In the TGD framework quark and lepton numbers are separately conserved, and color is an orbital angular momentum like quantum number, meaning that also leptons can be in colored states. For the predicted leptohadron physics (three of them) there is already now a lot of evidence, put under the rug since these findings do not fit in with any of the alternative unifications. There is no other way to understand them than by accepting the TGD views about color and dark matter. This medicine however tastes very bad.

The CP2 length scale happens to correspond in good approximation to the GUT unification length scale. This scale is the fundamental length scale of GUTs, and the typical prediction was that there is an enormous particle desert between the TeV scale and the GUT scale: no new physics would be observed by future generations of particle physicists!!! This gives a good idea about the famous arrogance of particle theorists, exceeded only by the arrogance of M-theorists, who quite seriously claim that we should accept as a final theory a theory which predicts nothing! Should one laugh or cry? I have done both;-).

Concerning scales, the situation in TGD is more complex than in standard unifications. p-Adic length scales define an entire hierarchy of fundamental length scales. Standard unifiers live in a non-fractal Universe with just one fundamental length scale; as a believer in TGD I live in a fractal Universe. Most applications of TGD to elementary particle physics rely heavily on the p-adic length scale hypothesis, which gives the all-important quantitative grasp of the situation.

At 12:15 AM, Blogger Ulla said...

Thanks a lot. Now I think the pieces fall better into place.

I try to compare the phase space and Einstein's spacetime. Einstein indeed has the many-sheeted spacetime there, but he could not build the general picture without assuming a common spacetime. Instead of keeping time split as he measured it, he assumed the other dimensions were unified through the invariant light, and wanted to unify time as well. But he also says there is no straight line in the Universe. Smolin assumes an infinite circle.

Maybe if Einstein had found out the hierarchy it would have been better?

I try to treat TGD with care but there may be misunderstandings. I hope you tell me then. I must do it with my words and my limited understanding.

At 10:08 AM, Blogger ThePeSla said...


Yes, Riemann and thus Einstein had their multiple density "sheets", described by excluding zero and negative values in the phase space. But Pitkanen's and my different visions of such sheets are qualitatively and quantitatively levels beyond these ideas of the last century.

There is a difference between what we know we do not know and the things we feel we know, no matter how complicated, such as the hubris of M and string theory. In this sense "a little bit of knowledge is a dangerous thing", at least in the risk of experimenting or applying ideas unaware of the consequences, and so are some of the people holding it.

What sort of pseudoscience is in his claims, and why is the Egyptian who came into the comments here believed in some circles?

On the other hand, although I doubt that those committed to theories, rather than to working hypotheses that may change if we are intellectually and scientifically honest, will understand: Einstein, in the equations on his deathbed, did seem to get a glimpse of these higher orders of symmetry. One could call it super, I suppose, but I agree with Matti that this is not the same sort of thing as in the standard theory.

Perhaps it is a measure of a scientist whether he teaches and is sociable, especially toward students, as much as the quality of his research; but we are only human, and sometimes there is not enough time, and a lifetime may not be enough.

The PeSla

