https://matpitka.blogspot.com/2015/05/breakthroughs-in-number-theoretic.html

Thursday, May 07, 2015

Breakthroughs in the number theoretic vision about TGD

Number theoretic universality states that besides reals and complex numbers also p-adic number fields are involved (they would provide the physical correlates of cognition). Furthermore, scattering amplitudes should be well-defined in all number fields and be obtained by a kind of algebraic continuation. I have introduced the notion of the intersection of realities and p-adicities, which corresponds to some algebraic extension of rationals inducing an extension of p-adic numbers for any prime p. Adelic physics is a strong candidate for the realization of the fusion of real and p-adic physics and would mean the replacement of real numbers with adeles. Field equations would hold true for all number fields and the space-time surfaces would relate very closely to each other: one could say that p-adic space-time surfaces are cognitive representations of the real ones.

I have also had a stronger vision, which is now dead. This sad event, however, led to the discovery of several important results.

  1. The idea has been that p-adic space-time sheets would be not only "thought bubbles" representing real ones but also correlates for intentions, and the transformation of intention to action would correspond to a quantum jump in which a p-adic space-time sheet is transformed to a real one. Alternatively, there would be a kind of leakage between p-adic and real sectors. A cognitive act would be the reversal of this process. It did not require much critical thought to realize that taking this idea seriously leads to horrible mathematical challenges. The leakage makes sense only in the intersection, which is number theoretically universal, so that there is no point in talking about leakage. The safest assumption is that the scattering amplitudes are defined separately for each sector of the adelic space-time. This means an enormous relief, since there exists mathematics for defining adelic space-time.

  2. This realization allows one to clarify thoughts about what the intersection must be. The intersection corresponds by strong form of holography to string world sheets and partonic 2-surfaces at which spinor modes are localized for several reasons: the most important ones are that em charge must be well-defined for the modes and that octonionic and real spinor structures can be equivalent at them, making possible twistorialization both at the level of imbedding space and its tangent space.

    The parameters characterizing the objects of WCW are discretized - that is, they belong to an appropriate algebraic extension of rationals - so that the surfaces are continuous and make sense both in the real number field and in p-adic number fields. By conformal invariance they might be just conformal moduli: Teichmueller parameters and positions of punctures for partonic 2-surfaces, and corners and angles at them for string world sheets. These can be continued to the real and p-adic sectors.

  3. Fermions are correlates for Boolean cognition, and the anti-commutation relations for them are number theoretically universal - even their quantum variants, when the algebraic extension allows the quantum phase. Fermions and Boolean cognition would reside in the number theoretically universal intersection. Of course they must do so, since Boolean thought, and cognition in general, is behind all mathematics!

  4. I proposed this already in p-adic mass calculations two decades ago. This would be a wonderful simplification of the theory: by conformal invariance WCW would reduce to a finite-dimensional moduli space as far as calculations of scattering amplitudes are concerned. The testing of the theory requires classical theory and 4-D space-time. This holography would not mean that one gives up space-time: it is necessary. Only cognitive and, as it seems, also fundamental sensory representations are 2-dimensional. All that one can mathematically say about reality uses data at these 2-surfaces. The rest is needed, but it requires mathematical thinking and transcendence! This view is totally different from the sloppy and primitive philosophical idea that space-time could somehow emerge from discrete space-time.

This has also led to a modification of the ideas about the relation of real and p-adic physics.
  1. The notion of p-adic manifolds was hoped to provide a possible realization of the correspondence between real and p-adic numbers at the space-time level. It relies on the notion of canonical identification, which maps p-adic numbers to reals in a continuous manner and realizes finite measurement resolution at the space-time level. The p-adic length scale hypothesis emerges from the application of p-adic thermodynamics to the calculation of particle masses but generalizes to all scales.

  2. The problem with p-adic manifolds is that the canonical identification map is not a general coordinate invariant notion. The hope was that one could overcome the problem by finding preferred coordinates for the imbedding space. Linear Minkowski coordinates or Robertson-Walker coordinates could be the choice for M4. For CP2, coordinates transforming linearly under U(2) suggest themselves. The non-uniqueness however persists, but one could argue that there is no problem if the breaking of symmetries is below measurement resolution. The discretization is however also non-unique, and this makes the approach look ugly to me, although the idea of a p-adic manifold as a cognitive chart still looks nice.

  3. The solution of these problems came with the discovery of an entirely different approach. First of all, I realized that discretization can be carried out at the level of WCW, which is more abstract: the parameters characterizing the objects of WCW are discretized - that is, assumed to belong to an appropriate algebraic extension of rationals - so that the surfaces are continuous and make sense both in the real number field and in p-adic number fields.

    Secondly, one can use the strong form of holography stating that string world sheets and partonic 2-surfaces define the "genes of space-time". The only thing needed is to continue these 2-surfaces by algebraic continuation to 4-surfaces defining preferred extremals of Kähler action - real or p-adic. The space-time surfaces have vanishing Noether charges for a sub-algebra of the super-symplectic algebra with conformal weights coming as n-multiples of those for the full algebra - a hierarchy of quantum criticalities, Planck constants, and dark matters!

    One does not try to map real space-time surfaces to p-adic ones to get cognitive charts, but continues the 2-surfaces defining the space-time genes to both real and p-adic sectors to get adelic space-time! The problem with general coordinate invariance at the space-time level disappears totally, since one can assume that these 2-surfaces have rational parameters. One has discretization in WCW, rather than at the space-time level. As a matter of fact, this discretization selects the punctures of partonic surfaces (corners of string world sheets) to be algebraic points in some coordinatization, but in a general coordinate invariant manner.

  4. The vision about the evolutionary hierarchy as a hierarchy of algebraic extensions of rationals inducing those of p-adic number fields becomes clear. The algebraic extension associated with the 2-surfaces in the intersection is in question, and these extensions become more and more complex in the course of evolution. Of course, NMP, negentropic entanglement (NE), and the hierarchy of Planck constants are involved in an essential manner too. Also the measurement resolution, characterized by the number of space-time sheets connecting an average partonic 2-surface to the others, is a measure of "social" evolution.
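The canonical identification map mentioned in point 1 of the list above is concrete enough to sketch in code. This is only an illustration with function names of my own choosing: the map I(Σ a_k p^k) = Σ a_k p^-k sends p-adic integers to reals, and it is continuous in the sense that integers which are p-adically close (sharing many low-order digits) land on nearby reals.

```python
def padic_digits(n: int, p: int, k: int) -> list:
    """First k digits a_0, a_1, ... of the p-adic expansion of a non-negative integer."""
    digits = []
    for _ in range(k):
        digits.append(n % p)
        n //= p
    return digits

def canonical_identification(n: int, p: int, k: int = 32) -> float:
    """I(sum a_i p^i) = sum a_i p^(-i): maps p-adic integers to reals continuously."""
    return sum(a * p ** -i for i, a in enumerate(padic_digits(n, p, k)))

# 6 = 0*1 + 1*2 + 1*4 in base 2, so I(6) = 0 + 1/2 + 1/4 = 0.75
print(canonical_identification(6, 2))
# Continuity: 6 and 6 + 2^10 agree in the ten lowest binary digits,
# so their real images differ by exactly 2^-10:
print(abs(canonical_identification(6, 2) - canonical_identification(6 + 2 ** 10, 2)))
```

The truncation parameter k plays the role of the finite measurement resolution mentioned above: only finitely many digits of the p-adic expansion are mapped to the real side.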

There are two questions, which I have tried to answer during these two decades.
  1. What makes some p-adic primes preferred, so that one can say that they characterize elementary particles and presumably any system?

  2. What is behind the p-adic length scale hypothesis emerging from p-adic mass calculations and stating that primes near but slightly below powers of two are favored physically, Mersenne primes in particular? There is support for a generalization of this hypothesis: also primes near powers of 3 might be favored as length and time scales, which suggests that powers of prime quite generally are favored.
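The length scale hypothesis of the second question is easy to make concrete with a short sketch (function names are mine): scanning the window just below each power of 2 picks out the physically favored primes, with Mersenne primes 2^k - 1 as the extreme case.

```python
def is_prime(n: int) -> bool:
    """Deterministic trial division, adequate for the small range used here."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def primes_near_power(base: int, exponent: int, window: int = 10) -> list:
    """Primes in the window just below base**exponent."""
    top = base ** exponent
    return [n for n in range(top - window, top) if is_prime(n)]

for k in range(4, 12):
    near = primes_near_power(2, k)
    tag = " (Mersenne)" if is_prime(2 ** k - 1) else ""
    print(f"2^{k} = {2 ** k:5d}: primes just below -> {near}{tag}")
```

The same function with base 3 illustrates the generalized hypothesis, e.g. primes_near_power(3, 5) searches just below 3^5 = 243.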

The adelic view led to answers to these questions. The answer to the first question had been staring me in the face for more than a decade.
  1. Algebraic extensions of rationals allow so-called ramified primes. Rational primes decompose into products of primes of the extension, but it can happen that some prime of the extension appears as a higher than first power. In this case one talks about ramification. The product of the ramified rational primes defines an integer characterizing the ramification; the extension allows a similar characteristic integer. Ramified primes are an extremely natural candidate for the preferred primes of an extension (I know that I should talk about prime ideals, sorry for the sloppy language): that preferred primes could follow from number theory itself I had not thought of earlier, and I had tried to deduce them from physics. One can assign the characterizing integers to the string world sheets to characterize their evolutionary level. Note that the earlier heuristic idea that a space-time surface represents a decomposition of an integer is indeed realized in terms of holography!

  2. Also infinite primes seem to finally find their place in the big picture. Infinite primes are constructed as an infinite hierarchy of second quantizations of an arithmetic quantum field theory. The infinite primes of the previous level label the single-fermion and single-boson states of the new level, but also bound states appear. Bound states can be mapped to irreducible polynomials of n variables at the n:th level of the hierarchy, obeying some restrictions. It seems that they are polynomials of a new variable with coefficients which are infinite integers of the previous level.

    At the first level, bound-state infinite primes correspond to irreducible polynomials: these define irreducible extensions of rationals, and as a special case one obtains those satisfying the so-called Eisenstein criterion: in this case the ramified primes can be read directly from the form of the polynomial. Therefore the hierarchy of infinite primes seems to define algebraic extensions of rationals, those of polynomials of one variable, etc. What this means from the point of view of physics is a fascinating question. Maybe physicists must eventually start to iterate second quantization to describe systems in many-sheeted space-time! The marvellous thing would be the reduction of the construction of bound states - the really problematic part of quantum field theories - to number theory!
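A minimal sketch of these two number theoretic ingredients, assuming only standard algebraic number theory (function names mine): for a quadratic extension Q(sqrt(d)) the ramified primes are exactly those dividing the field discriminant, and when a polynomial satisfies the Eisenstein criterion at p, it is irreducible and that p is totally ramified in the extension it defines.

```python
def prime_factors(n: int) -> list:
    """Distinct prime factors of |n|."""
    n = abs(n)
    out = set()
    d = 2
    while d * d <= n:
        while n % d == 0:
            out.add(d)
            n //= d
        d += 1
    if n > 1:
        out.add(n)
    return sorted(out)

def ramified_primes_quadratic(d: int) -> list:
    """Ramified primes of Q(sqrt(d)) for squarefree d: exactly the primes
    dividing the field discriminant (d if d = 1 mod 4, else 4d)."""
    disc = d if d % 4 == 1 else 4 * d
    return prime_factors(disc)

def is_eisenstein(coeffs: list, p: int) -> bool:
    """Eisenstein criterion at p for coeffs [a_n, ..., a_1, a_0], leading first:
    p does not divide a_n, p divides every other coefficient, p^2 does not
    divide a_0. The ramified prime p is read directly from the coefficients."""
    return (coeffs[0] % p != 0
            and all(c % p == 0 for c in coeffs[1:])
            and coeffs[-1] % (p * p) != 0)

print(ramified_primes_quadratic(5))    # Q(sqrt(5)): only 5 ramifies
print(ramified_primes_quadratic(-1))   # Q(i): only 2 ramifies
print(is_eisenstein([1, 0, -2], 2))    # x^2 - 2 is Eisenstein at 2
```

For higher-degree extensions the ramified primes divide the discriminant of the minimal polynomial, so the same idea carries over, although the code above covers only the quadratic case.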

The answer to the second question requires what I call the weak form of NMP.
  1. The strong form of NMP states that the negentropy gain in a quantum jump is maximal: the density matrix decomposes into a sum of terms proportional to projection operators, and one chooses the sub-space for which the number theoretic negentropy is maximal. The projection operator containing the largest power of prime is selected. The problem is that this does not allow free will in the sense in which we tend to use it: the freedom to make wrong choices!

  2. Weak NMP allows one to choose any projection operator and any sub-space of the sub-space defined by the projection operator - even a 1-dimensional one, in which case the standard state function reduction occurs and the system is isolated from the environment as a price for the sin! The weak form of NMP is not at all as weak as one might think. Suppose that the maximal projection operator has dimension n_max, which is a product of a large number of different but rather small primes. The negentropy gain is small. If it is possible to choose n = n_max - k, which is a power of prime, the negentropy gain is much larger!

    It is largest for powers of prime defining n-ary p-adic length scales. Even more, large primes correspond to a more refined p-adic topology: p=1 (one could call it a prime) defines the discrete topology, p=2 defines the roughest p-adic topology, and the limit p→∞ is identified by many mathematicians in terms of reals. Hence large primes p < n_max are favored. In particular, primes near but below powers of prime are favored: this is nothing but a generalization of the p-adic length scale hypothesis from p=2 to any prime p.
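The negentropy comparison above can be made quantitative under one common formulation of number theoretic negentropy: for uniform entanglement probabilities 1/n, the prime p with p^k exactly dividing n contributes k·log(p), and the negentropy is the maximum of these contributions over primes. Taking that formula as an assumption (function names are mine), a toy comparison of n_max = 2·3·5·7 with a nearby prime and a nearby power of 2:

```python
from math import log

def prime_power_factorization(n: int) -> dict:
    """Map p -> k with p^k exactly dividing n."""
    fs = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            fs[d] = fs.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        fs[n] = fs.get(n, 0) + 1
    return fs

def negentropy(n: int) -> float:
    """Assumed number theoretic negentropy for uniform probabilities 1/n:
    the maximal p-adic contribution k*log(p) over prime powers p^k || n."""
    return max((k * log(p) for p, k in prime_power_factorization(n).items()),
               default=0.0)

n_max = 2 * 3 * 5 * 7          # 210: a product of many small primes
print(negentropy(n_max))        # only log(7): small negentropy gain
print(negentropy(n_max - 11))   # 199 is prime: log(199), much larger gain
print(negentropy(128))          # 2^7: 7*log(2), an n-ary 2-adic length scale
```

This reproduces the argument in the text: dropping from n_max to a nearby prime or prime power below it raises the negentropy gain considerably.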

See the article What Could Be the Origin of Preferred p-Adic Primes and p-Adic Length Scale Hypothesis?.

For a summary of earlier postings see Links to the latest progress in TGD.

40 comments:

Anonymous said...

Hmm. So what about the phenomenal obviousness of ALL mathematical thinking and believing (in various number theories, axioms and axiomatics etc.) happening as "cognitive acts"?

Matpitka@luukku.com said...


Looks sensible. The Universe evolves to become more and more algebraically complex and becomes conscious of mathematical truths. There must be however something which cannot be reduced by the strong holography to 2-D basic objects (string world sheets and partonic 2-surfaces in the intersection of realities and p-adicities). We are aware of space-time, we are aware of the idea of continuum, we are aware of completions of rationals and their extensions. This is something that goes beyond formulas.

Anonymous said...

What about these? http://www.encyclopediaofmath.org/index.php?title=Wilson_polynomials

They have an interpretation as Racah coefficients for tensor products of irreducible representations of the group SU(2).

Anonymous said...

http://plato.stanford.edu/entries/fictionalism-mathematics/

Anonymous said...

Number theoretical universality of Boolean logic is highly questionable not only from fictionalist view but even inside the Platonist credo. In 'Sophist' Plato shows the codependent dialectics of fundamental categories ("sameness", "difference", "movement", "stillness", etc.), not unlike what Buddha and Buddhist logic says about codependent origins. So also in the Platonist approach, according to Plato himself, also geometry and number theory, not to mention classical bivalent Aristotelean/boolean logic, are not "outside" or "independent" of codependent origins.

All claims of all kinds of "universalities" can also be questioned from base of cultural anthropology and the study of human world views. In this approach, math based on bivalent boolean _functions_ is not cultural universal, but just one narrative that is not universally accepted even among mathematicians and logicians of Western culture.

Especially if they accept Gödel's proof that number theory of PM and those similar to it are incomplete, and cannot be consistently structured according to boolean or any other bivalent truth OR false logic.

Bohr's complementary philosophy is certainly not boolean, and allows both "true" and "false" as complementary "contradictions" and even as codependent views.

According to Bohr's interpretation, this non-universal Gödel-incomplete number theory (that schools and universities teach and indoctrinate in) is just part of the "classical world" that as a whole "measures" superposition. And as such, also number theory is ultimately a matter of choice...

Anonymous said...

"As the Buddha may or may not have said (or both, or neither): ‘There are only two mistakes one can make along the road to truth: not going all the way, and not starting.’"

The whole article is worth a read:
http://aeon.co/magazine/philosophy/logic-of-buddhist-philosophy/

Matpitka@luukku.com said...

To anonymous about number theoretic universality etc… I cannot but disagree. Godel's proof tells that complete axiom systems are not possible: the pool of truth is infinitely deep. Something very easy to accept. Godel does say that Boolean logic is somehow wrong.

I see this as a mathematician, not an anthropologist.
For me number theoretical universality means just that it is the superstructure behind all mathematics. This is a simple fact. What is interesting is that it has a concrete and highly non-trivial physical meaning when fermions are identified as physical correlates for Boolean thinking. Anticommutation relations for fermions are number theoretically universal and make sense in any number field. Bosonic commutation relations are not, since they involve the imaginary unit.

One can of course extend Boolean logic to a quantum Boolean version, and the fermionic representation does this.

Matpitka@luukku.com said...


Sorry for typo: Godel does *NOT* say that Boolean logic is somehow wrong.

Sorry for other typos too: these texting programs have their own vision about what I want to say. In any case: the essence of what I am saying is that fermionic anticommutation relations make sense in any number field - real or p-adic.

This is one facet of number theoretical universality and has a nice interpretation: one can construct endless variants of Boolean logic and call them logics, but these constructions always rely on Boolean logic. You simply cannot think mathematically without Boolean logic. Fermions serving as correlates of Boolean logic define also a square root of geometry. This is a deep connection between logic and geometry: they really are the two basic pillars of existence.

Buddhist philosophy is nice but my goal is not to transform buddhist writings to mathematical statements. I regard myself as a scientist and for scientist authorities of any kind are poison. Kill the Buddha if you meet him on the road;-).

Anonymous said...

https://statistics.stanford.edu/sites/default/files/2000-05.pdf

see theorem 4.1, there's the 4 outcomes thing

0, -0.5, +0.5, 1

Ulla said...

Gödel's problem is an outcome of the 3-body problem if I have understood it right. This is also an effect of the 'collapse'?

2D quantum systems are always wider, bigger with more uncertainty, like some 'fuzzy logic'. Maybe the density differences this brings along can be something to work on? At least now this is how I want to see the hierarchy of Planck constants. Note how that density shifts when the Ouroboros bites its own tail :P Also the uncertainty shifts.

This is the essence of topology?

Matpitka@luukku.com said...


To Ulla:

Godel's theorem belongs to meta-mathematics. It is difficult to imagine any physics application for it.

Effective 2-dimensionality means holography. It might be seen as an information theoretic notion: 4-D space-time is a redundant description if only scattering amplitudes are considered - 2-D basic objects are enough for this purpose. Classical correlates for quantum physics require 4-D space-time and one cannot throw it away.

Anonymous said...

In Boolean terms the proposition

"Boolean thought and cognition in general is behind all mathematics!"

is not just false, it is false!

Matpitka@luukku.com said...


Before making this kind of statement we must define what "behind" means in the above statement. "Behind" of course does not mean that all mathematics reduces to Boolean algebra. This would be absolutely idiotic.

The natural meaning of "behind" is that *all* mathematics relies on deductions, which obey the rules of Boolean logic. p-Adic logic is not different from real one. The logic of differential geometry is not different from the logic of probability calculus.

If you can give an example of a mathematical theory where deductions are carried out by using some other, more romantic logic, please do so. I would be happy to get a concrete illustration of how my statement is false: I would prefer using Boolean logic. Just a concrete mathematical statement instead of something about buddhism, egos, or anthropology. After that I am ready to consider seriously what you are claiming.



Anonymous said...

On the contrary, I believe the beauty of pure mathematics is very much in strict adherence to rigorous deductive logic based on clear definitions.

The freedom of choice in rigorously beautiful math is not in ad hoc axioms (to postulate set theory, real numbers etc. inherently illogical "completions of infinite processes"). The freedom of choice is at the foundational level, how we create and organize and deductively prove our number theory from a blank "zero state".

And you see that there are many areas of modern math, such as calculus etc., that throw away rigorous deductive logic and justify themselves by black magic alone, which also "works".

If you really prefer using rigorous deductive logic, you take side with Berkeley and Beauty and renounce the heretics Newton and Leibniz and their followers in the shadowy art of self-deception, without trying to escape into any kind of authoritarian argumentation (cf. "authoritarians rule academia", "black magic works", etc.).

Only after we agree to do also this bivalent deductive logic properly instead of by deception and black magic axiomatics, we can proceed to thinking about foundational level and "middle-way-math that would avoid all the following extremes:
a, b, both a and b, neither a or b.

As at least in the areas of math that involves measuring, boolean valuing does not work as things-events appear to be inherently approximate and fuzzy, and number theory based on middle-path-logic already at foundational level might be worth a thought.

Matpitka@luukku.com said...



If you believe on rigorous deductive logic, then the
the first thing to do is to start to apply it. You can begin by justifying your claims by arguments. Saying "plainly wrong" without any justification is only an emotional rhetoric burst.

When you say inherently illogical "completions of infinite processes" you must tell what "inherently illogical" is. I find it very difficult to see what this phrase means when you talk about completions.

Mathematics without axioms is like Munchausen lifting himself into air. You simply cannot prove anything without axioms. They are a book keeping device for the what we believe to be truths. From a blank zero state you cannot deduce anything.

I like argumentation with contents. When some-one starts to talk about "self-deception", "authoritarian argumentation, "black magic"… I smell the presence of bad rhetoric.

Our observations about things are fuzzy, not things. I would be happy to see a number theory based on middle valued logic but I am afraid that this is only a nice sounding verbal construct like
mathematics without axioms.

Quantum logic based on qubits is beautiful but logical statements makes only after the state function reduction yielding well-defined bits has happened. Our conscious experience makes the world classical.

Anonymous said...

Then please, do your best to discuss the content. Even if the only content is your confusion. :)

"Infinite process" by definition means that the process (e.g. algorithm) continues ad infinitum, does not get finitely completed. I hope this clarifies what is illogical about "completions of infinite processes" in terms of binary logic. Based on this we can conclude that boolean bivalent either-or valuing applies strictly only to finite phenomena, not to infinite processes and approximations e.g. by some processes of limitation. Hence, any area of mathematics that deals with infinite processes in any way is not 'boolean' in the strict sense. Boolean thought and cognition can apply only to finite, bivalent processes that involve a boolean identity.


"Axioms" are today used with variety of meanings. What was criticized was "ad hoc" axioms used to postulate what a mathematical physicist _wants_ to please himself with when rigorous deductive logic otherwise fails to produce the object of desire, not the foundational level axioms e.g. in Euclidean sense. Mathematics is a field of applied magic, and such use of ad hoc axioms is black magic.

"Our observations about things are fuzzy, not things." This is purely a statement of personal metaphysical belief, faith in "things" having "inherent ontology". That statement has nothing to do with logic, math and science.

Notions of 'length', 'area', 'volume', 'position', 'momentum' etc. are strictly speaking neither 'observations' nor 'things', but in this context just abstract mathematical notions, which in the mathematics we are used to do not behave in boolean way.

Matpitka@luukku.com said...


It is a pity that your comments are getting increasingly emotional and rhetorical. Mathematics probably looks like black magic to anyone who does not understand it. There is a lot of mathematics which looks like black magic to me, but by working hard I can get rid of this impression. I do not want to blame mathematics for my own limitations.

I comment those parts of your comment, which have some content.

*As mathematical structures, Boolean algebras extend without difficulty to the continuous case: consider the set theoretic realisation. p-Adic numbers are a completely well-defined notion and a 2-adic number can be seen as an infinite sequence of binary digits.

What is important is that the binary digits are ordered: the higher the digit, the lower its significance. This is the basic idea of metric topology and makes it possible to work with the continuum. Mathematics without the continuum reduces to mere combinatorics.

*Completions of rationals to various number fields are a standard mathematical concept and extremely successful: if someone wants to believe that this notion is mathematically flawed, he can do so but certainly remains a lonely believer.

In any case, mathematicians discovered centuries ago that the notions of nearness, limit, continuity, Cauchy sequence and smoothness are extremely useful and make it possible to deduce the outcome of infinite processes. Very useful notions also for a philosopher of mathematics, and highly recommended;-)

*Conscious logical thought - at least that part of it which is representable physically - is discrete. Discreteness is one of the basic aspects of cognition - I formulate this in terms of a cognitive resolution implying in turn a finite measurement resolution.

*We should be careful not to project the limitations of our cognition onto the physical and mathematical realities. Materialists do this: they try to identify consciousness and physical reality and end up in a dead end.

Anonymous said...

The only real argument presented above is "useful", which is a purely emotional and rhetorical argument. "Useful" is exactly what is meant by "black magic", in contrast to the magic and beauty of rigorous mathematical deduction. Argument from authority is to refer to what others think and do, instead of establishing a well defined theory of real numbers based on foundational axioms and a boolean chain of deductions and proofs. If you want to play the game of boolean mathematics, play it honestly and don't cheat at every corner.

In the boolean context, it seemingly takes a complex self-deception to lose sight of the simple fact that there is indeed significant arithmetic symmetry break. As the basic carry rules of basic arithmetics state, Cauchy numbers in form ...nnn,p + ...nnn,p make arithmetic sense in the boolean context (ie, they can be added), but irrational Cauchy numbers n,nnn... + n,nnn... do not add up but remain vague and non-boolean.

Unless, of course, you can prove that irrational Cauchy sequences do add up in a finite lifetime calculation in a discrete non-vague manner and can be given a boolean value. Go ahead, give it a try.

Matpitka@luukku.com said...


I think it is time to stop the discussion since you seem to be in rebellion against mathematics and theoretical physics: windmills would be a less dangerous enemy. People who argue that entire branches of science are totally wrong are usually called crackpots: I do not like that word because it is so much misused.

I have met many people who have declared war against some branch of well-established science: one was a logician who had the fixed idea that special relativity contains logical errors: he had ended up with this conclusion by interpreting special relativity in a Newtonian framework. I tried to explain, but in vain.

You seem to misunderstand the idea of a Cauchy sequence in a manner which remains black magic to me.

*Cauchy sequences provide a formulation of continuity: I fail to see how you manage to assign a Boolean interpretation to them.

*You talk also about Cauchy numbers: by looking at Wikipedia you see that the Cauchy number is a dimensionless parameter used in hydrodynamics. I honestly admit that I fail to see the connection to the notion of continuity.

*Also the addition of Cauchy sequences is to my best knowledge completely irrelevant for the notion of limit.






Matpitka@luukku.com said...

Some comments about axiomatics. This is a technical tool for mathematicians. The unavoidable bureaucracy, one might say.

A theoretical physicist who is really constructing a theory is working hard to find a minimal number of basic assumptions which might be true - trying to find a set of assumptions which forms a coherent whole, is internally consistent, and predicts as much as possible.

This is a process of trial and error, and there is no point in declaring wars against mathematics or any other branch of science. This activity could not be farther from mechanical deduction from a fixed set of axioms, which requires just algorithms and in principle can be carried out by a computer.

Theoretical physics has indeed led to powerful insights about mathematics: consider only Witten's work. Recently Nima Arkani-Hamed and colleagues (mathematicians) have done a similar job. This works by acting as a visionary: mathematicians take care of the details when the dust has settled, and this can take centuries.

A theoretician of course hopes that some day this structure can be axiomatized so that even an average professor can use the rules to calculate.

Anonymous said...

No, it's not about politics, the question is very simple. Are "real numbers" numbers in the boolean sense or combinatorial noise?

Present real numbers in the form of hindu-arabic Cauchy sequences in base 2. Pick a pair of such real numbers and add them up. Do you get a discrete result that starts with either one or zero?

AFAIK, no, and if not otherwise proven, hence "real numbers" cannot be said to be numbers in boolean sense.

And as real numbers cannot be said to be definable in the boolean sense, that goes also for the complex plane, complex manifolds etc.

I don't know what the hell those things are, but they are certainly not "boolean thought and cognition", presumably meaning numbers that can be expressed as either 1 or 0.

You can twist and dance around and play your politics and war games as much as you want - and it is sorry to see you try so hard not to admit what is so obvious -, but it's just math. This is just math, and if we choose to play boolean game, we play it by boolean rules, otherwise we would cheat.

Also in math you need to learn to walk before you can run. When you try to run before you have learned to take baby steps, you make a glorious short dash and then end up with your face in mud. And IMHO that summarizes the state of contemporary academic mathematics.


Anonymous said...

PS: IFF mathematics and theoretical physics claim to rule and conquer and control either openly or by implication, of course I rebel, as any honest self-loving man would. :)

Thusly experiencing does not reduce to nor is limited by mathematics and theoretical physics, not even TGD. 2D representation of music is not same as picking a guitar in your lap and playing music that never was and never will be.

Anonymous said...

anonymous, you are talking out of your ass, stop wasting Matti's time

Anonymous said...

Now that we have hopefully left the dogmatic trenches of warfare and politics and are taking our baby steps, the structure of the "real line" (-field) is as such an interesting object of study. In binary, the sum of two points on the real line (almost all of which are non-algebraic) is at least in most cases not:

0
1
both 0 and 1
neither 0 nor 1

but avoids all these extreme positions. ;)

So the notion of qubit seems now inherently related with adding elements of non-Boolean "completion" of rational line.


Also, it would seem that the bigger the number base, the smaller the room for vagueness. What would be the situation with the biggest known Mersenne prime as base? Is there some kind of structural relation to the "modern interpretation" of consistent histories and the questions it claims to allow and exclude?

Anonymous said...

see http://www.encyclopediaofmath.org/index.php/Algebra_of_sets for instance; it's also called a σ-field or σ-algebra, and they can be unconditional, or conditional upon all sorts of other spaces

go back and read the link I posted

Theorem 4.1

and quit babbling your wordy nonsense, anonymous

random matrix theory alone cannot do it, the primes must enter in some way, and this spectral signature is universal for any and all unitary processes, apparently, if "U" know how to look, amirite M8? :)

Anonymous said...

Stephen, I have checked the link and now rechecked, and found this gem, under Proposition 5.1:

"Remark. From the equality, the infinite sum of squares converges to an _almost surely_ finite limit."

I humbly suggest that the linguistic expression "almost surely" is a pretty sure tell that the math in question has moved from the confines of Boolean values or "boolean cognition" to somewhere else. ;)

Or maybe you can show that randomly picked real numbers in base 2, let's say 0.000... and 0.000..., without assuming that they are rationals, really do sum up in Boolean terms, i.e. that the sum is either a number beginning with 0 or 1, but not both or neither or something even more weird like a "qubit"?

If you can't, we can't honestly say that "boolean cognition" is behind claimed mathematical structures such as the "real number line", or that the proof theory used to postulate such structures is a Boolean proof theory. And the same goes for set theory.

Anonymous said...

https://statistics.stanford.edu/sites/default/files/2001-01.pdf, "Unitary correlations and the Fejér kernel", is also very interesting.

your statement of "randomly picked real numbers in base 2" is ill-posed; are you just trying to reinvent some "floating-point" representation?

I'm saying its something more "weird" like a qubit.

nowhere in the thing u described do I see any sort of room for time, much less deterministic or nondeterministic notions of system states.

Fuzzy, you say? Nonsense. In classical-mechanics chaos, u take a Poincaré section of the flow and each orbit punctures that section at a particular point; repeat this many times (by dynamical-system evaluation of integrals etc. with whatever conditions) and you end up with a process whose "output" can take on a discrete number of values relative to the reference measure. Basically, one can easily prove things such as Cantor and Lévy 'dust' etc. and go into fractal dimension, which takes on any real value, so your argument is really just wildly stabbing in some direction or another, trying to talk big it seems...

this boolean aspect of any set whatsoever requires the concept of the indicator function: I_A(ω) = 1 if ω is in A, 0 if it's not. integral. do you speak it? apparently not.
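The indicator function invoked above is simple to state precisely; a minimal Python sketch (the set A below is an assumed example, not anything from the linked text):

```python
def indicator(A):
    """Return the indicator function I_A: I_A(x) = 1 if x is in A, else 0."""
    return lambda x: 1 if x in A else 0

# Hypothetical example set:
A = {1, 2, 3}
I_A = indicator(A)
print(I_A(2), I_A(7))  # 1 0
```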

you sound like an IT guy... are you an IT guy? ;-)

Anonymous said...

See https://app.box.com/files/0/f/0/1/f_30073473869 for some interesting ways that numbers 7 and 24 just pop out of some rather elementary integrals

Matpitka@luukku.com said...



I do not want to use my time to ponder whether there is some conspiracy of power-greedy mathematicians and physicists against the civilised world. I just want to make clear some elementary things about Cauchy sequences in the hope that they remove the feeling of black magic.

a) Cauchy sequences are used by everyone who can sum, subtract, multiply and divide decimal numbers. These are a special kind of Cauchy sequence in which the n:th term is the number approximated using n decimal digits. One can also use binary or much more general sequences.

These particular sequences are however convenient since all arithmetic operations are performed on rational numbers.

b) In numerics one introduces a decimal/pinary/... cutoff. This makes sense if the functions are continuous and the operations on them respect continuity.

c) If one wants to formulate this axiomatically, one can say that one works in the category of continuous functions. Absolutely no crime against mankind is involved. Everything is finite and the numbers of operations are finite but approximate, with an error that can be estimated. Computers routinely use binary Cauchy sequences with success.

d) One could of course throw away real numbers as a conspiracy against mankind and decide to use only rationals (I do not know whether algebraic numbers are also doomed to be part of the conspiracy). This leads to difficulties.

One must define differential equations etc. as difference equations by specifying the size of the difference: a single equation would be replaced by an infinite number of them, one for each accuracy. Calculus and most of what has been achieved since Newton would be lost, since no-one wants to write Newton's mechanics or Maxwell's theory or quantum field theory using difference equations: it would be incredibly clumsy.

There would be no exponential function, no Gaussian, no pi, no special functions. Things become impossible in practice. Most of number theory is lost: forget Riemann zeta, forget p-adic numbers, ... Analytic calculations absolutely central in all science would become impossible.

Reals represent the transcendent, spirituality, going beyond what we can represent by counting with fingers. Recent science is deeply spiritual, and in a very concrete manner, but the materialistic dogma prevents us from seeing this.
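Point a) above can be made concrete in a few lines: truncating a number to n decimal digits gives a rational Cauchy sequence. A small Python sketch, with sqrt(2) as an assumed example:

```python
from fractions import Fraction
import math

def decimal_truncations(x, n_terms):
    """The rational Cauchy sequence of n-digit decimal truncations of x."""
    return [Fraction(int(x * 10**n), 10**n) for n in range(1, n_terms + 1)]

seq = decimal_truncations(math.sqrt(2), 8)
# Defining Cauchy property: successive terms get arbitrarily close.
for k in range(len(seq) - 1):
    assert abs(seq[k + 1] - seq[k]) < Fraction(1, 10**k)
print(float(seq[-1]))  # 1.41421356
```

Every term is an exact rational, so all arithmetic along the way is finite and exact, just as the comment says.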




Matpitka@luukku.com said...


I hope that the importance of the notion of finite accuracy became clear. It certainly does not look like a beautiful notion in its current formulation.

Finite accuracy is the counterpart of finite measurement resolution/cognitive resolution and is a notion which is often not considered explicitly in math textbooks. It is fundamental in physics, but the problem is how to formulate it elegantly.


It is also encountered in the adelic vision based on strong form of holography. One can in principle deduce scattering amplitudes in an algebraic extension of rationals (this applies to the parameters, such as momenta, appearing in them). One can algebraically continue this expression to all number fields.

But what if one cannot calculate the amplitudes exactly in the algebraic extension? There is no problem in the real topology using ordinary continuity. But continuation to p-adic topologies is difficult, since even the smallest change of a rational number in the real sense can mean a very big change in the p-adic sense. It seems that one cannot avoid canonical identification or some of its variants if one wants to assign a p-adic amplitude to a real amplitude in a continuous manner.
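For illustration, here is a toy sketch of the simplest form of canonical identification, x = Σ x_k p^k → Σ x_k p^(-k), restricted to non-negative integers; this is a minimal reading of the notion, with assumed inputs, not the full TGD variant with pinary cutoffs:

```python
def canonical_identification(n, p):
    """Toy canonical identification: send n = sum of x_k * p**k (its p-adic
    digits x_k) to the real number sum of x_k * p**(-k)."""
    total, k = 0.0, 0
    while n > 0:
        total += (n % p) * float(p) ** (-k)
        n //= p
        k += 1
    return total

# 6 = 0*1 + 1*2 + 1*4 in base 2 maps to 0/1 + 1/2 + 1/4:
print(canonical_identification(6, 2))  # 0.75
```

Integers that are p-adically close (differing only in high powers of p) map to reals that are close in the ordinary sense, which is the continuity property the comment asks for.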

Finite accuracy is also a deep physical concept: fermions at string world sheets are a Boolean cognitive representation of space-time geometry. But in finite accuracy one represents a 4-D object using data assignable to a collection of 2-D objects, rather than to 0-dimensional objects (points) as in the usual naive discretization, which is not consistent with symmetries. A discrete set of points is replaced with a discrete collection of 2-surfaces labelled by parameters in an algebraic extension of rationals. The larger the density of strings, the better the representation. This strong form of holography is implied by the strong form of general coordinate invariance: a completely unexpected connection to Einstein's great principle.

This leads also to an elegant realization of number theoretical universality and a hierarchy of inclusions of hyper-finite factors as a realization of finite measurement resolution. Also evolution as an increase of complexity of the algebraic extension of rationals pops up, labelled by integers n = h_eff/h, which are products of ramified primes characterizing the sub-algebra of the super-symplectic algebra acting as conformal gauge symmetries. The effective Planck constant has a purely number theoretic meaning as a measure of the complexity of the algebraic extension!

Ramification is also a number theoretic correlate of quantum criticality! A rational prime decomposes into a product of prime powers such that some of them are higher than first powers: the analog of a multiple root of a polynomial, that is, criticality! For me this looks amazingly beautiful.

Matpitka@luukku.com said...

Correction to the last paragraph: "prime powers such that" should read "powers of primes of extension such that"
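The multiple-root analogy can be checked concretely: a rational prime ramifies in the extension defined by a polynomial exactly when the polynomial acquires a repeated root mod p. A pure-Python sketch, using x^2 + 1 (the Gaussian case) as an assumed example:

```python
def has_repeated_root_mod_p(coeffs, p):
    """True if the polynomial (coeffs, lowest degree first) has a repeated
    root mod p, i.e. a common root with its derivative -- the polynomial
    analog of ramification/criticality."""
    f = lambda x: sum(c * x**k for k, c in enumerate(coeffs)) % p
    deriv = [k * c for k, c in enumerate(coeffs)][1:]
    df = lambda x: sum(c * x**k for k, c in enumerate(deriv)) % p
    return any(df(x) == 0 for x in range(p) if f(x) == 0)

# For x^2 + 1 (defining Q(i)) only p = 2 ramifies: x^2 + 1 = (x + 1)^2 mod 2.
print([p for p in [2, 3, 5, 7, 11, 13] if has_repeated_root_mod_p([1, 0, 1], p)])  # [2]
```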

Anonymous said...

I don't know what all will be lost if we honestly admit that "real numbers" do not behave arithmetically, at least in the Boolean sense, and though many say that "real numbers satisfy the usual rules of arithmetic", obviously they don't. Any child can see that the emperor has no clothes in that respect.

Even though reals don't, AFAIK the p-adic side does satisfy the usual rules of arithmetic, at least in some areas. Worth a more careful look. Cauchy intervals within intervals is a perfectly OK and very rich and interesting structure, and the repeating patterns of rationals are an amazing and beautiful thing worth deeper study, e.g. how do the lengths of repeating patterns behave in various bases, on both sides of the rationals? When repeating patterns are plotted inside Cauchy intervals, I see wave forms at a very basic level of number theory.
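The question about the lengths of repeating patterns has a classical answer: for q coprime to the base, the period of 1/q is the multiplicative order of the base modulo q. A small sketch:

```python
from math import gcd

def period_length(q, base=10):
    """Length of the repeating block of 1/q in the given base:
    the multiplicative order of base modulo q (q > 1, coprime to base)."""
    assert q > 1 and gcd(q, base) == 1
    k, r = 1, base % q
    while r != 1:
        r = (r * base) % q
        k += 1
    return k

print(period_length(7))     # 6: 1/7 = 0.(142857)
print(period_length(7, 2))  # 3: 1/7 = 0.(001) in binary
```

So the same rational can have very different period lengths in different bases, which is exactly the base-dependence the comment wonders about.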

In the OP Matti does see the light, saying that mathematical structures follow from number theory itself; trying to deduce them from "physics" does not work.

So here is relatively simple question: what is the _minimum_ of number theory you need to observe quantum observables? I'm very much in doubt that e.g. "canonical identification" is needed (but rather, confuses and messes things up).

I'm not an IT guy, but even I know that computers don't do real numbers or any other infinite processes. Floating points, lazy algorithms, etc. get the job done. Finite accuracy works in a Boolean way, but no, we can't say that "finite accuracy" strings are "real numbers".

If we insist that the problem of the mathematical continuum "has been solved with the "least upper bound" completion of algebraic (e.g. roots) and algorithmic (pi, e) real-side infinite processes", there is a cost: the solution is not Boolean, and the rules of arithmetic don't work, regardless of how much some people pretend that they work and push the problems under the mattress and out of textbooks. It's not about politics, it's just math.

There are other options: we can admit that the problem of the mathematical continuum remains unsolved, or poorly understood and formulated, and keep on thinking and questioning, instead of blindly believing the academic authorities who say that real numbers follow the basic rules of arithmetic. Eppur si muove.

Matpitka@luukku.com said...


This is my last comment in this fruitless discussion. I have made pedagogical efforts to clarify the basics, but in vain. I however strongly encourage continuing serious studies of the basics before declaring war on modern mathematics and physics.

I have tried to explain that finite calculational accuracy is the point: it is not possible to calculate a real number exactly in finite time, and no-one has been claiming anything like that. The idea of giving up all the mathematics since Newton just because we cannot calculate with infinite precision is complete idiocy.

And I am still unable to see what is wrong with Cauchy sequences: I tried to concretise them in terms of the decimal representation in order to give an idea of what they are about, but it seems that it did not help.

The generalisation of real numbers, rather than refusing to admit their existence, is the correct direction to proceed, and I have been working on this problem with strong physical motivations. Fusion of reals and p-adics into adelic structures also at the space-time level, a hierarchy of infinite primes defining an infinite hierarchy of second quantisations for an arithmetic quantum field theory, even the construction of arithmetics of Hilbert spaces, and the replacement of the real point with an infinitely structured point realizing number theoretic Brahman = Atman/algebraic holography. These are interesting lines along which to proceed, rather than a return to the cave.

Strange that someone blames me for blindly believing academic authorities;-).

Anonymous said...

As for the relation of the Boolean operator ∨ and its vertical inverse ∧ to human cognition: propositional logic is far from universal; some natural languages behave closer to propositional logic, some not in the slightest.

Levelled horizontal operators of ordinality "<" and ">" (less-more) are much more naturally universal in human cognition; I'm not aware of a natural language without the more-less relation, which is also a naturally hyperfinite process closely related to the whole-part relation. Arrows giving directions are also more-less relations: go more in the direction the arrow is pointing, less in the opposite direction. These operators predate all other written language and, needless to say, propositional logic.

Anonymous said...

Matti, read again. Your latest comment has very little to do with what has been said and meant.

Again:
The authorities (e.g. wiki) keep on saying that real numbers follow the basic rules of arithmetic. Obviously that claim is not true.

The definition of 'real number' refers to an infinite process ("least upper bound"), not to a finite computable segment. Finite segments by definition are NOT "real numbers"; they are something else. Some say "approximations", but an approximation is also NOT a real number. It is an approximation.

If we want to keep math communicable, we must respect definitions and do our best to define as clearly as we can. The notion of "real number", as it is usually used, is horribly vague and poorly defined.

That is of course a big if, and communication is not necessarily priority. The word "sin" has been mentioned in context of incommunicado.

Anonymous said...

anonymous, who is this we you speak of? get some books and stop watching videos. do some analysis. read up on Baire spaces if u are so caught up on notions of continuity

Anonymous said...

for example, ur little toy problem: the irrational numbers, with the metric defined by d(a, b) = 1/(n+1), where n is the first index at which the continued fraction expansions of a and b differ (this is a complete metric space).

the iterated map that gives rise to the continued fraction expansion of a real number... well, it's related to the Riemann zeta function, see the Wikipedia page

Continued fractions also play a role in the study of dynamical systems, where they tie together the Farey fractions which are seen in the Mandelbrot set with Minkowski's question mark function and the modular group Gamma.

The backwards shift operator for continued fractions is the map h(x) = 1/x − ⌊1/x⌋, called the Gauss map, which lops off digits of a continued fraction expansion: h([0; a1, a2, a3, …]) = [0; a2, a3, …]. The transfer operator of this map is called the Gauss–Kuzmin–Wirsing operator. The distribution of the digits in continued fractions is given by the zeroth eigenvector of this operator and is called the Gauss–Kuzmin distribution.
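The digit-lopping action of the Gauss map quoted above can be watched directly; a short sketch iterating h(x) = 1/x − ⌊1/x⌋ on the golden-ratio conjugate, whose continued fraction is [0; 1, 1, 1, …]:

```python
import math

def gauss_map_digits(x, n):
    """First n continued-fraction digits of x in (0,1), obtained by
    iterating the Gauss map h(x) = 1/x - floor(1/x)."""
    digits = []
    for _ in range(n):
        if x == 0:
            break
        a = math.floor(1 / x)  # the digit lopped off at this step
        digits.append(a)
        x = 1 / x - a          # apply the Gauss map
    return digits

print(gauss_map_digits((math.sqrt(5) - 1) / 2, 6))  # [1, 1, 1, 1, 1, 1]
```

Similarly sqrt(2) − 1 yields the digits [2, 2, 2, …], matching [0; 2, 2, 2, …]; floating point limits how many digits come out clean.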

Matpitka@luukku.com said...


To Anonymous: You should calm down and stop talking total nonsense.

You are unable to realize that things can exist although we cannot know perfectly what they are. What we can know is that a real number is in some segment; we can narrow down this segment endlessly but never know the number exactly.

But we have also something else than mere numerical computation: we have conscious intelligence. It cannot be computerised or axiomatised but, most importantly, it is able to discover new truths.

In mathematics, communication also requires learning: just watching some videos and becoming a fan of some character making strong nonsense claims is not enough. Also, mathematical intuition is something very difficult to teach: some of us have it, others do not.

Just as some people are able to compose marvellous music. It seems that we must just accept this. I am not musically especially talented but I enjoy the music of the great composers and experience the miracle again and again: I do not declare war against this music.

Matpitka@luukku.com said...

Some comments about quantum Boolean thinking and computation as I see it to happen at fundamental level.

a) One should understand how Boolean statements A --> B are represented. Or more generally, how a computation-like procedure leading from a collection A of math objects to a collection B of math objects takes place. Recall that in computations the objects coming in and out are bit sequences. Now one has a computation-like process. --> is expected to correspond to the arrow of time.

If fermionic oscillator operators generate the Boolean basis, zero energy ontology is a necessity in order to realize rules as rules connecting statements realized as bit sequences. Positive energy ontology would allow only statements. Collection A is at the lower, passive boundary of the CD and B at the upper, active one. As a matter of fact, it is a quantum superposition of Bs that is there! In the quantum jump selecting a single B at the active boundary, A is replaced with a superposition of A:s: the self dies and re-incarnates and generates negentropy. The quantum computation halts.

That both a and b cannot be known precisely is a quantal limitation on what can be known: a philosopher would talk about epistemology here. The different pairs (a,b) in the superposition over b:s are analogous to different implications of a. The thinker is doomed to always live in quantum cognitive dust and never be quite sure.

Matpitka@luukku.com said...



Continuing….

b) What is the computer-like structure now? A Turing computer is a 1-D time-like line. This quantum computer is a superposition of 4-D space-time surfaces, with the basic computational operations located along it as partonic 2-surfaces defining the algebraic operations, connected by fermion lines representing signals. Very similar to an ordinary computer.


c) One should understand the quantum counterparts of the basic rules of manipulation. ×, /, +, and − are the most familiar examples.

*The basic rules correspond physically to generalized Feynman/twistor diagrams representing sequences of algebraic manipulations in the Yangian of the super-symplectic algebra. Sequences correspond now to collections of partonic 2-surfaces defining vertices of generalized twistor diagrams.

*3-vertices correspond to product and co-product represented as stringy Noether charges. Geometrically the vertex, the analog of an algebraic operation, is a partonic 2-surface at which incoming and outgoing light-like 3-surfaces meet, like the vertex of a Feynman diagram. There is also a co-product vertex not encountered in simple algebraic systems; it is the time-reversed variant of the vertex. Fusion instead of annihilation.

*There is a huge symmetry, as in ordinary computations too. All computation sequences connecting the same collections A and B of objects produce the same scattering amplitude. This generalises the duality symmetry of hadronic string models. This is a really gigantic simplification, and the results in the twistor program suggest that something similar is obtained there. This implication was so gigantic that I gave up the idea for years.

d) One should understand the analogs for the mathematical axioms. What are the fundamental rules of manipulation?

*The classical computation/deduction would obey deterministic rules at vertices. The quantal formulation cannot be deterministic for the simple reason that one has quantum non-determinism (the weak form of NMP allowing also good and evil). The quantum rules obey the format that God used when communicating with Adam and Eve: do anything else, but do not break the conservation laws. Classical rules would list all the allowed possibilities, and this leads to difficulties, as Gödel demonstrated. I think that chess players follow the anti-axiomatics.

I have the feeling that anti-axiomatics could give a more powerful approach to computation and deduction and allow a new manner of approaching the problematics of axiomatisation. Note however that the infinite hierarchy of mostly infinite integers could make possible a generalisation of Gödel numbering for statements/computations.

e) The laws of physics take care that the anti-axioms are obeyed. Quite concretely:

*The preferred extremal property of the Kähler action and Kähler-Dirac action, plus conservation laws for the charges associated with super-symplectic and other generalised conformal symmetries, would define the rules not broken in vertices.

*At the fermion lines connecting the vertices the propagator would be determined by the boundary part of the Kähler-Dirac action. The K-D equation for spinors and consistency conditions from the Kähler action (strong form of holography) would dictate what happens to the fermionic oscillator operators defining the analog of a quantum Boolean algebra as a super-symplectic algebra.
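The classical Gödel numbering that the proposed infinite-prime generalisation in point d) would extend is easy to sketch; the symbol sequence below is an assumed example, and the infinite-integer version is of course beyond a few lines:

```python
def goedel_number(symbols):
    """Classic Goedel numbering: encode a finite sequence of positive
    integers as the product of successive primes raised to those integers."""
    primes = [2, 3, 5, 7, 11, 13]
    assert len(symbols) <= len(primes)
    n = 1
    for p, s in zip(primes, symbols):
        n *= p**s  # the i:th prime carries the i:th symbol as its exponent
    return n

print(goedel_number([1, 2, 1]))  # 2 * 3**2 * 5 = 90
```

Unique factorization makes the encoding invertible, which is what lets statements about numbers talk about statements.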