- A canonical model for classical computation is the Turing machine, which takes bit sequences as input and transforms them to outputs, changing its internal state at each step. A more concrete model is a network of gates representing basic operations on the incoming bits: from these basic functions one constructs all recursive functions. The computer and program actualize the algorithm represented as a computer program, and the program eventually halts - at least one can hope that it does so. Assuming that each elementary operation requires some minimum time, one can estimate the number of steps required and obtain an estimate for the computation time as a function of the size of the computation.
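The gate-network picture can be made concrete with a minimal sketch in Python: NAND is a universal classical gate, so the familiar Boolean operations can all be composed from it. The function names here are illustrative, not from any particular library.

```python
# Sketch of the gate-network model of classical computation:
# every Boolean function can be built by wiring together NAND gates.

def nand(a: int, b: int) -> int:
    """The universal NAND gate on two bits."""
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))

# Check the composed gates against Python's bit operators on all inputs.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
        assert xor(a, b) == (a ^ b)
```

Counting the NAND gates along the longest path through such a network, and assigning each a minimum switching time, gives exactly the kind of step-count estimate described above.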
- If a computation, whose size is characterized by the number N of relevant bits, can be carried out in time proportional to some power of N and thus polynomial in N, one says that the problem is in class P. The class NP (nondeterministic polynomial time) consists of problems whose proposed solutions can be verified in polynomial time; it is widely suspected - though not proven - that solving some of them requires time growing faster than any power of N, say exponentially. Donald Knuth, whose name is familiar to everyone using TeX or LaTeX to produce mathematical text, believes in P=NP in the framework of classical computation. Lubos in turn thinks that the Turing model is probably too primitive, that a model based on quantum physics is needed, and that this might allow P=NP.
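A toy numerical illustration of why the distinction matters: compare a polynomial step count, say N^3, with an exponential one, 2^N, as the input size N grows. The specific exponents are arbitrary choices for the illustration.

```python
# Polynomial versus exponential growth of step counts with input size N.
# N^3 stays manageable while 2^N quickly becomes astronomical.

for n in (10, 20, 40, 80):
    poly = n ** 3
    expo = 2 ** n
    print(f"N={n:3d}  N^3={poly:>8d}  2^N={expo}")
```

Already at N=80 the exponential count exceeds 10^24 steps, which is why exponential-time algorithms are considered intractable regardless of hardware speed.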
- Quantum computation is often described as a superposition of classical computations, which might encourage one to think that it is much more effective, but this does not seem to be the case in general. Note however that the amount of information represented by N qubits is exponentially larger than that represented by N classical bits, since entanglement is possible. The prevailing wisdom seems to be that in some situations quantum computation can be faster than the classical one, but that if P=NP holds true for classical computation, it holds true also for quantum computation. Presumably this is because the model of quantum computation begins from the classical model and only (quantum computer scientists must experience this statement as an insult - apologies!) replaces bits with qubits.
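The exponential gap in representational capacity can be sketched directly: the state of N qubits is a vector of 2^N complex amplitudes, and an entangled state such as a GHZ state cannot be factored into N single-qubit states. The code below is only an illustration of the counting, using NumPy.

```python
import numpy as np

# N classical bits carry N values; an N-qubit state is specified by
# 2^N complex amplitudes, exponentially more numbers.
N = 4
product_state = np.zeros(2 ** N, dtype=complex)
product_state[0] = 1.0        # |0000>, an unentangled product state

# GHZ-like entangled state (|0000> + |1111>)/sqrt(2): it cannot be
# written as a tensor product of four single-qubit states.
ghz = np.zeros(2 ** N, dtype=complex)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)

print(len(ghz))               # 16 amplitudes for 4 qubits
```

Reading out this exponential amount of information is another matter: a measurement of N qubits still yields only N classical bits, which is one reason the naive "exponential speedup for free" intuition fails.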
- In a quantum computer one replaces bits with entangled qubits and gates with quantum gates, and computation corresponds to a unitary time evolution with respect to a discretized time parameter, constructed from simple fundamental building bricks. So-called tensor networks realize the idea of local unitarity in a nice manner and have been proposed as a way to define error correcting quantum codes. State function reduction halts the computation. The outcome is non-deterministic, but one can perform a large number of runs and deduce the result of the computation from the distribution of outcomes.
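The circuit-model picture above - unitary gates acting on a state vector, followed by a measurement that samples from the Born-rule distribution - can be sketched for a single qubit. This is a generic textbook example (a Hadamard gate on |0>), not specific to any claim in the text.

```python
import numpy as np

# Unitary evolution: apply a Hadamard gate to the state |0>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate (unitary)
psi = np.array([1.0, 0.0])                      # initial state |0>
psi = H @ psi                                   # one step of unitary evolution

# "State function reduction": measurement samples outcome k with
# probability |psi_k|^2 (the Born rule).
probs = np.abs(psi) ** 2                        # [0.5, 0.5]
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=probs)

# Each run is random, but repeating the computation many times and
# looking at the outcome distribution recovers the probabilities.
print(probs, samples.mean())
```

This repetition-and-statistics step is exactly the "large number of runs" mentioned above: a single halt of the computation gives one random outcome, and only the ensemble reveals the answer.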
- In the TGD framework Zero Energy Ontology (ZEO) replaces the ordinary positive energy ontology and forces a generalization of the theory of quantum measurement. This brings in several new elements. In particular, state function reductions can occur at both boundaries of the causal diamond (CD), which is the intersection of future and past directed light-cones and defines a geometric correlate for self. Selves form a fractal hierarchy - CDs within CDs, and perhaps also overlapping CDs. Negentropy Maximization Principle (NMP) is the basic variational principle of consciousness and states that state function reductions generate the maximum amount of conscious information. The notion of negentropic entanglement (NE), involving p-adic physics as the physics of cognition, and the hierarchy of Planck constants assigned with dark matter are also central elements.
- NMP allows a sequence of state function reductions to occur at a given boundary of the diamond-like CD - call it the passive boundary. The state function reduction sequence leaving everything unchanged at the passive boundary of the CD defines self as a generalized Zeno effect. Each step shifts the opposite - active - boundary of the CD "upwards" and increases its distance from the passive boundary. Also the states at the active boundary change, and one has the counterpart of unitary time evolution. The shifting of the active boundary gives rise to the experienced flow of time and to sensory input generating cognitive mental images - the "Maya" aspect of conscious experience. The passive boundary corresponds to the permanent unchanging "Self".
- Eventually NMP forces the first reduction to the opposite boundary to occur. The self dies and reincarnates as a time-reversed self. The opposite boundary of the CD would now be shifting "downwards", increasing the size of the CD further. At the next reduction to the opposite boundary a re-incarnation of the self in the geometric future of the original self would occur. This would be re-incarnation in the sense of Eastern philosophies. One could even wonder whose incarnation in the geometric past I might represent!
Might it be possible in ZEO to perform quantally computations requiring classically non-polynomial time much faster - even in polynomial time? If this were the case, one might at least try to understand how Ramanujan did it, although higher level selves might also be involved (did his Goddess do the job?).
- Quantal computation would correspond to a state function reduction sequence at a fixed boundary of the CD defining a mathematical mental image as a sub-self. In the first reduction to the opposite boundary of the CD the sub-self representing the mathematical mental image would die and the quantum computation would halt. A new computation at the opposite boundary, proceeding in the opposite direction of geometric time, would begin and define a time-reversed mathematical mental image. This sequence of reincarnations of the sub-self as its time reversal could give rise to a sequence of quantum-computation-like processes taking less time than usual, since one half of the computations would take place at the opposite boundary in the opposite time direction (the size of the CD increases as the boundary shifts).
- If the average computation time is the same at both boundaries, the computation time would only be halved. Not very impressive. However, if the mental images at the second boundary - call it A - are short-lived while the selves at the opposite boundary B are very long-lived and represent very long computations, the process could be very fast from the point of view of A! Could one overcome the P≠NP constraint by performing computations during time-reversed re-incarnations?! Short-lived mental images at this boundary and very long-lived mental images at the opposite boundary - could this be the secret of Ramanujan?
- Was the Goddess of Ramanujan - a self at a higher level of the self-hierarchy - nothing but a time reversal of some mathematical mental image of Ramanujan (Brahman=Atman!), representing very long quantal computations? We have the night-day cycle of personal consciousness, and it could correspond to a sequence of re-incarnations at some level of our personal self-hierarchy. Ramanujan told that he met his Goddess in dreams. Was his Goddess the time reversal of that part of Ramanujan which was unconscious while Ramanujan slept? Intriguingly, Ramanujan was rather short-lived himself - he died at the age of 32! In fact, many geniuses have been rather short-lived.
- Why was the alter ego of Ramanujan a Goddess? Jung intuited that our psyche has two aspects: anima and animus. Do they quite universally correspond to self and its time reversal? Do our mental images have gender?! Could our self-hierarchy be a hierarchical collection of animas and animi, so that gender would be something much deeper than biological sex? And what about the Yin-Yang duality of Chinese philosophy, and ka, the shadow of the persona, in the mythology of ancient Egypt?
For a summary of earlier postings see Latest progress in TGD.