Many problems of quantum computation in the standard sense might relate to a wrong view about quantum theory. If the TGD Universe is the physical universe, the situation would improve in many respects. There is the new fractal view about quantum jump and the observer as "self"; there is the p-adic length scale hierarchy and the hierarchy of Planck constants as well as the self hierarchy; there is a new view about entanglement and the possibility of irreducible entanglement carrying genuine information and making possible quantum superposition of fractal quantum computations and quantum parallel dissipation; there is zero energy ontology, the notion of M-matrix allowing one to understand quantum theory as a square root of thermodynamics, and the notion of measurement resolution allowing one to identify the M-matrix in terms of the Connes tensor product; there is also the notion of magnetic body providing one promising realization for braids in tqc, etc. Taking the risk of boring the reader by repeating things that I have already said, I will summarize these new aspects of TGD below.

There is also a second motivation. Quantum TGD and the TGD inspired theory of consciousness involve quite a bundle of new ideas, and the continual checking of internal consistency by writing it all through again and again is of utmost importance. The following considerations can also be seen as this kind of checking. I can only offer apologies to the benevolent reader: this is a work in progress.

**A. Fractal hierarchies**

Fractal hierarchies are the essence of TGD. There is a hierarchy of space-time sheets labelled by preferred p-adic primes. There is a hierarchy of Planck constants reflecting a book-like structure of the generalized imbedding space and identified in terms of a hierarchy of dark matters. At the level of conscious experience these hierarchies correspond to a hierarchy of conscious entities - selves: self experiences its sub-selves as mental images.

Fractal hierarchies mean a completely new element in the model for quantum computation. One implication is the decomposition of quantum computation into a fractal hierarchy of quantum computations: each quantum computation proceeds from longer to shorter time scales T_{n}=2^{-n}T_{0} as a cascade-like process such that at each level a large number of quantum computations is performed with various values of the input parameters defined by the output at the previous level. Under some additional assumptions to be discussed later, this hierarchy involves at a given level a large number of replicas of a given sub-module of tqc, so that the output of a single fractal sub-module automatically gives the probabilities for the various outcomes as required.

**B. Irreducible entanglement and possibility of quantum parallel quantum computation**

The basic distinction from standard measurement theory is irreducible entanglement, which is not reduced in the quantum jump.

**B.1 NMP and the possibility of irreducible entanglement**

Negentropy Maximization Principle states that entanglement entropy is minimized in the quantum jump. For the standard Shannon entropy this would lead to a final state which corresponds to a ray of the state space. If the entanglement probabilities are rational - or even algebraic - one can replace Shannon entropy with its number theoretic counterpart, in which the p-adic norm of the probability replaces the probability in the argument of the logarithm: log(p_{n}) → log(N_{p}(p_{n})). This entropy can have negative values. It is not quite clear whether the prime p should be chosen to maximize the number theoretic negentropy or whether p is the p-adic prime characterizing the light-like partonic 3-surface in question.
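As a minimal numerical sketch of the substitution above - the p-adic norm replacing the probability in the argument of the logarithm - one can compute the number theoretic entropy of a rational probability distribution directly. The function names are illustrative, and the choice of the prime p is left to the caller, reflecting the ambiguity just mentioned.

```python
from fractions import Fraction
from math import log

def p_adic_valuation(q: Fraction, p: int) -> int:
    """p-adic valuation v_p(q) of a nonzero rational q = a/b."""
    v, a, b = 0, q.numerator, q.denominator
    while a % p == 0:
        a //= p; v += 1
    while b % p == 0:
        b //= p; v -= 1
    return v

def p_adic_norm(q: Fraction, p: int) -> Fraction:
    """p-adic norm N_p(q) = p^(-v_p(q))."""
    return Fraction(p) ** (-p_adic_valuation(q, p))

def number_theoretic_entropy(probs, p: int) -> float:
    """S_p = -sum_n p_n log(N_p(p_n)); unlike Shannon entropy, can be negative."""
    return -sum(float(q) * log(p_adic_norm(q, p)) for q in probs)

# Maximal entanglement with probabilities 1/2, 1/2 and p = 2:
# N_2(1/2) = 2, so S_2 = -log 2 < 0, i.e. genuine negentropy.
S2 = number_theoretic_entropy([Fraction(1, 2), Fraction(1, 2)], p=2)
```

For primes not dividing the denominators of the probabilities the norm is 1 and the entropy vanishes, so only finitely many primes are candidates for maximizing the negentropy.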

Obviously NMP favors the generation of irreducible entanglement, which however can be reduced in the U process. Irreducible entanglement is something completely new, and the proposed interpretation is in terms of various kinds of conscious experiences with positive content such as understanding.

Quantum superposition of unitarily evolving quantum states generalizes to a quantum superposition of quantum jump sequences defining dissipative time evolutions. Dissipating quarks inside quantum coherent hadrons would provide a basic example of this kind of situation.

**B.2 Quantum parallel quantum computations and conscious experience**

The combination of quantum parallel quantum jump sequences with the fractal hierarchies of scales implies the possibility of quantum parallel quantum computations. In ordinary quantum computation halting selects a single computation, but in the present case an arbitrarily large number of computations can be carried out simultaneously at the various branches of the entangled state. The probability distribution for the outcomes is obtained using only a single computation.

One would have quantum superposition of space-time sheets (assignable to the maxima of Kähler function) each representing classically the outcome of a particular computation. Each branch would correspond to its own conscious experience but the entire system would correspond to a self experiencing consciously the outcome of computation as intuitive and holistic understanding, abstraction. Emotions and emotional intellect could correspond to this kind of non-symbolic representation for the outcome of computation as analogs for collective parameters like temperature and pressure.

**B.3 Delicacies**

There are several delicacies involved.

- The above argument works for factors of type I. For HFFs of type II_{1} the finite measurement resolution, characterized in terms of the inclusion N subset M, means that state function reduction takes place to an N-ray. There are good reasons to expect that the notion of number theoretic entanglement negentropy generalizes also to this case. Note that the entanglement associated with N is below the measurement resolution.
- In TGD inspired theory of consciousness irreducible entanglement makes possible sharing and fusion of mental images. At the space-time level the space-time sheets corresponding to selves are disjoint, but the space-time sheets topologically condensed at them are typically joined by what I call join along boundaries bonds, identifiable as braid strands (magnetic flux quanta). In topological quantum computation with finite measurement resolution this kind of entanglement with the environment would be below the natural resolution and would not be a problem.
- State function reduction means a quantum jump to an eigenstate of the density matrix. Suppose that the density matrix has rational elements. The number theoretic vision forces one to ask whether the quantum jump to an eigenstate is possible if the eigenvalues of ρ do not belong to the algebraic extension of rationals and p-adic numbers used. If not, then one would have number theoretically irreducible entanglement depending on the algebraic extension used. If the eigenvalues actually define the extension, there would be no restrictions: this option is definitely simpler.
- Fuzzy quantum logic (see this) also brings complications. What happens in the case of quantum spinors is that spin ceases to be an observable and one cannot reduce the state to spin up or spin down. Rather, one can measure only the eigenvalues of the probability operator for spin up (and thus for spin down), so that one has a fuzzy quantum logic characterized by a quantum phase. Inclusions of HFFs are characterized by quantum phases, and a possible interpretation is that the quantum parallelism related to the finite measurement resolution could give rise to fuzzy qubits. Also the number theoretic quantum parallelism implied by number theoretic NMP could effectively turn probabilities into operators. The probabilities for the various outcomes would correspond to the outcomes of quantum parallel state function reductions.

**C. Connes tensor product defines universal entanglement**

Both the time-like entanglement between quantum states with opposite quantum numbers, represented by the M-matrix, and space-like entanglement reduce to the Connes tensor product, dictated highly uniquely by the measurement resolution characterized by an inclusion of HFFs of type II_{1}.

**C.1 Time-like and space-like entanglement in zero energy ontology**

If hyper-finite factors of type II_{1} are all that is needed, then the Connes tensor product defines a universal S-matrix and the most general situation corresponds to a direct sum of them. The M-matrix for each summand is a product of a Hermitian square root of a density matrix and a unitary S-matrix, multiplied by a square root of a probability having an interpretation as an analog of a Boltzmann weight or a probability defined by a density matrix (note that it is essential to have Tr(Id)=1 for factors of type II_{1}). If factors of type I_{∞} are present, the situation is more complex. This means that quantum computations are highly universal and M-matrices are characterized by the inclusion N subset M in each summand defining the measurement resolution. Hermitian elements of N act as symmetries of the M-matrix. The identification of the reducible entanglement characterized by Boltzmann weight like parameters in terms of thermal equilibrium would allow one to interpret quantum theory as a square root of thermodynamics.
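The factorization of the M-matrix into a Hermitian square root of a density matrix times a unitary S-matrix can be sketched numerically in finite dimensions. The matrices below are toy stand-ins (a finite-dimensional illustration, not an HFF), but the defining property MM† = ρ holds by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Toy density matrix rho: Hermitian, positive, unit trace.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
rho = A @ A.conj().T
rho /= np.trace(rho).real

# Toy unitary S-matrix from the QR decomposition of a random complex matrix.
S, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

# Hermitian square root of rho via its spectral decomposition.
w, V = np.linalg.eigh(rho)
sqrt_rho = V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.conj().T

# M-matrix as "square root of thermodynamics": M = sqrt(rho) S.
M = sqrt_rho @ S

# Consistency: M M^dagger reproduces the density matrix, so Tr(M M^dagger) = 1.
assert np.allclose(M @ M.conj().T, rho)
```

The point of the sketch is that the thermodynamical content (ρ) and the dynamical content (S) of M are cleanly separated, which is what the square-root-of-thermodynamics interpretation requires.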

If the entanglement probabilities defined by the S-matrix and assignable to N rays do not belong to the algebraic extension used, then a full state function reduction is prevented by NMP. If the generalized Boltzmann weights are also algebraic, then also the thermal entanglement is irreducible. In p-adic thermodynamics for the Virasoro generator L_{0}, and using some cutoff for the conformal weights, the Boltzmann weights are rational numbers expressible using powers of the p-adic prime p.
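A minimal sketch of the last point: with weights proportional to p^{n} for conformal weights n up to a cutoff (degeneracies set to 1 for simplicity, which is an assumption of the sketch), the normalized Boltzmann weights are exact rationals built from powers of p.

```python
from fractions import Fraction

def padic_boltzmann_weights(p: int, cutoff: int):
    """Probabilities ~ p^n for conformal weights n = 1..cutoff,
    normalized by the partition function Z = sum_n p^n.
    All arithmetic is exact rational arithmetic."""
    weights = [Fraction(p) ** n for n in range(1, cutoff + 1)]
    Z = sum(weights)
    return [w / Z for w in weights]

# p = 2, cutoff 5: Z = 2 + 4 + 8 + 16 + 32 = 62.
probs = padic_boltzmann_weights(p=2, cutoff=5)
```

The exact rationality of the weights is what matters here: it is the precondition for the number theoretic negentropy discussed above to apply to thermal entanglement.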

**C.2 Effects of finite temperature**

Usually a finite temperature is seen as a problem for quantum computation. In the TGD framework the effect of a finite temperature is to replace the zero energy states, formed as pairs of positive and negative energy states, with a superposition in which the energy varies.

One has an ensemble of space-time sheets which should represent near replicas of the quantum computation. There are two cases to be considered.

- If the thermal entanglement is reducible, then each space-time sheet gives an outcome corresponding to a well defined energy and one must form an average over these outcomes.
- If the thermal entanglement is irreducible, each space-time sheet corresponds to a quantum superposition of space-time sheets, and if the outcome is represented classically as rates and temporal field patterns, it should reflect the thermal average of the outcomes as such.

If the degrees of freedom assignable to topological quantum computation do not depend on the energy of the state, the thermal width does not affect the relevant probabilities at all. Actually the probabilities are affected even in the case of tqc, since 1-gates are not purely topological and the effects of temperature in spin degrees of freedom are unavoidable. If T grows, the probability distribution for the outcomes flattens and it becomes difficult to select the desired outcome as the one appearing with maximal probability.
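The flattening effect is just the usual behavior of a Boltzmann distribution, which a short sketch makes concrete (the energy values are illustrative; nothing TGD-specific enters here):

```python
import math

def gibbs(energies, T):
    """Boltzmann probabilities exp(-E/T)/Z for a set of outcome energies."""
    ws = [math.exp(-E / T) for E in energies]
    Z = sum(ws)
    return [w / Z for w in ws]

energies = [0.0, 1.0, 2.0, 3.0]
cold = gibbs(energies, T=0.2)   # sharply peaked on the ground state
hot = gibbs(energies, T=10.0)   # nearly flat

# As T grows the maximal probability drops, so picking the desired
# outcome as the most probable one becomes harder.
assert max(hot) < max(cold)
```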

**D. Possible problems related to quantum computation**

At least the following problems are encountered in quantum computation.

- How to preserve quantum coherence for a sufficiently long time so that unitary evolution can be achieved?
- The outcome of the calculation is always a probability distribution: for instance, the output with maximum probability can correspond to the result of the computation. The problem is how to replicate the computation with a sufficient accuracy. Or more precisely, how to produce replicas of the hardware of the quantum computer defined in terms of classical physics?
- How to isolate the quantum computer from the external world during computation and despite this feed in the inputs and extract the outputs?

**D.1 The notion of coherence region in TGD framework**

In the standard framework one can speak about coherence in two senses. At the level of Schrödinger amplitudes one speaks about a coherence region inside which it makes sense to speak about Schrödinger time evolution. This notion is rather ill-defined.

In the TGD framework a coherence region is identifiable as a region inside which the modified Dirac equation holds true. Strictly speaking, this region corresponds to a light-like partonic 3-surface, whereas the 4-D space-time sheet corresponds to a coherence region for classical fields. The p-adic length scale hierarchy and the hierarchy of Planck constants mean that arbitrarily large coherence regions are possible.

The precise definition of the notion of coherence region and the presence of scale hierarchies imply that coherence in the case of a single quantum computation is not a problem in the TGD framework. The de-coherence time or coherence time corresponds to the temporal span of the space-time sheet, and a hierarchy coming in powers of two for a given value of Planck constant is predicted by basic quantum TGD. The p-adic length scale hypothesis and the favored values of Planck constant would naturally reflect this fundamental fractal hierarchy.

**D.2 De-coherence of density matrix and replicas of tqc**

A second phenomenological description boils down to the assumption that the non-diagonal elements of the density matrix in some preferred basis (involving spatial localization of particles) approach zero. The existence of more or less faithful replicas of the space-time sheet in a given scale allows one to identify the counterpart of this notion in the TGD context. De-coherence would mean a loss of information in the averaging of the M-matrix and density matrix associated with these space-time sheets.
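How averaging over imperfect replicas kills the off-diagonal elements can be sketched with the textbook phase-averaging mechanism; the identification of the ensemble members with space-time sheets is the TGD reading, while the code itself is standard decoherence arithmetic with an assumed random relative phase per replica.

```python
import numpy as np

rng = np.random.default_rng(1)

def pure_rho(phase):
    """Density matrix |psi><psi| of (|0> + e^{i phase}|1>)/sqrt(2)."""
    psi = np.array([1.0, np.exp(1j * phase)]) / np.sqrt(2)
    return np.outer(psi, psi.conj())

# Averaging the density matrix over an ensemble of replicas with
# uncorrelated phases drives the off-diagonal elements toward zero
# while the diagonal (the preferred basis) is untouched.
phases = rng.uniform(0, 2 * np.pi, 10_000)
rho_avg = np.mean([pure_rho(ph) for ph in phases], axis=0)

assert abs(rho_avg[0, 1]) < 0.05              # coherences washed out
assert np.allclose(np.diag(rho_avg).real, 0.5) # populations preserved
```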

Topological quantum computations are probabilistic. This means that one has a collection of space-time sheets such that each space-time sheet corresponds to more or less the same tqc and therefore the same M-matrix. If M is too random (within the limits allowed by the Connes tensor product), the analog of generalized phase information represented by its "phase" - the S-matrix - is useless.

In order to avoid de-coherence in this sense, the space-time sheets must be approximate copies of each other. Almost copies are expected to result by dissipation leading to asymptotic self-organization patterns depending only weakly on initial conditions and having also space-time correlate. Obviously, the role of dissipation in eliminating effects of de-coherence in tqc would be something new. The enormous symmetries of M-matrix, the uniqueness of S-matrix for given resolution and parameters characterizing braiding, fractality, and generalized Bohr orbit property of space-time sheets, plus dissipation give good hopes that almost replicas can be obtained.

**D.3 Isolation and representations of the outcome of tqc**

The interaction with the environment makes quantum computation difficult. In the case of topological quantum computation this interaction corresponds to the formation of braid strands connecting the computing space-time sheet with space-time sheets in the environment. The environment is four-dimensional in the TGD framework and an isolation in the time direction might be required. The space-time sheets responsible for replicas of tqc should not be connected by light-like braid strands having time-like projections in M^{4}.

The length scale hierarchy coming in powers of two and the finite measurement resolution might help considerably. Finite measurement resolution means that those strands which connect space-time sheets topologically condensed to the space-time sheets in question do not induce entanglement visible at this level and should not affect tqc in the resolution used.

Hence only the elimination of the strands responsible for tqc at a given level and connecting the computing space-time sheet to space-time sheets at the same level in the environment is necessary, and this would require magnetic isolation. Note that super-conductivity might provide this kind of isolation. This kind of elimination could involve the same mechanism as the initiation of tqc, which cuts the braid strands, so that initiation and isolation might be more or less the same thing.

The strands would reconnect after the halting of tqc and would make possible the communication of the outcome of the computation along the strands, say by using em currents in turn generating generalized EEG, nerve pulse patterns, gene expression, etc. Halting and initiation could thus be more or less synonymous with the isolation and the communication of the outcome of tqc.

**D.4 How to express the outcome of quantum computation?**

The outcome of quantum computation is basically a representation of the probabilities for the outcome of tqc. There are two representations for the outcome of tqc. The first is a symbolic representation, which quite generally is in terms of probability distributions represented in terms of "classical space-time" physics. Rates for various processes, having basically an interpretation as geometro-temporal densities, would represent the probabilities just as in the case of a particle physics experiment. For tqc in living matter this would correspond to gene expression, neural firing, EEG patterns, etc.

A representation as a conscious experience is another (and actually the ultimate) representation of the outcome. It need not have any symbolic counterpart since it is directly felt. Intuition, emotions and emotional intelligence would naturally relate to this kind of representation made possible by irreducible entanglement. This representation would be based on fuzzy qubits and would mean that the outcome is true or false only with a certain probability. This unreliability would be felt consciously.

In the proposed model of tqc the emergence of an EEG rhythm (say theta rhythm) and correlated firing patterns would correspond to the isolation during the first half period of tqc, and the random firing during the second half period to the subsequent tqcs at shorter time scales coming as negative powers of 2. The fractal hierarchy of time scales would correspond to a hierarchy of frequency scales for generalized EEG, and the power spectra at these scales would give information about the outcome of tqc. Synchronization would obviously be an essential element in this picture and could be understood in terms of the classical dynamics which defines the space-time surface as a generalized Bohr orbit.

Tqc would be analogous to the generation of a dynamical hologram or "conscious hologram" (see this). The EEG rhythm would correspond to the reference wave, and the contributions of spikes to EEG would correspond to the incoming wave interfering with it.

**D.5 How is data fed into the submodules of tqc?**

The scale hierarchy obviously gives tqc a fractal modular structure, and the question is how data is fed to the submodules at shorter length scales. There are certainly interactions between different levels of the scale hierarchy. The general ideas about the master-slave hierarchy assigned with self-organization support the hypothesis that these interactions are directed from longer to shorter scales and have an interpretation as a specialization of input data to tqc sub-modules represented by the smaller space-time sheets of the hierarchy. The call of a submodule would occur when the tqc of the calling module halts and the result of the computation is expressed as a 4-D pattern. The lower level module would start only after the halting of tqc (with respect to subjective time), and the durations of the resulting tqcs would come as T_{n}=2^{-n}T_{0} so that a geometric series of tqcs would become possible. There would be an entire family of tqcs at the lower level corresponding to different values of the input parameters from the calling module.
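The call structure just described - a module halts, then spawns a family of submodules of duration T_{n}=2^{-n}T_{0}, one per input parameter value produced by its output - can be sketched as a recursion. The function names and the toy "computation" are illustrative assumptions, not part of the model itself.

```python
def run_tqc_cascade(level, T0, max_level, compute, inputs):
    """Fractal tqc hierarchy sketch: a module of duration T_n = 2**-n * T0
    runs to completion, then calls one submodule at level n+1 per branch
    parameter appearing in its output."""
    T_n = T0 / 2 ** level
    output = compute(level, T_n, inputs)
    results = [(level, T_n, output)]
    if level < max_level:
        for branch_input in output:   # one submodule call per parameter value
            results += run_tqc_cascade(level + 1, T0, max_level,
                                       compute, [branch_input])
    return results

# Toy computation: each module turns every input x into two branch
# parameters 2x and 2x+1 for the next level.
toy = lambda level, T, inputs: [2 * x for x in inputs] + [2 * x + 1 for x in inputs]

trace = run_tqc_cascade(0, T0=1.0, max_level=2, compute=toy, inputs=[1])
# Durations halve with each level: 1.0 at level 0, 0.5 at level 1, 0.25 at level 2.
```

The recursion makes the "geometric series" point explicit: the total duration of an infinite cascade would be bounded by T_{0} · Σ 2^{-n} = 2T_{0}.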

**D.6 The role of dissipation and energy feed**

Dissipation plays a key role in the theory of self-organizing systems. Its role is to serve as a Darwinian selector. Without an external energy feed the outcome is a situation in which all organized motions disappear. In the presence of an energy feed highly unique self-organization patterns depending very weakly on the initial conditions emerge.

In the case of tqc one function of dissipation would be to drive the braidings to static standard configurations, prevent over-braiding, and perhaps even effectively eliminate fluctuations in non-topological degrees of freedom. Note that magnetic fields are important for 1-gates. Magnetic flux conservation however saves the magnetic fields from dissipation.

An external energy feed is needed in order to generate new braidings. In the proposed model of cellular tqc the flow of intracellular water induces the braiding and requires an energy feed. Also now dissipation would drive this flow to the standard patterns coding for tqc programs. Metabolic energy would also be needed in order to control whether the lipids can move or not by generating cis type unsaturated bonds.

For the model of DNA as topological quantum computer see the chapter DNA as Topological Quantum Computer of "Genes and Memes".
