### About negentropic entanglement as analog of an error correction code

In classical computation, the simplest way to control errors is to keep several copies of the bit sequences. In the quantum case the no-cloning theorem prevents this. Quantum error correcting codes (https://en.wikipedia.org/wiki/Quantum_error_correction) code n information qubits into the entanglement of N>n physical qubits. Additional constraints represent the state space of the n qubits as a lower-dimensional subspace of the state space of the N qubits. This redundant representation is analogous to the use of parity bits. The failure of a constraint to be satisfied signals that an error is present and also reveals its character. This makes automatic correction possible if the error is simple enough, such as a change of the phase of the spin state or a spin flip.
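As an illustration of how a violated constraint reveals both the presence and the character of an error, the standard three-qubit bit-flip code can be sketched in a few lines of numpy (a textbook example, not specific to the present discussion; variable names are mine):

```python
import numpy as np

# Single-qubit operators
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def kron(*ops):
    """Tensor product of a list of operators."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Encode a|0> + b|1>  ->  a|000> + b|111>  (n=1 logical qubit in N=3 physical qubits)
a, b = 0.6, 0.8
psi = np.zeros(8)
psi[0], psi[7] = a, b

# Introduce a bit flip on qubit 1 (the leftmost factor)
err = kron(X, I2, I2)
psi_err = err @ psi

# The constraints are the stabilizers Z1Z2 and Z2Z3; their eigenvalues (+-1)
# form the error syndrome without disturbing the encoded amplitudes.
s1 = psi_err @ kron(Z, Z, I2) @ psi_err
s2 = psi_err @ kron(I2, Z, Z) @ psi_err
print(s1, s2)  # syndrome (-1, +1) points to a flip on qubit 1

# Correct by re-applying X on the flagged qubit
corrected = err @ psi_err
print(np.allclose(corrected, psi))  # True
```

The syndrome values identify which qubit flipped, so the correction is automatic, exactly as described above.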

Negentropic entanglement (NE) obviously gives rise to a strong reduction in the number of states of the tensor product. Consider a system consisting of two entangled subsystems with N_{1} and N_{2} spins. Without any constraints the number of states in the state basis is 2^{N1}× 2^{N2} and one has N_{1}+N_{2} qubits. The elements of the entanglement matrix can be written as E_{A,B}, A= ⊗_{i=1}^{N1} (m_{i},s_{i}), B= ⊗_{k=1}^{N2} (m_{k},s_{k}), in order to make the tensor product structure manifest. For simplicity one can consider the situation N_{1}=N_{2}=N.

The un-normalized general entanglement matrix is parametrized by 2× 2^{2N} independent real numbers, with each spin contributing two degrees of freedom. A unitary entanglement matrix is characterized by 2^{2N} real numbers. One might perhaps say that one has 2N real bits instead of almost 2N+1 real qubits. If the time evolution according to ZEO respects the negentropic character of the entanglement, the sources of errors are reduced dramatically.
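The parameter counting above is easy to check numerically; the function below is my own illustration (the unitary count uses the standard fact that dim U(d) = d^2 real parameters):

```python
def real_parameter_counts(N):
    """Real parameters of a 2^N x 2^N entanglement matrix E (N1 = N2 = N)."""
    d = 2 ** N
    general = 2 * d * d   # arbitrary complex d x d matrix: 2 * 2^(2N) real numbers
    unitary = d * d       # dim U(d) = d^2 = 2^(2N): the condition U^dag U = 1
                          # imposes d^2 real constraints on the 2 d^2 real entries
    return general, unitary

for N in (1, 2, 3):
    print(N, real_parameter_counts(N))
```

For N = 2 this gives 32 real parameters for a general matrix versus 16 for a unitary one, in line with the halving claimed in the text.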

The challenge is to understand what kind of errors NE eliminates and how the information bits are coded by it. NE is respected if the errors act as unitary transformations E→ UEU^{†} of the unitary entanglement matrix. One can consider two interpretations.

- The unitary automorphisms leave the information content unaffected only if they commute with E. In this case unitary automorphisms acting non-trivially would give rise to genuine errors, and an error correction mechanism would be needed and would be coded into the quantum computer program.

- One can also consider the possibility that the unitary automorphisms *do not affect* the information content, so that the diagonal form of the entanglement matrix, coded by N phases, would carry the information. Clearly, the unitary automorphisms would act like gauge transformations. Nature would take care that no errors emerge. Of course, more dramatic things are in principle allowed by NMP: for instance, the unitary entanglement matrix could reduce to a tensor product of several unitary matrices. Negentropy could be transferred from the system, and is indeed transferred as the computation halts.

By number theoretic universality the diagonalized entanglement matrix would be parametrized by N roots of unity, each having n possible values, so that n^{N} different NEs would be obtained and the information storage capacity would be I = log(n)/log(2) × N bits; for n=2^{k} one would have k× N bits. Powers of two for n are favored. If the overall phase of E does not matter, as one expects, the number of full bits is k× N-1. Clearly the option for which only the eigenvalues of E matter looks the more attractive realization of entanglement matrices.
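A minimal sketch of this capacity count (function name is mine): N eigenvalue phases, each an n-th root of unity, give n^N distinct configurations and I = N × log(n)/log(2) bits:

```python
import math

def ne_capacity(N, n):
    """Configurations and bit capacity of a diagonal entanglement matrix
    with N eigenvalue phases, each an n-th root of unity."""
    configurations = n ** N
    bits = N * math.log(n, 2)   # I = log(n)/log(2) * N
    return configurations, bits

print(ne_capacity(4, 8))  # n = 2^3, N = 4: 8^4 = 4096 configurations, 12.0 bits
```

For n = 2^k this reduces to k×N bits, as stated above.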

In fact, the Fermat polygons, for which the cosine and sine of the angle defining the polygon are expressible by iterating square roots besides basic arithmetic operations on rationals (ruler and compass construction geometrically), correspond to integers which are products of a power of two and of distinct Fermat primes F_{n}=2^{2^n}+1.
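This Gauss-Wantzel characterization can be checked with a few lines of Python (the function is my own illustration; only five Fermat primes are known):

```python
# Known Fermat primes F_n = 2^(2^n) + 1 for n = 0..4
FERMAT_PRIMES = [2 ** (2 ** n) + 1 for n in range(5)]  # 3, 5, 17, 257, 65537

def is_constructible_polygon(m):
    """Gauss-Wantzel: the regular m-gon is ruler-and-compass constructible
    iff m is a power of two times a product of *distinct* Fermat primes."""
    if m < 3:
        return False
    while m % 2 == 0:           # strip the power of two
        m //= 2
    for p in FERMAT_PRIMES:
        if m % p == 0:
            m //= p
            if m % p == 0:      # a repeated Fermat prime is not allowed
                return False
    return m == 1

print([m for m in range(3, 21) if is_constructible_polygon(m)])
# [3, 4, 5, 6, 8, 10, 12, 15, 16, 17, 20]
```

Note that 7, 9 = 3^2, 11, 13 and 14 are excluded, since 7, 11 and 13 are not Fermat primes and 9 repeats the Fermat prime 3.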

- In the TGD framework number theoretical universality requires discretization in terms of an algebraic extension of rationals. This is not performed at space-time level but for the parameters characterizing space-time surfaces at the level of WCW. Strong form of holography is also essential and allows one to consider partonic 2-surfaces and string world sheets as basic objects. Number theoretical universality (adelic physics) forces a discretization of phases, and the number theoretically allowed phases are roots of unity defined by some algebraic extension of rationals. Discretization can also be interpreted in terms of finite measurement resolution. Notice that the condition that roots of unity are in question realizes finite measurement resolution in the sense that errors have a minimum size and are thus detectable.

- Hierarchy of quantum criticalities corresponds to a fractal inclusion hierarchy of isomorphic sub-algebras of the super-symplectic algebra acting as conformal gauge symmetries. The generators in the complement of this algebra can act as dynamical symmetries affecting the physical states. An infinite hierarchy of gauge symmetry breakings is the outcome, and the weakening of measurement resolution would correspond to the reduction in the size of the broken gauge group. The hierarchy of quantum criticalities is accompanied by the hierarchy of measurement resolutions and the hierarchy of effective Planck constants h_{eff}=n× h.

- These hierarchies are argued to correspond to the hierarchy of inclusions for hyperfinite factors of type II_{1} labelled by quantum phases and quantum groups. Inclusion defines finite measurement resolution since the included sub-algebra does not induce observable effects on the state. By McKay correspondence the hierarchy of inclusions is accompanied by a hierarchy of simply laced Lie groups, which get bigger as one climbs up in the hierarchy. Their interpretation as genuine gauge groups does not make sense since their sizes should be reduced. An attractive possibility is that these groups are factor groups G/H such that the normal subgroup H (necessarily so) is the gauge group and indeed gets smaller, and G/H is the dynamical group identifiable as the simply laced group which gets bigger. This would require that both G and H are infinite-dimensional groups.

An interesting question is how they relate to the super-symplectic group assignable to "light-cone boundary" δ M^{4}_{+/-}× CP_{2}. I have proposed this interpretation in the context of WCW geometry earlier.

- Here I have spoken only about dynamical symmetries defined by discrete subgroups of simply laced groups. I have earlier considered the possibility that discrete symmetries provide a description of finite resolution, which would be equivalent to the quantum group description.

Gauge invariance has turned out to be a fundamental symmetry principle, and one can ask whether unitary entanglement matrices, assuming that only the eigenvalues matter, could give rise to a simulation of discrete gauge theories. Could the reduction of the information to that provided by the diagonal form be interpreted as an analog of gauge invariance?

- The hierarchy of inclusions of hyper-finite factors of type II_{1} strongly suggests a hierarchy of effective gauge invariances characterizing measurement resolution, realized in terms of a hierarchy of normal subgroups and dynamical symmetries realized as coset groups G/H. Could these effective gauge symmetries allow one to realize unitary entanglement matrices invariant under these symmetries?

- A natural parametrization for single-qubit errors is as rotations of the qubit. If the error acts as a rotation on *all* qubits, rotational invariance of the entanglement matrix defining the analog of an S-matrix is enough to eliminate the effect on information processing.

Quaternionic unitary transformations act on qubits as unitary rotations. Could one assume that complex numbers as the coefficient field of QM is effectively replaced with quaternions? If so, the multiplication by unit quaternion for states would leave the physics and information content invariant just like the multiplication by a complex phase leaves it invariant in the standard quantum theory.

One could consider the possibility that quaternions act as a discretized version of local gauge invariance affecting the information qubits, thus reducing further their number and thereby also errors. This requires the introduction of the analog of a gauge potential and the coding of quantum information in terms of SU(2) gauge invariants. In the discrete situation the gauge potential would be replaced with non-integrable phase factors along the links of a lattice, as in lattice gauge theory. In the TGD framework the links would correspond to the fermionic strings connecting partonic 2-surfaces carrying the fundamental fermions at string ends as point-like particles. Fermionic entanglement is indeed between the ends of these strings.

- Since entanglement is multilocal and quantum groups accompany the inclusion, one cannot avoid the question whether the Yangian symmetry crucial for the formulation of quantum TGD [twistorstory] could be involved.
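As a sanity check on the quaternionic rotation picture above, one can map a unit quaternion to an SU(2) matrix and verify that the two-qubit singlet state is invariant under the same rotation applied to all qubits, so a global rotational "error" acts trivially (a standard fact, sketched here with numpy; the function name is mine):

```python
import numpy as np

def su2_from_quaternion(a, b, c, d):
    """Map a quaternion a + b*i + c*j + d*k (normalized here) to SU(2)."""
    n = np.sqrt(a*a + b*b + c*c + d*d)
    a, b, c, d = a/n, b/n, c/n, d/n
    return np.array([[a + 1j*b,  c + 1j*d],
                     [-c + 1j*d, a - 1j*b]])

U = su2_from_quaternion(1.0, 2.0, -0.5, 0.3)
assert np.allclose(U.conj().T @ U, np.eye(2))   # unitary
assert np.isclose(np.linalg.det(U), 1.0)        # det = 1, i.e. SU(2)

# Two-qubit singlet (|01> - |10>)/sqrt(2): invariant under U (x) U
singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
rotated = np.kron(U, U) @ singlet
print(np.allclose(rotated, singlet))  # True: a rotation of *all* qubits acts trivially
```

The invariance follows from (U⊗U)|singlet⟩ = det(U)|singlet⟩, which equals the state itself for det(U) = 1; this is the simplest instance of the rotational invariance invoked for the entanglement matrix.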

For a summary of earlier postings see Links to the latest progress in TGD.

## 2 Comments:

See also:

https://en.wikipedia.org/wiki/Byzantine_fault_tolerance

https://en.wikipedia.org/wiki/Quantum_Byzantine_agreement

Thank you for the link.
