https://matpitka.blogspot.com/2024/12/has-google-managed-to-reach-critical.html

Monday, December 16, 2024

Has Google managed to reach the critical value for the error rate of a single qubit?

Google claims to have achieved something marvellous with its quantum computer, called Willow. This claim is, however, combined with a totally outlandish claim about parallel universes created in quantum computers, and this has generated a lot of cognitive dissonance among professionals during the last week. They have not yet forgotten the earlier, equally absurd claim about the creation of wormholes in quantum computers.

The Quanta Magazine article "Quantum Computers Cross Critical Error Threshold" (see this) tells what has been achieved but does not resolve the cognitive dissonance. I already commented on the claims of Google in a blog posting (see this).

Now I have encountered an excellent article "Ask Ethan: Does quantum computation occur in parallel universes?" (see this), which thoroughly analyzes the basics of quantum computation and what Google has achieved. I recommend it to anyone seriously interested in quantum computation.

The really fantastic achievement is the ability to reduce the error rate of the physical qubits forming the grid defining the logical qubit below the critical value of 0.1 percent, which guarantees that for larger grids of physical qubits the error rate decreases exponentially. This achievement is more than enough! But why do they claim that this implies parallel universes? This claim is totally absurd and leads me to ask whether the claimed achievement is really true. How can one trust professionals who do not seem to understand the basic notions of quantum mechanics?
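To see what the threshold claim means quantitatively, consider a minimal sketch in Python using the standard surface-code heuristic eps_d ≈ A(p/p_th)^((d+1)/2) for a distance-d code: once the physical error rate p is below the threshold p_th, each increase of the code distance suppresses the logical error rate by a roughly constant factor. The values of p, p_th and A below are illustrative, not Google's measured numbers.

```python
# Minimal sketch of error suppression below threshold, using the
# standard surface-code heuristic eps_d ≈ A * (p/p_th)^((d+1)/2).
# The rate p, threshold p_th and prefactor A are illustrative values,
# not Google's measured numbers.

def logical_error_rate(p: float, p_th: float, d: int, A: float = 0.1) -> float:
    """Heuristic logical error rate of a distance-d surface code."""
    return A * (p / p_th) ** ((d + 1) / 2)

p, p_th = 1e-3, 1e-2  # assumed physical error rate and threshold
for d in (3, 5, 7, 9):
    print(f"d = {d}: logical error rate ≈ {logical_error_rate(p, p_th, d):.2e}")
```

With these numbers each step in code distance suppresses the logical error rate by a factor p_th/p = 10; above threshold (p > p_th) the same formula shows the error rate growing with grid size instead.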

Taking the basic claim seriously, one can of course ask whether such a low error rate is actually possible in standard quantum mechanics or whether it requires new physics. These qubits are rather stable, but are they this stable in standard QM?

I have been talking about this kind of new physics for two decades now. This new physics would play a key role in quantum biology and could be important also in condensed matter physics and even in chemistry. It is implied by the predicted hierarchy of effective Planck constants heff labelling phases of ordinary matter with quantum scales scaled up by heff/h. This makes possible quantum coherence in long temporal and spatial scales, can reduce the error rate, and could provide a solution to the basic problems listed in the article. The latest proposal along these lines is a proposal for how classical computers and quantum computers could be fused into what might be regarded as conscious computers sharing several life-like features with biomatter (see this). The situation is now different since the temperature is very low and the chip is superconducting.

One learns from the video describing the Willow chip (see this) that the lifetime of a logical qubit is T ≈ 100 μs. This time is surprisingly long: can one really understand it in ordinary quantum mechanics? One can try to do so in the TGD framework.

  1. The energy of a qubit flip must be as small as possible but above the thermal energy. Energy economics suggests that the Josephson energy E = ZeV of electrons in a Josephson junction is above the thermal energy at the temperatures considered, but not much larger. For superconducting quantum computers (see this) the temperature is about 10⁻² K, which corresponds to an energy scale of μeV.
  2. The formula f = ZeV/heff for the Josephson frequency gives a rough estimate for the quantum coherence time of a superconducting qubit as T = heff/ZeV. For heff = h this gives T ≈ 3 ns for the quantum coherence time of a single qubit. The value heff ≈ 3.3×10⁴ would be needed to increase T from the naive estimate T = 3 ns to the required T = 100 μs (see the numerical check after this list).

    I have proposed that these relatively small values of heff (as compared to the values of the gravitational Planck constant) can appear in electrically charged systems. The general criterion, applying to all interactions, is that the value of heff is such that the perturbation series in powers of, say, Z1Z2e²/ℏeff for the electromagnetic interactions of charges Z1 and Z2 converges.

    In the recent case, the value of heff could correspond to the electric counterpart of the gravitational Planck constant, having the form ℏem = Z1Z2e²/β0, where β0 = v0/c is a velocity parameter (see this). Z1 could correspond to a large charge and Z2 to a small charge, say that of a Cooper pair. For instance, DNA, having a constant charge density per unit length, would have a rather large value of ℏem. The presence of an electronic Cooper pair condensate could give rise to the needed large electric charge making possible the required value ℏeff = ℏem ≈ 3.3×10⁴ℏ (a numerical sketch follows below).
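As a sanity check of the numbers in items 1 and 2, the following Python snippet computes the thermal energy scale at 10 mK, the naive coherence-time estimate T = heff/ZeV for heff = h, and the value of heff/h needed to reach the observed T = 100 μs. The Josephson energy is an illustrative choice at the μeV scale.

```python
# Sanity check of the estimates in items 1 and 2 above. All inputs
# except the physical constants are illustrative assumptions.

K_B = 8.617e-5   # Boltzmann constant in eV/K
H = 4.136e-15    # Planck constant in eV*s

T_chip = 1e-2              # chip temperature in K (about 10 mK)
E_thermal = K_B * T_chip   # thermal energy, ≈ 0.86 μeV

E_J = 1.4e-6               # assumed Josephson energy ZeV in eV (μeV scale)
tau = H / E_J              # naive coherence time T = heff/ZeV with heff = h

heff_ratio = 100e-6 / tau  # heff/h needed to stretch tau to 100 μs

print(f"thermal energy at 10 mK: {E_thermal / 1e-6:.2f} μeV")
print(f"naive coherence time:    {tau * 1e9:.1f} ns")
print(f"required heff/h:         {heff_ratio:.1e}")
```

The output reproduces the numbers quoted above: a thermal scale of about 0.9 μeV, a naive coherence time of about 3 ns, and a required heff/h of about 3.4×10⁴.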
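One can also invert the formula ℏem = Z1Z2e²/β0 to ask how large the charge Z1 must be. The sketch below assumes the Gaussian-units reading e² = αℏc, so that ℏem/ℏ = Z1Z2α/β0; the conventions used in the TGD articles may differ by factors of order 4π, and the values Z2 = 2 (a Cooper pair) and β0 = 1 are my own assumptions.

```python
# Sketch of the hbar_em estimate. Assumes the Gaussian-units reading
# e^2 = alpha*hbar*c, so that hbar_em/hbar = Z1*Z2*alpha/beta0; the
# convention in the TGD articles may differ by factors of order 4*pi.

ALPHA = 1 / 137.036  # fine-structure constant

def hbar_em_ratio(Z1: float, Z2: float, beta0: float = 1.0) -> float:
    """hbar_em/hbar for charges Z1, Z2 and velocity parameter beta0."""
    return Z1 * Z2 * ALPHA / beta0

# Solve for the large charge Z1 giving heff/h ≈ 3.3e4, taking
# Z2 = 2 (a Cooper pair) and beta0 = 1 (both assumptions).
target, Z2, beta0 = 3.3e4, 2, 1.0
Z1 = target * beta0 / (Z2 * ALPHA)
print(f"required Z1 ≈ {Z1:.2e} elementary charges")
print(f"check: hbar_em/hbar ≈ {hbar_em_ratio(Z1, Z2, beta0):.2e}")
```

With these assumptions the large charge would be Z1 ≈ 2×10⁶ elementary charges, which is why a macroscopic charge reservoir such as a Cooper pair condensate would be needed.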

See the article Has Google managed to reach the critical value for the error rate of a single qubit? or the chapter Are Conscious Computers Possible in TGD Universe?.

For a summary of earlier postings see Latest progress in TGD.

For the lists of articles (most of them published in journals founded by Huping Hu) and books about TGD see this.
