The Carnot engine is the standard practical application of the second law. One has two systems A and B, each in internal thermal equilibrium but at different temperatures TA and TB ≥ TA. By the second law heat Q flows from the hotter system B to the colder system A, and Carnot's engine transforms some of this heat to work. Carnot's law gives an upper bound for the efficiency of the engine: η = W/Q ≤ (TB-TA)/TB. The possibility to transform information to work forces a generalization of Carnot's law.
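Just as a numerical illustration (the numbers are of course arbitrary): for TA = 300 K and TB = 600 K the bound gives η ≤ (600-300)/600 = 1/2, so at most half of the heat extracted from the hot reservoir can be turned into work.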
Since information is basically conscious information, this generalization is highly interesting from the point of view of quantum theories of consciousness and quantum biology. Certainly the generalization is highly non-trivial, especially in the standard physics framework, where only entropy is defined at the fundamental level; it is regarded as ensemble entropy and has very little to do with conscious information. Therefore the argumentation is a kind of artwork.
1. Maxwell's demon in its original form
Maxwell's demon appears in a thought experiment in which one considers a system consisting of two volumes A and B of gas in thermal equilibrium at the same temperature. At a small hole in the boundary between A and B sits a demon checking whether a molecule coming from A has velocity above some threshold: if so, it allows the molecule to go to B. The demon also monitors the molecules coming from B, and if the velocity is below the threshold it allows the molecule to continue to A. As a consequence, temperature and pressure differences develop between A and B. The pressure difference can do work, very much like the voltage between the cathode and anode of a battery. One can indeed add a tube, analogous to a wire, between the ends of the entire system, and the pressure difference causes a flow of mass doing work: one has a pump. A toy simulation of the sorting protocol is sketched below.
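To see concretely how the velocity-threshold sorting builds up a temperature difference, one can write a minimal Monte Carlo sketch. The following toy code is only my own illustration of the thought experiment: one-dimensional velocities, arbitrary units with kB = m = 1, and an arbitrary threshold.

import random, math

def demon_sort(n=20000, temperature=1.0, threshold=1.2, steps=200000, seed=0):
    # Toy 1-D Maxwell demon: sort molecules between volumes A and B by a
    # velocity threshold and compare the mean kinetic energies (kB = m = 1).
    rng = random.Random(seed)
    sigma = math.sqrt(temperature)            # thermal velocity spread
    A = [rng.gauss(0.0, sigma) for _ in range(n // 2)]   # both volumes start
    B = [rng.gauss(0.0, sigma) for _ in range(n // 2)]   # at the same temperature
    for _ in range(steps):
        if rng.random() < 0.5 and A:          # a molecule from A arrives at the hole
            i = rng.randrange(len(A))
            if abs(A[i]) > threshold:         # fast: the demon lets it into B
                B.append(A.pop(i))
        elif B:                               # a molecule from B arrives at the hole
            i = rng.randrange(len(B))
            if abs(B[i]) < threshold:         # slow: the demon lets it into A
                A.append(B.pop(i))
    def mean_E(side):                         # mean kinetic energy per molecule
        return sum(0.5 * v * v for v in side) / len(side)
    return len(A), mean_E(A), len(B), mean_E(B)

nA, eA, nB, eB = demon_sort()
print("A:", nA, "molecules, <E> =", round(eA, 3))
print("B:", nB, "molecules, <E> =", round(eB, 3))

With these arbitrary parameters the mean kinetic energy per molecule in B ends up clearly above that in A, as it should: B collects the fast molecules and A the slow ones.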
The result is in conflict with the second law and one can ask what goes wrong. From the Wikipedia article one learns that a lot of arguments have been presented pro and con Maxwell's demon. A biologist might answer immediately: the demon must measure the states of the molecules, and this requires cognition and memory, which in turn require metabolic energy. When one takes this into account, the paradox should disappear and the second law should remain true in a generalized form taking into account the needed metabolic energy.
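The standard quantitative form of this answer is Landauer's principle: erasing one bit of the demon's memory dissipates at least kBT ln2 of heat, which is precisely the maximal work kBT ln2 obtainable per measured bit, so that the second law survives once the demon's memory is included in the bookkeeping.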
2. Experimental realization of Maxwell's demon
The popular article describes an experiment actualizing Maxwell's demon carried out by Govind Paneru, Dong Yun Lee, Tsvi Tlusty, and Hyuk Kyu Pak. Below is the abstract of the article Lossless Brownian Information Engine published in Physical Review Letters (see this).
We report on a lossless information engine that converts nearly all available information from an error-free feedback protocol into mechanical work. Combining high-precision detection at a resolution of 1 nm with ultrafast feedback control, the engine is tuned to extract the maximum work from information on the position of a Brownian particle. We show that the work produced by the engine achieves a bound set by a generalized second law of thermodynamics, demonstrating for the first time the sharpness of this bound. We validate a generalized Jarzynski equality for error-free feedback-controlled information engines.
Unfortunately, the article is behind a paywall and I failed to find it in the arXiv. The popular article uses notions like "particle trapped by light at room temperature" and photodiode as "light trap" without really defining what these expressions mean. For instance, it is said that the light trap would follow particles moving in a definite direction (from A to B in Maxwell's thought experiment). I must admit that I am not at all sure what the precise meaning of this statement is.
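For orientation: the generalized second law and the generalized Jarzynski equality mentioned in the abstract are presumably of the Sagawa-Ueda type for feedback-controlled systems (I have not been able to check the paper itself). In that formulation one has

<exp[-(W-ΔF)/kBT - I]> = 1 , implying <W> ≥ ΔF - kBT <I> ,

where W is the work done on the particle, ΔF the free energy change, and I the information gained in the measurement. For a cyclic engine (ΔF = 0) the extracted work is bounded by kBT<I>: information plays the same role as the temperature difference in Carnot's formula.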
3. TGD view about the situation
TGD inspired theory of consciousness can be regarded as a quantum measurement theory based on zero energy ontology (ZEO) and it is interesting to try to analyze the experiment in this conceptual framework.
3.1 TGD view about the experiment
The natural quantum interpretation is that the photodiode following the particle is performing repeated quantum measurements, which in standard quantum theory do not affect the state of the particle after the first measurement. From the viewpoint of TGD inspired consciousness, which can be regarded as a generalization of quantum measurement theory forced by zero energy ontology (ZEO), the situation could be as follows.
- The photodiode following the particle would be like a conscious entity directing its attention to the particle and keeping it in focus. In the TGD Universe directed attention has as classical space-time correlates flux tubes connecting the attendee and the target of attention: in ER-EPR correspondence the flux tubes are replaced with wormholes, which suit the GRT based framework better. Flux tubes also make possible entanglement between attendee and target. The two systems become a single system during the period of attention and one could say that the attention separates the particle from the rest.
- Directed attention costs metabolic energy. The same would be true also now: the photodiode indeed requires an energy feed. Directed attention creates a mental image, and the conscious entity associated with the mental image can be regarded as a generalized Zeno effect or as a sequence of weak measurements.
Tracking would thus mean that the particle's momentum is measured repeatedly so that the particle is forced to continue with the same momentum (the standard Zeno estimate for this freezing is recalled after this list). Gradually this would affect the thermal distribution and generate temperature and pressure gradients. Directed attention could also be seen as a mechanism of volition in quantum biology.
- This looks nice but one can ask what about the collisions of the particle with the other molecules of the gas: don't they interfere with the Zeno effect? If the period between repeated measurements is shorter than the average time between the collisions of particles, this is not a problem. But is there any effect in this case? The directed attention or a sequence of quantum measurements could separate the particle from the environment by de-entangling it from the environment. Could it be that collisions would not occur during this period so that attendee and target would form a subsystem de-entangled from the rest of the world?
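To recall the standard Zeno estimate (stated here only as a reminder of the textbook result): if the momentum of the particle is measured n times during a time interval t, the probability that the particle is still found in the originally measured eigenstate behaves as

P(n) ≈ [1 - (ΔH t/(n hbar))^2]^n → 1 as n → ∞ ,

where ΔH is the energy uncertainty of the state. Sufficiently dense repeated measurements therefore indeed freeze the particle to its measured momentum between collisions.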
Zero energy ontology (ZEO) forces one to consider a different arrangement producing energy somewhat like a perpetuum mobile but not breaking the conservation of energy in any obvious manner. The idea pops into my mind occasionally and I reject it every time and will do so again.
- Zero energy states (ZESs) are like physical events: pairs of positive and negative energy states with energies E and -E: this codes for energy conservation.
- One can have a quantum superposition of ZESs with different values of the energy E and with an average value <E> of energy. In state function reduction <E> can change and in principle this does not break conservation of energy since one still has a superposition of pairs with energies E and -E.
- For instance, the probabilities for states with energy E could be given by a thermal distribution parameterized by a temperature parameter T: one would have the "square root" of the thermodynamic distribution for energies (a schematic formula is written out after this list). The "square root" of thermodynamics is indeed forced by ZEO. One would have essentially entanglement in the time direction. Single particle states would realize the square root of a thermodynamical ensemble, which would not be a fictive notion anymore.
The coefficients of the state pairs would also have phases, and these phases would bring in something new and very probably very important in living matter. A system characterized by temperature T would not be as uninteresting as we think: there could be hidden phase information.
In ordinary thermodynamics energy should be fed from the environment to increase <E>: how would the environment enter the game now?
- State function reduction always reduces the entanglement of the system S with its environment, call it Senv. Could the increase of <E> be compensated by an equal decrease of <E> in Senv? Indeed, the conservation of energy for a single state is expected to have a statistical counterpart: energy would come from the environment as a kind of metabolic energy. Therefore also the "square root" of thermodynamics would prevent a perpetuum mobile.
- This would be the case if the reduction measures the energy of the entire system Stot = S + Senv - so that Stot is always in an energy eigenstate with eigenvalue Etot, and Etot does not change in reductions nor in the unitary evolutions between them. Can one pose this condition?
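Schematically, and only as my own shorthand for the above, the "square root of thermodynamics" would correspond to a zero energy state of the form

|Ψ> = Σ_E p(E)^{1/2} exp(iφ_E) |E> ⊗ |-E> , p(E) = exp(-E/T)/Z (kB = 1) ,

where the first factor of each pair is the positive energy state and the second the negative energy state, Z normalizes the probabilities, and the phases φ_E carry the extra information absent in ordinary thermodynamics. <E> = Σ_E p(E) E can change in a state function reduction without breaking energy conservation for any individual pair, and the compensation discussed above would fix the corresponding change in Senv.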
ZEO based theory of consciousness (see this) forces one to also consider a genuine breaking of the second law.
- In ZEO the self as a conscious entity corresponds to a generalized Zeno effect or, equivalently, to a sequence of analogs of weak measurements as "small" state function reductions. The state at the passive boundary of the causal diamond (CD) is unaffected, as are the members of the state pairs at it.
The second boundary of the CD (the active boundary) shifts farther away from the passive one and the members of the state pairs at it change, giving rise to the conscious experience of the self. Clock time, identified as the temporal distance between the tips of the CD, increases. This gives rise to the correspondence between clock time and subjective time identified as the sequence of weak reductions.
- Also "large" state function reductions are possible and also unavoidable. The roles of active and passive boundary are changed and time reversal occurs for the clock time. One can say that self dies and re-incarnates as a time-reversed self.
At the next re-incarnation a self with the original arrow of clock time would be reborn and continue life from a time value shifted towards the future from the moment of death: its identity as a physical system could however be very different. One can of course wonder whether sleep could mean a life in the opposite direction of clock time and waking up a reincarnation in the usual sense.
The time-reversed self need not have conscious memories about its former life cycle: only the collections of un-entangled subsystems at the passive boundary carry information about this period. A continuation of conscious experience could however take place in a different sense: the contents of consciousness associated with the magnetic body of the self could survive the death, as near-death experiences indeed suggest.
- The time-reversed system obeys the second law but with the time direction opposite to the normal one. Already the Italian physicist Fantappie proposed that this occurs routinely in living matter and christened the entropy of time-reversed systems syntropy. Processes like the spontaneous assembly of complex molecules from their building bricks could be controlled by time-reversed selves.
In TGD inspired biology motor actions could be seen as the generation of a signal propagating backwards in time, defining a sub-system with a reversed arrow of time and inducing the activity preceding the motor action before the conscious decision leading to it is made: this with respect to geometric time. There are many effects supporting the occurrence of these time reversals.
- How does the possibility of time reversals relate to the second law? One might argue that the second law emerges from the non-determinism of state function reduction alone. The second law would transform to its temporal mirror image when one looks at the system from outside with an unchanged arrow of clock time.
But does the second law continue to hold in a statistical sense as one takes the average over several incarnations? One might think that this is the case, since the generalized Zeno effect generalizes the ordinary Zeno effect, and at the limit of positive energy ontology one would effectively have a sequence of ordinary state function reductions leading to the second law.
TGD also predicts what I call the Negentropy Maximization Principle (NMP).
- Entanglement coefficients belong to an extension of rationals allowing interpretation as both real and p-adic numbers in the extensions of p-adics induced by the extension of rationals defining the adele.
One can assign ordinary entanglement entropy to the real sector of the adele and entanglement negentropy to the p-adic sectors of adelic physics: for the latter the analog of ordinary Shannon entropy is negative and thus an interpretation as conscious information is possible. The information is assigned with the pairing defined by entanglement whereas the entropy is associated with the loss of precise knowledge about the state of a particle in an entangled state.
- One can also consider as the negentropy the difference of the sum of the p-adic entanglement negentropies and the real entanglement entropy. This quantity can be positive for algebraic extensions of rationals and its maximal value increases with the complexity of the extension and with the p-adic prime.
Also the information defined in this manner would increase during evolution, assignable to the gradual increase of the dimension of the algebraic extension of rationals, which can take place in "large" state function reductions (re-incarnations of the self): if the eigenvalues of the density matrix are algebraic numbers in an extension of the extension of rationals, the "large" state function reduction must take place.
- NMP would hold true in a statistical sense - being mathematically very much analogous to the second law - and would relate to evolution. In particular, one can understand why the emergence of intelligent systems is - rather paradoxically - accompanied by the generation of entropy: to have large entanglement negentropy in the p-adic sectors one must have large entanglement entropy in the real sector, since the same entanglement defines both. The toy calculation below illustrates this.
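The following minimal sketch is only my illustration: it assumes that the number-theoretic entropy can be written as Sp = -Σ pn log(|pn|p) with |.|p the p-adic norm, and the choices p = 3, k = 4 are arbitrary. It compares the real and p-adic entanglement entropies for a maximally entangled state with p^k equal entanglement probabilities.

from fractions import Fraction
from math import log

def padic_norm(x, p):
    # p-adic norm |x|_p = p^(-v), where v is the power of p dividing x
    if x == 0:
        return Fraction(0)
    v = 0
    num, den = x.numerator, x.denominator
    while num % p == 0:
        num //= p
        v += 1
    while den % p == 0:
        den //= p
        v -= 1
    return Fraction(p) ** (-v)

def shannon_entropy(probs):
    # ordinary real entanglement entropy: -sum p_n log(p_n)
    return -sum(float(q) * log(float(q)) for q in probs)

def padic_entropy(probs, p):
    # number-theoretic entropy: -sum p_n log(|p_n|_p); can be negative
    return -sum(float(q) * log(float(padic_norm(q, p))) for q in probs)

p, k = 3, 4                                   # arbitrary prime and power
probs = [Fraction(1, p**k)] * (p**k)          # maximally entangled state
print("real entropy  :", shannon_entropy(probs))    # ~ +k*log(p)
print("p-adic entropy:", padic_entropy(probs, p))   # ~ -k*log(p)

For this maximally entangled case the real entropy equals k log(p) and the p-adic entropy equals -k log(p), so the p-adic negentropy and the real entropy have the same magnitude: the same entanglement indeed carries both.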
The hierarchy of Planck constants heff/h=n is a further key notion in TGD inspired quantum biology.
- The hierarchy of Planck constants heff/h=n implied by adelic physics as physics of both sensory experience (real numbers) and cognition (p-adic number fields) is basic prediction of TGD (see this). Planck constant characterizes the dimension of the algebraic extension of rationals characterizing the cognitive representations, and is bound to increase since the number of extensions with dimension larger than given dimension is infinite whereas those with smaller dimension is finite.
- The ability to generate negentropy increases during evolution. System need not however generate negentropy and can even reduce it. In statistical sense negentropic resources however increase: things get better in the long run. In biology metabolic energy feed brings to system molecules having valence bonds with heff/h=n larger than that for atoms (see this), and this increases the ability of the system to generate negentropy and in statistical sense this leads to the increase of negentropy.
For a summary of earlier postings see Latest progress in TGD.