First, I want to drop from consideration what looks to me like nonsense.
- The claim that our senses are only electrical signals that the brain encodes and that we are just biocomputers. This is the first fatal mistake, which, thank God, we are getting rid of as theories of consciousness develop.
- The assumption that we are a simulation is nonsensical and explains nothing, except perhaps Vopson's second law of infodynamics, which will most probably turn out to be wrong. It does, however, create myriads of questions: for instance, who are the simulators and what physics do they obey?
- Vopson's second law of infodynamics is motivated by facts: the experience with computer codes and coronaviruses. The interpretation of these facts does not, however, require Vopson's second law. The system's goal is to have just enough information, only the most significant bits, to cope with its tasks. This is because the maintenance of information requires (metabolic) energy, which must be saved.
If an increase in information, understood as an increase in complexity, is accompanied by an increase in entropy, this would explain why entropy decreases as the genome develops and becomes simpler. Complexity, whose maintenance requires metabolic energy, is minimized together with this energy input, subject to the constraint that the genome still works. In computer science one speaks of compression.
Above I have, however, made some assumptions: that an increase of conscious information means an increase of complexity accompanied by an increase of entropy, and that metabolic energy is needed to preserve complexity. How does TGD justify these assumptions, and can it explain the findings?
- One must explain why an increase of conscious information results in an increase of entropy. One must of course first define what conscious information is, and this cannot be done without a theory of consciousness, cognition, and quantum biology. In TGD, number theory is an integral part of the physics of cognition.
One of the first results of the TGD-based theory of cognition was the concept of p-adic entropy as a generalization of Shannon entropy. p-Adic entropies can be interpreted as measures of both algebraic complexity and the amount of conscious information, a kind of universal IQ.
Conscious information, defined as the sum of p-adic entropies, turns out to be greater than the ordinary entropy, even though the two quantities are strongly correlated. This information tends to grow in number-theoretic evolution, and its increase also results in an increase of entropy. Jeremy England has postulated the same on the basis of empirical findings, but in TGD it is predicted.
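The generalization mentioned above can be made concrete with a small numerical sketch. This is my own illustration, assuming the standard construction in which log(p_n) in the Shannon formula is replaced by log(|p_n|_p), with |.|_p the p-adic norm, so that the entropy can become negative and act as a measure of information; sign conventions vary in the literature, and all function names here are illustrative.

```python
from fractions import Fraction
from math import log

def p_adic_valuation(n: int, p: int) -> int:
    """Largest k such that p**k divides n (n must be nonzero)."""
    k = 0
    while n % p == 0:
        n //= p
        k += 1
    return k

def p_adic_norm(q: Fraction, p: int) -> Fraction:
    """p-adic norm |q|_p = p**(-v_p(q)) of a nonzero rational q."""
    v = p_adic_valuation(q.numerator, p) - p_adic_valuation(q.denominator, p)
    return Fraction(1, p) ** v

def shannon_entropy(probs) -> float:
    """Ordinary Shannon entropy in nats."""
    return -sum(float(q) * log(float(q)) for q in probs)

def p_adic_entropy(probs, p: int) -> float:
    """S_p = -sum_n p_n log(|p_n|_p); can be negative (negentropy)."""
    return -sum(float(q) * log(float(p_adic_norm(q, p))) for q in probs)

# Rational probabilities whose denominators are powers of p = 2:
probs = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 4)]
print(shannon_entropy(probs))    # ~1.04 nats
print(p_adic_entropy(probs, 2))  # negative: carries information
```

For probabilities that are negative powers of p, the p-adic entropy is exactly the negative of the Shannon entropy, which illustrates how the same distribution can look entropic in the real sense and information-carrying in the p-adic sense.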
- One must also understand why metabolic energy is necessary for the growth of information/complexity. Here, too, number-theoretic physics is needed. It predicts a hierarchy of effective Planck constants heff corresponding to the hierarchy of extensions of rational numbers. heff is proportional to the dimension of the extension and serves as a measure of its algebraic complexity. The deviation of heff from h means that the phase of ordinary matter in question behaves like dark matter, which would be an essential part of biosystems and would control ordinary biomatter.
- The increase of heff requires energy, i.e. metabolic energy. heff tends to decrease spontaneously, so that the system remains complex/intelligent/aware only if it receives metabolic energy continually.
The minimization of the metabolic energy feed forces the system to eliminate unnecessary complexity and to represent just the most significant bits, and this explains the findings of Vopson and others.
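The "only the most significant bits" idea can be illustrated with an ordinary-computing analogy (my own sketch, not part of the TGD or Vopson arguments): truncating each sample to its k most significant bits preserves the coarse structure of the data while shrinking the information that must be maintained from 8 to k bits per sample.

```python
# Illustrative analogy: keep only the k most significant bits of each
# 8-bit sample, discarding the low-order detail.

def keep_msb(value: int, k: int, width: int = 8) -> int:
    """Zero out all but the k most significant of `width` bits."""
    mask = ((1 << k) - 1) << (width - k)
    return value & mask

samples = [13, 200, 97, 255, 64]
compressed = [keep_msb(s, 3) for s in samples]
print(compressed)  # [0, 192, 96, 224, 64]
```

This is lossy compression in the computer-science sense: the constraint is that the truncated representation still suffices for the system's tasks, just as the genome is argued to retain only the complexity it needs.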
For a summary of earlier postings see Latest progress in TGD.
For the lists of articles (most of them published in journals founded by Huping Hu) and books about TGD see this.