Wednesday, January 26, 2022

Quantum Statistical Brain

The following considerations were inspired by a popular article (see this) about the findings (see this) of Li et al, which support the view that neural noise carries information: it represents the uncertainty of visual short term memories, so that both the content of a memory and its uncertainty are represented. Thanks to Jouko Alanko for the link.

Does neural noise carry information about the uncertainty of visual short term memories?

The highlights of Li et al are the following:

  • Humans know the uncertainty of their working memory and use it to make decisions.
  • The content and the uncertainty of working memory can be decoded from so-called BOLD signals.
  • Decoding errors predict memory errors at the single-trial level.
  • Decoded uncertainty correlates with behavioral reports of working memory uncertainty.
It is not surprising that the states of feature detector neurons should obey a statistical distribution. It is, however, not obvious that the reliability of the memory should correlate with the width of this distribution, nor that even the subjective estimate of the reliability should reflect this width.

Does the distribution in the feature space reflect quantum non-determinism?

Could the distribution in the feature space reflect quantum non-determinism rather than the uncertainty of sensory perceptions, and somehow also the uncertainty of memories?

  1. If features, as states of feature detector neurons or groups of them, correspond to the outcomes of quantum measurements, they have a probability distribution. The real input to these neurons would have produced this distribution, and it could be estimated from the probability distribution.

    The outcomes are eigenstates of the density matrix, which is determined by the entanglement, and they are determined only up to phase factors. For instance, in the measurement of the spin of a spin-1/2 particle, the probabilities of the spin 1/2 and spin -1/2 states can be deduced from an ensemble of identically prepared particles, but the relative phase of the spin 1/2 and spin -1/2 components cannot (a numerical sketch below illustrates this).

  2. The interpretation of quantum measurement would differ from the classical one. Classically, and according to recent neuroscience, sensory perception means that the brain, system A, detects the state of a system B in the external world. Quantum mechanically, the entanglement between A and B is reduced in the measurement, and the entangled state becomes a tensor product of eigenstates of the density matrix. What is "measured" is the relationship between A and B. For an ensemble of outcomes, the probabilities of the outcomes allow one to deduce information about the entanglement before the measurement.
  3. If the reduction of the entanglement between the sensory organ and the external world can be repeated, it gives rise to a distribution of outcomes coding also for the uncertainty caused by the quantum measurement. This, however, requires that the entanglement is regenerated between these measurements. Is this possible?
The distribution of features would then reflect not the uncertainty of memories but the non-determinism of the outcome in the reduction of entanglement. Interestingly, this kind of ensemble is produced in quantum computation: from the distribution of outcomes of the measurement halting the quantum computation, the result of the computation is deduced. The method is essentially statistical.
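
The following is a minimal numerical sketch of this point, using only textbook quantum mechanics (the state, phase, and sample size are arbitrary choices, and nothing TGD-specific is assumed): repeated spin measurements on an ensemble recover the outcome probabilities but not the relative phase, and the readout is statistical in the same sense as the halting measurement of a quantum computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Spin-1/2 state a|up> + b|down> with relative phase phi (arbitrary choices).
a, phi = np.sqrt(0.7), 1.2                 # |a|^2 = 0.7
b = np.sqrt(1 - a**2) * np.exp(1j * phi)
psi = np.array([a, b])

# Repeated S_z measurements on an ensemble of identically prepared spins.
p_up = abs(psi[0])**2
outcomes = rng.random(10_000) < p_up       # True = spin +1/2

# The outcome frequencies estimate the probabilities |a|^2 and |b|^2 ...
print("estimated P(+1/2):", outcomes.mean())   # ~ 0.7

# ... but the ensemble is described by a diagonal density matrix:
# the relative phase phi has been erased and cannot be deduced.
rho = np.diag([p_up, 1 - p_up])
print(rho)
```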

In the TGD framework, the magnetic body (MB), which uses the biological body as a sensory receptor and motor instrument, emerges as a new notion. The entanglement between the magnetic body and the sensory organs could be reduced in sensory perception. There is a hierarchy of levels, with entanglement at each of them, and state function reduction (SFR) is replaced with a cascade of SFRs proceeding from long to short scales.

Is the feature distribution realized as a temporal ensemble?

In sensory perception, the distribution of features should correspond to a distribution of states of feature detector neurons or their groups. How is this distribution realized? How does this distribution relate to the distribution of memories?

Let us consider the questions about sensory perceptions.

  1. The neuroscience-based answer to the question in the case of sensory perceptions would be "As a spatial ensemble consisting of feature neurons". But how does this distribution relate to the distribution of memories?
  2. In the TGD framework, the answer would be "As a temporal ensemble". Zero energy ontology (ZEO) leads to a new view of quantum states as superpositions of deterministic time evolutions and modifies the view of quantum measurement, allowing one to circumvent the basic paradox of quantum measurement theory, which has led to the various interpretations.

    The outcome is the notion of the 4-D brain, which suggests a temporal ensemble formed by memory mental images of the feature. In ZEO, sequences of "small" state function reductions (SSFRs), counterparts of so-called weak measurements, would form temporal ensembles of memory mental images, so that the connection with short term memory would be direct. The spatial ensemble would be replaced by a temporal ensemble experienced consciously as memories.

TGD based view about sensations and short term memories

To develop a more detailed model based on the proposed ideas, one must answer several questions in the TGD framework. What are sensory experiences, perceptions, and features in the TGD Universe? What could the phrase "statistical ensemble of features" mean? What do sensory perception as a quantum measurement, and the quantum measurement itself, correspond to?

The notions of sensation, perception, and feature

Sensation, as the core of sensory experience, must be distinguished from perception. Sensation is just sensory awareness with nothing added. Perception involves a cognitive representation providing an interpretation of the sensory input, and consists of objects and of the associations and memories associated with them.

The brain is believed to analyze the sensory input from the sensory organs into features. Features are just those aspects of the input that are relevant to survival or are the target of attention. Neurons serve as feature detectors (see this).

This deconstruction process is followed by reconstruction, which proceeds upwards from features to objects of the perceptive field, so that the perceptive field decomposes into standardized mental images representing objects with various attributes, orientation and motion being examples of such attributes. This is basically pattern recognition. Features are the basic building bricks of the sensory mental images and are not necessarily conscious to us.

The reconstruction process is analogous to first drawing a simple sketch consisting of lines and then gradually filling in the picture by adding colors with varying intensities. Something analogous happens when the sound-scape of a movie is constructed: one starts from the actual sound-scape, but the outcome is quite different and very far from the original. One could say that sensory perception is essentially an artwork.

In mathematical modeling, one can speak of a feature space. Features have attributes, and the claim of the article discussed is that one can assign a probability distribution to features. The brain would not only build features but also represent this probability distribution, making it possible to estimate the reliability of the visual short term memory. It is, however, not clear how the distribution gives rise to a conscious experience of reliability, and how short term memory relates to sensory perception.
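
To make the notion of a feature-space distribution concrete, here is a hedged toy sketch. It is not the decoding method of Li et al, who decode orientation and its uncertainty from BOLD voxel patterns; all tuning-curve parameters below are invented for illustration. A population of orientation-tuned neurons with Poisson noise supports a Bayesian decode whose posterior width serves as an estimate of the uncertainty.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy population of orientation-tuned neurons (von Mises tuning curves).
n_neurons = 32
prefs = np.linspace(0, 2 * np.pi, n_neurons, endpoint=False)  # preferred orientations
gain, kappa = 20.0, 2.0                                       # invented parameters

def rates(theta):
    """Mean firing rates of all neurons for orientation theta."""
    return gain * np.exp(kappa * (np.cos(theta - prefs) - 1))

# One noisy "trial": Poisson spike counts for a true orientation.
theta_true = 1.0
counts = rng.poisson(rates(theta_true))

# Bayesian decode: posterior over orientation on a grid (flat prior,
# Poisson log-likelihood up to a constant).
grid = np.linspace(0, 2 * np.pi, 360, endpoint=False)
log_post = np.array([np.sum(counts * np.log(rates(t)) - rates(t)) for t in grid])
post = np.exp(log_post - log_post.max())
post /= post.sum()

# Circular mean = decoded feature; circular spread = decoded uncertainty.
z = np.sum(post * np.exp(1j * grid))
print("decoded orientation:", np.angle(z) % (2 * np.pi))   # ~ theta_true
print("decoded uncertainty (1 - |z|):", 1 - abs(z))
```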

Ensemble of features as temporal ensemble of memory mental images?

The probability distribution for features should be realized somehow as a statistical ensemble. One can consider two alternative options.

  1. In the standard physics framework, a spatial ensemble seems to be the only possible realization. The perception would be represented as a large number of copies. The fact that the inputs in the retina are mapped in a topographic manner to various parts of the visual cortex poses strong constraints on the number and location of the copies. If there is a spatial ensemble, its neurons should form groups of nearby neurons. The problem is how the distribution of features in this ensemble can code for the reliability of sensory or memory mental images; answering this requires a theory of consciousness.
  2. In the TGD framework, the brain is 4-D, and it makes sense to speak of a temporal ensemble of memory mental images. These temporal ensembles would correspond to temporal sequences of memory mental images, and the distribution aspect would be automatically realized. The variance of this distribution would provide a conscious experience of the reliability of the mental images. The natural interpretation would be in terms of short term memory (a toy comparison of the two options is sketched below).
For the TGD option, the sensory input to the sensory organ, say the retina, would generate a temporal ensemble of visual mental images making short term memory possible. This ensemble would be characterized by a probability distribution. The probability distribution for the states of feature neurons would be a neuronal-level example of this kind of distribution. The variance would be one characteristic of this distribution and would characterize the reliability of the short term memory. Sensory perceptions would give rise to short term memories.
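
The following toy comparison is purely illustrative: the Gaussian feature distribution and the AR(1) model of temporal sampling are my assumptions, not part of either proposal. It shows that a spatial ensemble read out at one instant and a temporal ensemble sampled over successive moments can realize the same distribution, and hence the same variance-based reliability estimate.

```python
import numpy as np

rng = np.random.default_rng(2)

mu, sigma = 1.0, 0.3   # remembered feature value and its spread (assumed)
n = 5000

# Option 1: spatial ensemble -- many feature neurons read out at one instant.
spatial = rng.normal(mu, sigma, size=n)

# Option 2: temporal ensemble -- one system sampled over successive moments,
# modeled here as an AR(1) chain with the same stationary distribution
# (in TGD the samples would be memory mental images generated by SSFRs).
a = 0.8
temporal = np.empty(n)
temporal[0] = mu
for t in range(1, n):
    temporal[t] = mu + a * (temporal[t - 1] - mu) \
        + rng.normal(0.0, sigma * np.sqrt(1 - a**2))

# Both ensembles yield the same feature estimate and the same
# variance-based reliability estimate.
for name, sample in (("spatial", spatial), ("temporal", temporal)):
    print(name, "mean:", sample.mean().round(3),
          "variance:", sample.var().round(3))
```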

Many questions remain to be answered. How are these memory mental images generated in quantum measurements? How does the memory recall of long term memory generate a short term memory represented as a temporal ensemble of visual mental images?

  1. For instance, in the recall of a phone number, long term memory is involved. Somehow the memory recall creates an "almost" sensory, that is virtual, perception. This suggests that a virtual sensory input from the MB is involved and creates a virtual sensory perception giving rise to a visual short term memory.
  2. In the TGD framework, these virtual sensory perceptions would also make imagination possible. The virtual sensory input would come from the MB to the cortex and proceed to the lower levels of the brain, but would not reach the sensory organs except during dreams, hallucinatory states, and sensory memories (the memory feats of idiot savants).
  3. The sensation associated with the sensory experience would correspond to an SFR occurring in the quantum measurement. But what does an SFR correspond to in TGD?

    In zero energy ontology (ZEO), the notion of SFR generalizes. There are two kinds of SFRs: "big" SFRs (BSFRs), analogs of ordinary quantum measurements in which a large change is possible, and "small" SFRs (SSFRs), analogs of so-called weak measurements, which are assumed in quantum optics but are not well-defined in standard quantum theory and do not appear in the textbooks.

    SSFRs relate closely to the Zeno effect, which states that the state of a physical system remains unaffected if the same measurement is repeated. In reality this is not quite true, and the sequence of SSFRs represents a generalization of repeated quantum measurement, allowing us to understand what really happens.

    Sensory perception would be a repetition of SSFRs, each following an analog of unitary time evolution, and would produce a temporal ensemble of sensory mental images giving rise to short term memory. The system would be measured, would return to almost its original state, and would be measured again. An SSFR is almost a classical measurement (a toy illustration follows below).
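
As a toy illustration of this "measure gently, return, measure again" logic, here is a standard Gaussian weak-measurement model from quantum optics, offered as a hedged sketch rather than a model of SSFRs themselves; the restoration of the state between measurements is the assumption made in the text, and the state and pointer width below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)

# Qubit a|0> + b|1> with <sigma_z> = |a|^2 - |b|^2 = 0.4.
psi0 = np.array([np.sqrt(0.7), np.sqrt(0.3)], dtype=complex)
s = 3.0   # pointer width: larger s = weaker (gentler) measurement

def weak_measure(psi):
    """One Gaussian weak measurement of sigma_z: (outcome, post-measurement state)."""
    branch = rng.random() < abs(psi[0])**2          # sample the pointer reading
    x = rng.normal(1.0 if branch else -1.0, s)
    # Back-action: Gaussian Kraus operators, then renormalize.
    psi = psi * np.exp(-np.array([(x - 1)**2, (x + 1)**2]) / (4 * s**2))
    return x, psi / np.linalg.norm(psi)

# A single weak measurement barely disturbs the state (Zeno-like).
x, psi1 = weak_measure(psi0)
print("one-step fidelity:", abs(np.vdot(psi0, psi1))**2)   # close to 1

# Temporal ensemble: repeat, assuming (as in the text) that the state is
# restored to (almost) its original form between measurements.
outcomes = [weak_measure(psi0)[0] for _ in range(5000)]
print("mean outcome:", np.mean(outcomes))   # ~ <sigma_z> = 0.4, noisy
```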

See the article Quantum Statistical Brain or the chapter with the same title.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD. 

