**Matti Pitkanen** (*matpitka@pcu.helsinki.fi*)

*Fri, 4 Jun 1999 08:22:13 +0300 (EET DST)*


Here is the previous posting with some additions.

***************************************************

The previous discussions about the information concept inspired an attempt to associate a well-defined information measure with the configuration space spinor field as a property of a quantum history. This would make it possible to associate *genuine information gain with a quantum jump as the difference of the informations associated with the initial and final quantum histories*.

[Entanglement entropy would indeed measure the 'catchiness' of conscious experience rather than its information content.]

It seems that this is possible!

In accordance with intuitive expectations, the information contains an infinite part, which however does not depend on the state! Therefore it is possible to compare the information contents of different quantum histories. Most importantly, the information gain associated with conscious experience is well defined, since the infinite terms cancel each other in the difference!

The definition is based on Shannon information. Entanglement now plays no role. The definition works also in ordinary wave mechanics but has no obvious generalization to the quantum field theory context.

The argument goes as follows.

*******

Concept of configuration space spinor field

The configuration space spinor field is determined once its values on the lightcone boundary are fixed. Nondeterminism however implies that a given 3-surface on the lightcone boundary corresponds to several absolute minima. This forces a generalization of the concept of 3-surface. The space of 3-surfaces on the lightcone boundary is many-sheeted, like a Riemann surface, with the various sheets corresponding to the various absolute minima X^4(Y^3), fixed by choosing some minimal number of 3-surfaces from a particular absolute minimum: these association sequences provide a geometric representation for thoughts. What is essential is that everything reduces to the lightcone boundary, since the inner product for configuration space spinor fields can be expressed as an integral over the space of the 3-surfaces Y^3 belonging to lightcone boundary ×CP_2, plus a summation over the degenerate branches of X^4(X^3).

*******

How to measure the information associated with the configuration space spinor field?

The idea of selection and Shannon entropy works also here.

a) The probability that the 3-surface X^3 in a volume element dV of configuration space is selected is

dP = R*dV,

where R is the 'modulus squared' of the configuration space spinor field at X^3, which is essentially the norm of a state in the fermionic Fock space.

b) The information associated with the configuration space spinor field is just the negative of the Shannon entropy. Using the division into volume elements dV,

I = -SUM(X^3) dP log(dP) = -SUM(X^3) R*log(R)*dV - SUM(X^3) R*dV log(dV)

= -INT R*log(R) dV - log(dV).

Here INT denotes an integral. The first part gives a well-defined integral over configuration space. The second term is infinite but does not depend on the state!! This infinite term tells us that the information contained in the state is infinite, which is not at all surprising. One can however forget this infinite term, since it is information differences which matter, so that one can define

I == -INT R*log(R) dV.

This kind of formula of course applies also in the case of ordinary quantum mechanics. Perhaps one should call I the available information.

c) The degeneracy of the absolute minima brings in a summation over branches, but this is only a minor complication and can be included in the definition of the integral.
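The cancellation of the -log(dV) term can be illustrated numerically. The following is my own toy sketch, not part of the original argument: two normalized densities R on a one-dimensional 'configuration space' stand in for the modulus squared of the spinor field, and the binned information -SUM dP log(dP) differs from the regularized -INT R log(R) dV by the same state-independent constant -log(dV) for both states.

```python
import numpy as np

# Grid on [0, 1) with volume element dV
N = 10_000
dV = 1.0 / N
x = (np.arange(N) + 0.5) * dV

# Two normalized toy densities R (stand-ins for the 'modulus squared'
# of the configuration space spinor field; purely illustrative)
R1 = np.exp(-((x - 0.5) / 0.05) ** 2)
R1 /= R1.sum() * dV
R2 = np.exp(-((x - 0.5) / 0.20) ** 2)
R2 /= R2.sum() * dV

def binned_info(R):
    """I = -SUM dP log(dP) with dP = R*dV; diverges like -log(dV)."""
    dP = R * dV
    dP = dP[dP > 0]
    return -np.sum(dP * np.log(dP))

def regularized_info(R):
    """I == -INT R log(R) dV, the finite 'available information'."""
    mask = R > 0
    return -np.sum(R[mask] * np.log(R[mask])) * dV

# Both states pick up the same (in the continuum limit infinite)
# constant -log(dV), so it cancels in any information difference
for R in (R1, R2):
    assert abs(binned_info(R) - regularized_info(R) + np.log(dV)) < 1e-6
```

Refining the grid makes -log(dV) grow without bound, while the regularized informations and their difference stay finite, exactly as claimed above.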

******

Connection with the concept of cognitive resources

One can decompose the configuration space spinor field as

Psi = exp(-K/2) f,

where K is the Kaehler function. This makes it possible to express the information in the form

I == <K> - <log|f|^2>,

where the first term is the expectation value of the Kaehler function.

What is remarkable is that the first term is a direct generalization of the purely classical hypothesis that the Kaehler function gives an information-type measure for the cognitive resources of the 3-surface, as measured by the number of degenerate absolute minima, proportional to exp(K_cr), where K_cr is the Kaehler function at quantum criticality. This suggests that the 'ontogeny repeats phylogeny' principle is at work also here, in the sense that

the vacuum expectation of the classical measure for cognitive resources equals the quantal information of the vacuum state (apart from an infinite state-independent term).
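The identity I == <K> - <log|f|^2> is pure algebra: for R = exp(-K)|f|^2 one has log(R) = -K + log|f|^2, so -R log(R) = R*K - R*log|f|^2. A quick check on toy data (random real K and complex f are purely illustrative stand-ins of my own, not TGD quantities):

```python
import numpy as np

rng = np.random.default_rng(0)
N, dV = 1000, 1e-3

# Toy stand-ins: K a real 'Kaehler function', f a complex amplitude,
# with Psi = exp(-K/2) f and R = |Psi|^2 normalized so SUM R dV = 1
K = rng.normal(size=N)
f = rng.normal(size=N) + 1j * rng.normal(size=N)
R = np.exp(-K) * np.abs(f) ** 2
f /= np.sqrt(np.sum(R * dV))                          # renormalize the state
R = np.exp(-K) * np.abs(f) ** 2

I = -np.sum(R * np.log(R)) * dV                       # -INT R log(R) dV
K_avg = np.sum(R * K) * dV                            # <K>
logf2_avg = np.sum(R * np.log(np.abs(f) ** 2)) * dV   # <log|f|^2>
assert abs(I - (K_avg - logf2_avg)) < 1e-9
```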

*******

Properties of the information and information gain

The information defined in the proposed manner is not positive definite. This follows from the dropping of the infinite background contribution guaranteeing positivity. In the limit where the configuration space spinor field is localized in an infinitely small volume, the information becomes negative and infinite, whereas in the limit where the configuration space spinor field is totally delocalized, I becomes positive and infinite. The interpretation is obvious: a completely localized configuration space spinor field does not carry (potential) information, whereas a delocalized field carries a lot of information.
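Both limits are visible already in one spatial dimension: for a Gaussian density of width sigma the available information is log(sigma*sqrt(2*pi*e)), which tends to minus infinity as sigma -> 0 (localization) and to plus infinity as sigma -> infinity (delocalization). A toy numerical check of my own:

```python
import numpy as np

def gaussian_available_info(sigma, N=100_000):
    """Numerical I = -INT R log(R) dx for a Gaussian density of width sigma."""
    x = np.linspace(-12 * sigma, 12 * sigma, N)  # tails beyond 12 sigma are negligible
    dx = x[1] - x[0]
    R = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return -np.sum(R * np.log(R)) * dx

# Localization drives I toward large negative values,
# delocalization toward large positive ones
for sigma in (0.01, 1.0, 100.0):
    exact = np.log(sigma * np.sqrt(2 * np.pi * np.e))
    assert abs(gaussian_available_info(sigma) - exact) < 1e-4
```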

Each quantum jump is preceded by the action of the 'time development' operator U_a on the initial quantum history. This means dispersion in the reduced configuration space, so that the information increases. The final state results from a quantum jump involving localization to some sector of configuration space. This obviously means a reduction of information, and the interpretation is that the difference

Delta I = I_i - I_f

of the informations associated with the initial and final states is the *information content of conscious experience* (which in general decomposes into separate sub-experiences). What is nice is that the ill-defined log(dV) term automatically disappears from I_i - I_f. This is quite sensible: it is the conscious information gain which matters, and this must be well defined and finite(?).

One can of course argue that I is actually entropy rather than information. On the other hand, the larger I is, the larger the potential information gain in a quantum jump leading to localization in configuration space. Therefore one can say that entropy is a necessary prerequisite for information gain and could as well be regarded as (potential) information. Only a sinner can have a moment of mercy! (;-)

*******

What happens in the case of wave mechanics and QFT?

The definition of the information concept works also in the case of wave mechanics. What is remarkable is that the dispersion associated with Schrodinger time evolution increases the information (the potential information gain of a quantum jump), while the information associated with the density matrix of a pure state is constant. For instance, the information for harmonic oscillator states or states of the hydrogen atom increases when the energy increases, since the states become increasingly delocalized. One can generalize the definition also to the case of many-particle wave mechanics by replacing the 3-dimensional configuration space with a 3N-dimensional configuration space.
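The harmonic oscillator claim can be checked directly: the available information -INT |psi_n|^2 log|psi_n|^2 dx grows with the quantum number n. A numerical sketch of my own (oscillator units with m = omega = hbar = 1; physicists' Hermite polynomials):

```python
import numpy as np
from numpy.polynomial.hermite import hermval
from math import factorial, pi, sqrt

def oscillator_info(n, L=12.0, N=200_001):
    """Available information -INT |psi_n|^2 log|psi_n|^2 dx for oscillator state n."""
    x = np.linspace(-L, L, N)
    dx = x[1] - x[0]
    c = np.zeros(n + 1)
    c[n] = 1.0                                     # select the Hermite polynomial H_n
    psi = hermval(x, c) * np.exp(-x**2 / 2) / sqrt(2**n * factorial(n) * sqrt(pi))
    R = psi**2
    mask = R > 0                                   # skip nodes, where R log(R) -> 0 anyway
    return -np.sum(R[mask] * np.log(R[mask])) * dx

# The information grows with the energy, i.e. with increasing delocalization
infos = [oscillator_info(n) for n in range(4)]
assert all(b > a for a, b in zip(infos, infos[1:]))
```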

In quantum field theory the situation seems to be different, since it is not possible to interpret time evolution in any kind of configuration space. In TGD the situation is saved by the fact that configuration space spinor fields are infinite-dimensional classical spinor fields, so that one can regard the states of the universe as the states of a single gigantic classical fermion.

********

p-Adicization

The logarithm of R is problematic in the real context, and one can quite well wonder whether the integral is well defined. p-Adicization implies some modifications (restriction to a definite sector of configuration space and replacement of the logarithm with its p-adic counterpart Log_p(R), which is integer valued and determined by the p-adic norm of R). Hence one obtains the extremely simple formula

I = INT R(X^3) n(X^3) dV = <n> = SUM(n) p_n n

expressing the information as the p-adic expectation value of n = Log_p(R). Here p_n is the probability that Log_p(R) equals n.
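One reading of Log_p(R), my interpretation of "determined by the p-adic norm of R" rather than anything spelled out above, is the p-adic valuation: Log_p(R) = n when |R|_p = p^(-n). For rational R this integer is easy to compute, and I = SUM(n) p_n n is then an ordinary expectation value over its values:

```python
from fractions import Fraction

def Log_p(q: Fraction, p: int) -> int:
    """p-adic valuation n with |q|_p = p**(-n); a stand-in for Log_p(R)."""
    if q == 0:
        raise ValueError("Log_p(0) is undefined (infinite valuation)")
    n = 0
    num, den = abs(q.numerator), q.denominator
    while num % p == 0:        # powers of p in the numerator raise n
        num //= p
        n += 1
    while den % p == 0:        # powers of p in the denominator lower n
        den //= p
        n -= 1
    return n

# Toy expectation I = SUM(n) p_n n over illustrative values of R and probabilities
values = [Log_p(Fraction(12), 2), Log_p(Fraction(3, 8), 2), Log_p(Fraction(5), 2)]
probs = [0.5, 0.25, 0.25]
I = sum(p_n * n for p_n, n in zip(probs, values))
```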

In the p-adic context the information gain in a quantum jump must be defined as the difference of the *real counterparts of the p-adic informations* for the initial and final quantum histories. For the state U_a |Psi_i> preceding the quantum jump, p-adic sectors with all values of p give their contribution to the information, so this is indeed the only sensible possibility.

*******

Connection with p-adic thermodynamics and the generalized Hawking formula

A good guess is that the huge complexity of the infinite-dimensional situation implies that the probabilities p_n can be calculated from p-adic thermodynamics and are hence of the form

p_n = g(n) p^(n/T_p),

where p^(n/T_p) is the counterpart of the Boltzmann weight exp(-E/T), 1/T_p is the integer-valued inverse p-adic temperature, and g(n) is the degeneracy of the states having 'energy' Log_p(R) = n.
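As a real-number caricature of my own (the actual construction is p-adic; here the weight p^(n/T_p) is replaced by its real counterpart p^(-n/T_p), i.e. its p-adic norm, and trivial degeneracies g(n) = 1 are assumed), the probabilities and the expectation <n> look as follows:

```python
import numpy as np

p, inv_T = 2, 1                       # prime and integer-valued 1/T_p (illustrative choices)
n = np.arange(30)                     # cutoff at n = 30; the geometric tail is negligible
g = np.ones_like(n, dtype=float)      # trivial degeneracies g(n)
w = g * float(p) ** (-n * inv_T)      # real counterpart of the Boltzmann weight p^(n/T_p)
p_n = w / w.sum()                     # normalized probabilities
mean_n = np.sum(p_n * n)              # <n>, the information I of the main text
```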

In the p-adic thermodynamics determining the values of the particle mass squared, an exactly similar formula results, with the particle mass squared as the analog of thermal energy. The elementary particle black hole analogy leads to the hypothesis that the Hawking-Bekenstein formula, stating the proportionality of the particle mass squared and the p-adic entropy of the particle, generalizes! This in turn leads to an intuitive justification for the p-adic length scale hypothesis, stating that p-adic primes near prime powers of two are the physically most interesting ones: geometrically the hypothesis means that the radii of elementary particle horizons are themselves p-adic length scales. Thus it seems that the generalization of the Hawking-Bekenstein formula might derive from the proposed definition of information for a quantum state.

********

To sum up, the prospects seem good! Of course, it will take a week or two to think all the details through, but I think I can already express my gratitude to Stephen for very stimulating discussions!

Best,

Matti Pitkanen


*This archive was generated by hypermail 2.0b3 on Sat Oct 16 1999 - 00:36:04 JST*