**Stephen P. King** (*stephenk1@home.com*)

*Thu, 27 May 1999 15:07:28 -0400*

**Next message:** Stephen P. King: "[time 363] Re: [time 361] What is Information?(errata)"
**Previous message:** Hitoshi Kitada: "[time 361] Re: [time 357] Re: [time 356] What is Information?"
**In reply to:** Matti Pitkanen: "[time 356] Re: [time 355] What is Information?"

Dear Hitoshi,

Hitoshi Kitada wrote:

> Dear Stephen,
>
> I feel by your postings that you seem to try to define the subjectivity by
> the degree of resonance to it. Or at least it seems to me that you need to
> do so if you would proceed in the direction that information could play some
> definite roles in your thought.
>
> Maybe you could define a subject as the existence that can have some
> resonance from others or other observers; thus subjectivity becomes a
> relative notion.

Umm, I had not thought of it in those words! :) Yes, I believe that is
true. I am trying to model subjectivity such that we can deal with many
observers, each having their own subjectivity... Resonance then, to me,
relates to the fuzzy mutual entropy (using Kosko's term) between
observers (http://www.phptr.com/ptrbooks/esm_0131249916.html,
http://www.ercb.com/ddj/1998/ddj.9805.html). Ironically, the frequentist
type of probability theory
(http://www.odu.edu/~fernand/e520sp99/second/index.htm) is derivable,
claims Kosko, from simpler postulates...

Quoting from http://www.ercb.com/ddj/1998/ddj.9805.html:

"Chapter 12, entitled 'Fuzzy Cubes And Fuzzy Mutual Entropy,' is a tour
de force. Given that a single fuzzy variable can be mapped continuously
over the [(0,0),(1,0)] line, then two fuzzy variables map to the unit
square [(0,0),(1,0),(1,1),(0,1)]; three map to the unit cube; and n
variables map to the n-dimensional hypercube. The information state
within the fuzzy cube is continuous, with the center paradox point being
the fuzziest of the fuzzy. Kosko deduces from the presence of the
divergence operator in the information field equations that fuzzy
computations can be thought of as information fluid calculations."

I have this book and would like to discuss it. I understand Ben's
distaste for fuzzy sets, and I do understand the distributivity issue...
I am just trying to explore all possible formalisms. :)
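The geometry in the quoted passage can be sketched numerically: a point in the unit n-cube is a fuzzy set of n fit values, and Kosko's standard fuzzy-entropy ratio measures how fuzzy that point is, peaking at the cube's midpoint. This is a minimal sketch of that ratio only (the function name is mine), not of the mutual-entropy field equations the review goes on to describe:

```python
# Kosko's fuzzy entropy E(A) = M(A AND A^c) / M(A OR A^c), where M is the
# sigma-count (sum of fit values), AND/OR are elementwise min/max, and the
# complement A^c has fit values 1 - a_i. A point in the unit n-cube is a
# fuzzy set over n elements.

def fuzzy_entropy(fits):
    comp = [1.0 - a for a in fits]
    overlap = sum(min(a, c) for a, c in zip(fits, comp))   # M(A AND A^c)
    underlap = sum(max(a, c) for a, c in zip(fits, comp))  # M(A OR A^c)
    return overlap / underlap if underlap else 0.0

# The cube midpoint is "the fuzziest of the fuzzy": E = 1.
print(fuzzy_entropy([0.5, 0.5, 0.5]))  # → 1.0
# A vertex of the cube is an ordinary crisp set: E = 0.
print(fuzzy_entropy([1.0, 0.0, 1.0]))  # → 0.0
```

Every interior point scores strictly between these two extremes, which is the sense in which the information state "within the fuzzy cube is continuous."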

> Matti's "absolute information," if existed, might be a one that cannot be
> reduced to other things like the axioms and undefined terms in mathematical
> theory. We, the older aged people tend to think ourselves as the absolute
> criterion to judge the validity of a theory. You young people seem
> different. I feel this at classrooms when I am giving lectures to young
> students and have feedbacks from them. They seem to judge by majority vote.
> The truth seems not a one that should be judged by reason for them. Or it
> might be more correct to say that their concerns are in their community, not
> in the solitary truth. This generation difference might be the cause of the
> difficulty of our communication. In other words, the problems you are
> considering and the ones we are considering might be different. How do you
> think?

I find that "truth by majority vote" is more prevalent than suspected!
It is quite possible that what we humans call "truth" is merely what we
agree upon. As people age, I have noticed a tendency to become ossified
(http://www.znet.com/~normanl/philo2.htm) and to make the assumptions
that you state. :) It could be the result of habituation and becoming
comfortable in one's situation; we stop looking for contradictions. The
young tend to be iconoclasts
(http://ecco.bsee.swin.edu.au/text/ddict/ICONOCLAST.html).

I am in the midst of thinking about Matti's last post ([time 359]) and
find myself agreeing with him completely, just discussing a few
subtleties. I am just making the point that Absolute information implies
"infinite knowability" if it is to be used to determine "Absolute
Truths." I say: "I think this is due to the very nature of
'knowability', e.g. it is not possible to 'know' an infinite amount of
information..." I believe this connects directly to your method of
deriving the Uncertainty Principle using the limit of infinite time.
Bart Kosko's ideas in Chapter 12 of the book referenced above give
another way of thinking about this, thus my desire to discuss his work
further.

> Best wishes,
>
> Hitoshi
>
> PS I feel tired after today's lectures, which might let me write this :)

:) I am now a full-time babysitter for my children, so I can dedicate
more of my time to thinking and writing. :)

Kindest regards,

Stephen


*This archive was generated by hypermail 2.0b3 on Sun Oct 17 1999 - 22:10:32 JST*