[FRIAM] Quote of the week

Grant Holland grant.holland.sf at gmail.com
Mon Jun 6 12:54:50 EDT 2011


Eric makes an important point: "...if it is already known, then the 
message is not information."

In Information theory, it's not about how much information YOU already 
HAVE, but about how much information THERE IS regarding a situation - 
inherent in the probability distribution that models the situation.

Anytime you hear someone talking about "obtaining more information in 
order to reduce uncertainty", you know they are talking about how much 
information THEY HAVE, and not how much information THERE IS inherent in 
the situation. They are not having the same conversation about 
information that the information theorists are.

Grant

On 6/6/11 10:08 AM, ERIC P. CHARLES wrote:
> Nick,
> The notion of information that Shannon proposes takes a very idealized 
> understanding of "communication." I think it is a good model for 
> machine "communication" and things like that (i.e., metaphorical 
> communication), but it will not make you very happy, what with your 
> feet-on-the-ground study of actual communication between organisms.  
> For example, as I understand Shannon's information theory, there must 
> be countless things transmitted from one organism to another that do 
> not count as information, but which nevertheless are 'sent' by one 
> organism and alter the behavior of the other. Also, we cannot have a 
> conversation over whether or not it is in the interests of the 
> organism to base its behavior on the information it receives from 
> other organisms, because 'information' has been defined as that on 
> which it is good to base behavior. Also, also, we cannot talk 
> about the transmission of information already known by the receiver, 
> because if it is already known, then the message is not information. 
> That is, if 1) we are flipping a coin, 2)  I see the coin land heads, 
> 3) you say 'heads', then your message contained no information.
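
To put that coin example in code (a rough sketch; the only point is that an
event the receiver already assigns probability 1 has zero surprisal):

    import math

    def surprisal(p):
        # Self-information, in bits, of an event with probability p.
        return math.log2(1.0 / p)

    # Receiver who has not yet seen the fair coin: Pr(heads) = 0.5.
    print(surprisal(0.5))  # 1.0 bit

    # Receiver who already watched it land heads: Pr(heads) = 1.
    print(surprisal(1.0))  # 0.0 bits -- the message carries no information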
>
> Eric
>
> P.S. Oddly, for the last point, I probably need to say that your 
> message contained no information 'about the coin.' In information 
> theory land they don't want to count it as information that your 
> saying 'heads' tells me that you, too, have seen the coin land 
> heads (i.e., they don't want to count the information it gives me 
> about you). If they counted that, then all messages would contain 
> information.
>
>
> On Mon, Jun 6, 2011 09:44 AM, *"Nicholas Thompson" 
> <nickthompson at earthlink.net>* wrote:
>
>     Grant,
>
>     This seems backwards to me, but I got properly thrashed for my
>     last few postings so I am putting my hat over the wall very
>     carefully here.
>
>     I thought ... I thought ... the information in a message was the
>     number of bits by which the arrival of the message decreased the
>     uncertainty of the receiver.  So, let’s say you are sitting
>     awaiting the result of a coin toss, and I am on the other end of
>     the line flipping the coin.  Before I say “heads” you have 1 bit
>     of uncertainty; afterwards, you have none.
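
That bookkeeping can be written out directly (a sketch, measuring the
receiver's uncertainty as entropy in bits before and after the call):

    import math

    def entropy(dist):
        # Receiver's uncertainty, in bits, over the outcome of the toss.
        return -sum(p * math.log2(p) for p in dist if p > 0)

    before = entropy([0.5, 0.5])   # awaiting the toss: 1 bit
    after = entropy([1.0, 0.0])    # after hearing "heads": 0 bits
    print(before - after)          # decrease in uncertainty: 1.0 bit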
>
>     The reason I am particularly nervous about saying this is that it,
>     of course, holds out the possibility of negative information. 
>      Some forms of communication, appeasement gestures in animals, for
>     instance, have the effect of increasing the range of behaviors
>     likely to occur in the receiver.  This would seem to correspond to
>     a negative value for the information calculation.
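
One way to make that worry concrete (a sketch with made-up numbers, reading
"information received" as the drop in the receiver's entropy): for a single
message that drop can indeed be negative, even though the expected drop over
all messages - the mutual information - is never negative in Shannon's
framework.

    import math

    def entropy(dist):
        # Receiver's uncertainty, in bits, over the sender's next behavior.
        return -sum(p * math.log2(p) for p in dist if p > 0)

    # Made-up numbers: before the appeasement gesture one behavior is
    # strongly expected; afterwards several look about equally likely.
    before = entropy([0.9, 0.05, 0.05])  # ~0.57 bits
    after = entropy([1/3, 1/3, 1/3])     # ~1.58 bits
    print(before - after)                # about -1.0 bit for this message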
>
>     Nick
>
>     *From:*friam-bounces at redfish.com
>     [mailto:friam-bounces at redfish.com] *On Behalf Of *Grant Holland
>     *Sent:* Sunday, June 05, 2011 11:07 PM
>     *To:* The Friday Morning Applied Complexity Coffee Group; Steve Smith
>     *Subject:* Re: [FRIAM] Quote of the week
>
>     Interesting note on "information" and "uncertainty"...
>
>     Information is Uncertainty. The two words are synonyms.
>
>     Shannon called it "uncertainty", contemporary Information theory
>     calls it "information".
>
>     It is often thought that the more information there is, the less
>     uncertainty. The opposite is the case.
>
>     In Information Theory (aka the mathematical theory of
>     communication), the degree of information I(E) - or uncertainty
>     U(E) - of an event E is measurable as a decreasing function of its
>     probability, as follows:
>
>     U(E) = I(E) = log( 1/Pr(E) ) = log(1) - log( Pr(E) ) = -log( Pr(E) ).
>
>     Considering I(E) as a random variable, Shannon's entropy is, in
>     fact, the first moment (or expectation) of I(E): Shannon entropy
>     = E[ I(E) ] = -sum over E of Pr(E) log( Pr(E) ).
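
A quick numerical check of both statements (a sketch using an arbitrary
three-outcome distribution):

    import math

    pr = {"a": 0.5, "b": 0.25, "c": 0.25}  # arbitrary example distribution

    def info(p):
        # I(E) = U(E) = log( 1/Pr(E) ) = -log( Pr(E) ), in bits.
        return -math.log2(p)

    # Shannon entropy as the first moment (expectation) of I(E):
    H = sum(p * info(p) for p in pr.values())
    print(H)  # 1.5 bits for this distribution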
>
>     Grant
>
>     On 6/5/2011 2:20 PM, Steve Smith wrote:
>
>         /"Philosophy is to physics as pornography is to sex. It's
>         cheaper, it's easier and some people seem to prefer it."/
>
>
>     Modern Physics is contained in Realism, which is contained in
>     Metaphysics, which is contained in all of Philosophy.
>
>     I'd be tempted to counter:
>
>     /"Physics is to Philosophy as the Missionary Position is to the
>     Kama Sutra"/
>
>
>     Physics also appeals to Phenomenology and Logic (the branch of
>     Philosophy where Mathematics is rooted), and what we can know
>     scientifically is constrained by Epistemology (the nature of
>     knowledge) and Phenomenology (the nature of conscious experience).
>
>     It might be fair to say that many (including many of us here) who
>     hold Physics up in some exalted position simply dismiss or choose
>     to ignore all the messy questions considered by  *the rest of*
>     philosophy.   Even if we think we have clear/simple answers to the
>     questions, I do not accept that the questions are not worthy of
>     the asking.
>
>     The underlying point of the referenced podcast is, in fact, that
>     Physics, or Science in general, might be rather myopic and limited
>     by its own viewpoint, by definition.
>
>     / "The more we know, the less we understand."/
>
>
>     Philosophy is about understanding; physics is about knowledge
>     first, and understanding only insomuch as it is a part of natural
>     philosophy.
>
>     Or at least this is how my understanding is structured around
>     these matters.
>
>     - Steve
>
>     On Sun, Jun 5, 2011 at 1:15 PM, Robert Holmes
>     <robert at holmesacosta.com> wrote:
>
>     From the BBC's science podcast "The Infinite Monkey Cage
>     <http://www.bbc.co.uk/podcasts/series/timc>":
>
>     "Philosophy is to physics as pornography is to sex. It's cheaper,
>     it's easier and some people seem to prefer it."
>
>     Not to be pedantic, but I suspect that s/he has conflated
>     "philosophy" with "new age", as much of science owes itself to
>     philosophy.
>
>     marcos
>
>
> Eric Charles
>
> Professional Student and
> Assistant Professor of Psychology
> Penn State University
> Altoona, PA 16601
>
>
>
> ============================================================
> FRIAM Applied Complexity Group listserv
> Meets Fridays 9a-11:30 at cafe at St. John's College
> lectures, archives, unsubscribe, maps at http://www.friam.org