[FRIAM] On old question

Roger Critchlow rec at elf.org
Wed Oct 24 07:26:02 EDT 2018


Mutual information turned up earlier this week in some articles about what
Google AI does next, which could be read as an attempt at organization, too.

https://www.zdnet.com/article/google-ponders-the-shortcomings-of-machine-learning/
This is the article that Google News pushed on me.  The shortcoming of
machine learning, in the article, is that it isn't general intelligence.
And it linked to the following papers.

First https://arxiv.org/abs/1806.01261 Relational inductive biases, deep
learning, and graph networks

> Artificial intelligence (AI) has undergone a renaissance recently, making
> major progress in key domains such as vision, language, control, and
> decision-making. This has been due, in part, to cheap data and cheap
> compute resources, which have fit the natural strengths of deep learning.
> However, many defining characteristics of human intelligence, which
> developed under much different pressures, remain out of reach for current
> approaches. In particular, generalizing beyond one's experiences--a
> hallmark of human intelligence from infancy--remains a formidable challenge
> for modern AI.

The proposed solution is to develop deep learning architectures over graph
structured data.
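
For concreteness, here is a toy numpy sketch of what one round of message
passing over a graph might look like -- my own illustration, not the paper's
actual GN-block API; every name and weight below is made up:

import numpy as np

def message_passing_step(node_feats, edges, W_msg, W_upd):
    # node_feats: (N, d) node features; edges: list of (src, dst) pairs
    N, d = node_feats.shape
    agg = np.zeros((N, d))
    for src, dst in edges:
        agg[dst] += node_feats[src] @ W_msg   # message from src, summed at dst
    return np.tanh(node_feats @ W_upd + agg)  # update each node with its aggregate

# toy path graph 0-1-2 with bidirectional edges
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
x1 = message_passing_step(x, edges, rng.normal(size=(4, 4)), rng.normal(size=(4, 4)))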

Then https://arxiv.org/abs/1809.10341 Deep Graph Infomax

> We present Deep Graph Infomax (DGI), a general approach for learning node
> representations within graph-structured data in an unsupervised manner. DGI
> relies on maximizing mutual information between patch representations and
> corresponding high-level summaries of graphs---both derived using
> established graph convolutional network architectures.

So mutual information between the whole and the part, the shared purpose as
it were, where the residual information of the part -- modulo the mutual
information with the whole -- would presumably be the 'function' of the
part.
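
As a toy version of that decomposition (my own sketch, nothing like DGI's
actual training objective): for a discrete part X and whole-summary Y, the
residual is just the conditional entropy H(X|Y) = H(X) - I(X;Y).

import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# made-up joint distribution over (part X, whole Y); rows = X, cols = Y
pxy = np.array([[0.25, 0.05],
                [0.05, 0.25],
                [0.20, 0.20]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

I_xy = entropy(px) + entropy(py) - entropy(pxy.ravel())  # mutual information I(X;Y)
residual = entropy(px) - I_xy                            # = H(X|Y), the part's leftover information
print(I_xy, residual)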

What struck me most forcefully about this is that deep learning models
themselves provide a huge dataset of graph structured data, with varying
degrees of generality and success over varying datasets of experience.
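
A crude sketch of what I mean (purely illustrative -- the "trained" weights
below are random stand-ins): any network's weights can be read straight off
as a weighted edge list, which is exactly the kind of graph data these
methods consume.

import numpy as np

rng = np.random.default_rng(1)
layers = [rng.normal(size=(4, 3)), rng.normal(size=(3, 2))]  # stand-in "trained" MLP weights

edges, offset = [], 0
for W in layers:
    n_in, n_out = W.shape
    for i in range(n_in):
        for j in range(n_out):
            edges.append((offset + i, offset + n_in + j, W[i, j]))  # (unit, unit, weight)
    offset += n_in

print(len(edges), "weighted edges")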

The oil and water never "really mix".  The shaking mechanically breaks them
into smaller and smaller droplets which intermingle, but they immediately
start reforming into their original layers.  If you add detergent (not
recommended for salad dressing), you can get the emulsified oil to disperse
into the water, and the two won't separate again.

-- rec --


On Wed, Oct 24, 2018 at 2:56 AM Marcus Daniels <marcus at snoutfarm.com> wrote:

> Nick,
>
>
>
> It sounds like you are describing mutual information.  This is ancient,
> but a nice overview of related topics:
>
>
>
>
> https://www.amazon.com/Information-Theory-Qualitative-Quantitative-Applications/dp/0803921322
>
>
>
> Marcus
>
>
>
> *From: *Friam <friam-bounces at redfish.com> on behalf of Nick Thompson <
> nickthompson at earthlink.net>
> *Reply-To: *The Friday Morning Applied Complexity Coffee Group <
> Friam at redfish.com>
> *Date: *Wednesday, October 24, 2018 at 12:22 AM
> *To: *Roger Critchlow <rec at elf.org>, Friam <Friam at redfish.com>
> *Subject: *[FRIAM] On old question
>
>
>
> Dear Roger, and anybody else who wants to play,
>
>
>
> While waiting for my paper, *Signs and Designs*, to be rejected, I have
> gone back to thinking about an old project, whose working title has been “*A
> Sign Language*.”  And this has led me back to Robert Rosen, whose *Life
> Itself* I bought almost 9 years ago and which has remained almost pristine
> ever since.  In the chapter I am now looking at, Rosen is talking about
> “organization.”  Now, I have been thinking about organization ever since I
> read C. Ray Carpenter’s early work on primate groups back in the late
> 50’s.  It seemed to me at the time, and it seems to me reasonable now, to
> define the organization of a set of entities as related in some way to the
> degree to which one can predict the behavior of one entity from knowledge
> about another.  Now the relationship is not straightforward, because
> neither total predictability (every monkey behaves exactly the same as
> every other monkey in every situation) nor total unpredictability (no
> monkey behaves like any other monkey in ANY situation) smacks of great
> organization.  The highest levels of organization, speaking inexpertly and
> intuitively, seem to correspond to intermediate levels of predictability,
> where there were several classes of individuals within a group and
> within-class predictability was strong but cross-class predictability was weak.
> On my account, the highest levels of organization involve hierarchies of
> predictability.  Thus honey bees and ants are more organized than starling
> flocks, say.
>
>
>
> This is where the matter stood at the point that I came to Santa Fe and
> started interacting with you guys 14 years ago.  You-all introduced me to a
> totally different notion of organization based – shudder – on the second
> law.  But I have never been able to deploy your concept with any
> assurance.  So, for instance, when I shake the salad dressing, I feel like
> I am disorganizing it, and when it reasserts itself into layers, I feel
> like it ought to be called more organized.  But I have a feeling that you
> are going to tell me that the reverse is true.  That, given the molecules
> of fat and water/acid, the layered state is the less organized state.
>
>
>
> Now this confusion of mine takes on importance when I try to read Rosen.
> He defines a function as the difference that occurs when one removes a
> component of a system.  I can see no reason why the oil or the water in my
> salad dressing cannot be thought of as components of a system, and if, for
> instance, I were to siphon out the water from the bottom of my layered
> salad dressing, I could claim that the function of the water had been to
> hold the oil up.  This seems a rather lame notion of function.
>
>
>
> Some of you who have been on this list forever will remember that I raised
> the same kind of worry almost a decade back when I noticed the drainage of
> water from a basin was actually *slowed *by the formation of a vortex.
> This seemed to dispel any notion that vortices are structures whose
> function is to efficiently dissipate a gradient.  It also violated my
> intuition from traffic flows, where I imagine that rigid rules of priority
> would facilitate the flow of people crossing bridges to escape Zozobra.
>
>
>
> It’s quite possible that my confusions in this matter are of no great
> general applicability, in which case, I look forward to being ignored.
>
>
>
> Nick
>
>
>
> Nicholas S. Thompson
>
> Emeritus Professor of Psychology and Biology
>
> Clark University
>
> http://home.earthlink.net/~nickthompson/naturaldesigns/
>
>
>

