[FRIAM] alternative response

glen∉ℂ gepropella at gmail.com
Tue Jun 16 08:35:24 EDT 2020


As usual, I am arguing a position below that I don't necessarily believe. This is a game, a temporary hypothesis. Precede every assertion with "Let's say that ..."

1) There's no need for two of you. You are a steady mesh of choices in parallel, from the tiniest cellular process to picking up the cranberry. And I agree, there's no need for free will there.

2) The "two behavioral tendencies" are not *two*. They are a loose collection of many behaviors that *might* group, ungroup, and regroup. The compositional machinery that does the grouping does NOT pit one group of behaviors against another group of behaviors. It mixes and matches behaviors to arrive at a grouping that (kinda-sorta) optimizes for least effort.

3) The "first person sense" is the perception of irreversibility. It is the mesh of you clipping the tree of possibilities. In a different post, you asked "freedom from what?" The answer I'm proposing here is: freedom from evaluating/realizing every POSSIBLE next event. At any given instant, there's a (composite) probability distribution for everything that *could* happen in the next instant. Some events are vanishingly unlikely. Other events are overwhelmingly likely. The interesting stuff is somewhere in between, like 50% likely to happen. Within some ε of 50% are the things you sense/feel/perceive. And as the options fall away, you feel/realize the lost opportunity. That is the first person perspective you talk about. Again, no free will is required.

4) When you feel that lost opportunity, i.e. when you sense that you've now gone down an irreversible path, you can, for a little while, ask "what if I'd taken that path and not this one?" Again, no free will is required, only the ability to *perceive* that there were other paths your mesh/machine could have taken if the universe had been different.

5) That cohesive sensing is identical to the compositional machinery in (2) above. There's a storage/memory to that compositional machinery that can remember the historical trace the mesh took ... the "choices" made by the mesh. So, the NEXT time your mesh is on a similar trajectory, your compositional machinery will be slightly biased by your history. (I've tacked a toy sketch of (3) and (5) onto the end of this post.)

That memory of lost opportunities is what we call free will.
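
To make (3) and (5) a little less hand-wavy, here's a minimal toy sketch in Python. Everything in it is made up for this post: the ε band, the learning rate, and the three-option distribution are placeholders, not claims about the actual machinery. The only point is that "felt" options sit near 50% and that remembered lost opportunities bias the next similar pass through the mesh.

# Toy sketch: a "mesh" repeatedly picks one of several possible next
# events.  Only options whose probability falls within epsilon of 0.5
# are "felt"; when a felt option is clipped away, the lost opportunity
# is remembered and biases the next similar choice.
import random

EPSILON = 0.15          # band around 50% that counts as "felt"
LEARNING_RATE = 0.05    # how strongly a remembered loss biases the future

def felt(prob, eps=EPSILON):
    """An option is perceptible only if it is neither nearly certain
    nor nearly impossible."""
    return abs(prob - 0.5) <= eps

def step(probs, memory):
    """One instant: bias the distribution by remembered losses, sample
    the next event, and record the felt-but-lost options."""
    biased = [max(0.0, p + memory.get(i, 0.0)) for i, p in enumerate(probs)]
    total = sum(biased)
    biased = [p / total for p in biased]

    # the mesh "clips the tree": exactly one event becomes actual
    chosen = random.choices(range(len(biased)), weights=biased)[0]

    # every felt option NOT taken is a lost opportunity; remembering it
    # nudges the next similar choice toward it
    for i, p in enumerate(biased):
        if i != chosen and felt(p):
            memory[i] = memory.get(i, 0.0) + LEARNING_RATE
    return chosen, memory

memory = {}
probs = [0.48, 0.47, 0.05]   # two near-50% options plus one unlikely one
for t in range(5):
    chosen, memory = step(probs, memory)
    print(f"t={t}: took option {chosen}, remembered losses {memory}")

Run it a few times and the unchosen near-50% option accumulates bias, which is all I mean by "slightly biased by your history". The vanishingly unlikely option never shows up in the memory at all, because it was never felt in the first place. No extra controller, no second you, is needed anywhere in the loop.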

On 6/15/20 8:29 PM, thompnickson2 at gmail.com wrote:
> ... when I act deliberately, there are two of me, the me that acts
> and the me that chooses to act.  [...] Now from a third person point of view, you have no need of
> any of that.  [...] No need for free
> will there.
> 
> Now I am under no illusion that human individuals are wholly integrated
> beings.  In fact, evolutionary theory suggests that we have been designed by
> two selection regimens, one that privileges the individual, and one that
> privileges any group that we associate with.  At any one time, these two
> behavioral tendencies are struggling for the controls of our body-engine,
> [...]
> 
> The only need for free will arises from my first person sense that I have
> made a decision not to pick up the cranberry and then acted on that
> decision. [...]
> 
> Now I got bogged down over the weekend, so I still don't know where you guys
> came down on that issue.  I get the impression, perhaps, that what you have
> been arguing about is entirely orthogonal to my concern.


