[FRIAM] Free Will in the Atlantic

Marcus Daniels marcus at snoutfarm.com
Mon Apr 5 12:21:38 EDT 2021


When machine learning algorithms act racist, it is because people are often racist. Mimicry of mass behavior is not insight.

-----Original Message-----
From: Friam <friam-bounces at redfish.com> On Behalf Of uǝlƃ
Sent: Monday, April 5, 2021 9:16 AM
To: friam at redfish.com
Subject: Re: [FRIAM] Free Will in the Atlantic

That's an interesting example because it helps tease apart the measurement of flocking. Like "free will", what is it that we're pointing to with the phrase "flocking"? When the programmer does explicitly implement the Boids protocol, they don't implement flocking so much as the alphabet/grammar that will generate flocking. But such constructive demonstrations only show one generative structure. In order to sample the space of possible generative structures, we have to be algorithmic in our specification of the objective function "flocking". Then we can, if not brute-force the space, at least falsify as many generative structures as possible, largely at random, and maybe classify those that work. At that point, we could explore the classes of structures that work and ask which ones are structurally analogous to whatever's available to referent birds.
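
A minimal sketch of that random search in Python with NumPy, since nothing in the thread specifies one: the neighborhood radius, the three-rule (cohesion/alignment/separation) structure, the polarization-minus-dispersion objective, and the 0.8 acceptance threshold are all assumptions for illustration, not anything anyone here has proposed. Each random weight vector is one candidate generative structure; most get falsified, and the survivors are the classes worth inspecting.

import numpy as np

rng = np.random.default_rng(0)

def simulate(weights, n=50, steps=200, dt=0.1):
    # Boids-like update with three weighted rules: cohesion, alignment, separation.
    # The weight vector is one candidate "generative structure".
    w_coh, w_ali, w_sep = weights
    pos = rng.uniform(-1, 1, (n, 2))
    vel = rng.normal(0, 0.1, (n, 2))
    self_mask = np.eye(n, dtype=bool)
    for _ in range(steps):
        diff = pos[None, :, :] - pos[:, None, :]   # diff[i, j] = pos[j] - pos[i]
        dist = np.linalg.norm(diff, axis=-1)
        near = (dist < 0.5) & ~self_mask           # local neighborhood; radius is an assumption
        nbrs = near.astype(float)
        cnt = np.maximum(nbrs.sum(axis=1, keepdims=True), 1.0)
        coh = (diff * nbrs[..., None]).sum(axis=1) / cnt   # steer toward neighbors' center
        ali = nbrs @ vel / cnt - vel                       # match neighbors' mean velocity
        sep = -(diff / (dist[..., None] ** 2 + 1e-9) * nbrs[..., None]).sum(axis=1)  # push apart
        vel = vel + dt * (w_coh * coh + w_ali * ali + w_sep * sep)
        speed = np.linalg.norm(vel, axis=1, keepdims=True)
        vel = vel / np.maximum(speed, 1e-9) * np.minimum(speed, 1.0)  # cap speed
        pos = pos + dt * vel
    return pos, vel

def flocking_score(pos, vel):
    # One possible algorithmic objective: heading polarization minus a dispersion penalty.
    headings = vel / (np.linalg.norm(vel, axis=1, keepdims=True) + 1e-9)
    polarization = np.linalg.norm(headings.mean(axis=0))   # 1.0 = all headings identical
    spread = np.linalg.norm(pos - pos.mean(axis=0), axis=1).mean()
    return polarization - 0.1 * spread

# Sample rule weights at random and falsify the generative structures that fail.
survivors = [w for w in rng.uniform(0.0, 2.0, (100, 3))
             if flocking_score(*simulate(w)) > 0.8]
print(len(survivors), "of 100 random generative structures produced flocking")

The point of keeping the objective function explicit and algorithmic is that the same score can then be run against trajectories of real birds, which is what makes the "structurally analogous to referent birds" question answerable at all.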

On 4/5/21 8:47 AM, Pieter Steenekamp wrote:
> Let me try and give an example:
> 
> Instead of humans, let's use birds. Then I present flocking to you: nobody knows the algorithm for flocking, and we may never know it. Indirectly, yes, by using ABM, but there the complexity emerges from running the program; the human did not program the algorithm for flocking.

--
↙↙↙ uǝlƃ

- .... . -..-. . -. -.. -..-. .. ... -..-. .... . .-. .
FRIAM Applied Complexity Group listserv
Zoom Fridays 9:30a-12p Mtn GMT-6  bit.ly/virtualfriam un/subscribe http://redfish.com/mailman/listinfo/friam_redfish.com
FRIAM-COMIC http://friam-comic.blogspot.com/
archives: http://friam.471366.n2.nabble.com/

