[FRIAM] New ways of understanding the world

uǝlƃ ↙↙↙ gepropella at gmail.com
Tue Dec 1 12:53:59 EST 2020


Well, as I've tried to make clear, machines can *accrete* their machinery. I think this is essentially arguing for "genetic memory", the idea that there's a balance between scales of learning rates. What your dog learns after its birth is different from what it "knew" at its birth. I'm fine with tossing the word "theory" for this accreted build-up of inferences/outcomes/state. But it's as good a word as any other.
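
If it helps, here's a toy sketch of what I mean by a balance between scales of learning rates: a slow outer loop (selection over birth values) wrapped around a fast inner loop (within-lifetime learning). Everything in it is illustrative; the numbers and the quadratic "environment" are made up for the sketch.

    import random

    # Toy sketch: a slow outer loop (selection over birth values)
    # wrapped around a fast inner loop (within-lifetime learning).
    # All numbers here are illustrative, not empirical.

    TARGET = 3.0            # the environment's "right answer"
    LIFETIME_STEPS = 5      # fast scale: learning within one life
    GENERATIONS = 50        # slow scale: selection across lives
    LR = 0.3

    def lifetime_learning(birth_value):
        """A few gradient steps on the loss (x - TARGET)**2."""
        x = birth_value
        for _ in range(LIFETIME_STEPS):
            x -= LR * 2 * (x - TARGET)
        return x

    def fitness(birth_value):
        # Fitness is judged *after* lifetime learning, so selection
        # only needs the birth value to land in the right neighborhood.
        adult = lifetime_learning(birth_value)
        return -(adult - TARGET) ** 2

    population = [random.uniform(-10, 10) for _ in range(20)]
    for _ in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]
        population = [p + random.gauss(0, 0.5) for p in parents for _ in (0, 1)]

    print(f"typical birth value: {sum(population) / len(population):.2f}")
    print(f"after one lifetime : {lifetime_learning(population[0]):.2f}")

Run it a few times: the adults land on the target even when the birth values are only roughly right, because the fast loop relaxes the pressure on the slow one. That's the sense in which what the dog "knew" at birth differs from what it learns afterward.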

I suspect that there are some animals, like humans, born with FPGA-like learning structures, so that their machinery accretes more after birth than that of other animals. And that there are some animals born with more of that machinery already built in. And it's not a simple topic. Things like retractable claws are peculiar machinery that kindasorta *requires* one to think in terms of clawing, whereas our more rounded fingernails facilitate both clawing and, say, unscrewing flat-head screws.

But this accreted machinery is *there*, no matter how much we want to argue where it came from. And it will be there for any given AI as well. Call it whatever you feel comfortable with.

On 12/1/20 9:39 AM, Marcus Daniels wrote:
> Dogs and humans share 84% of their DNA, so that almost sounds plausible on the face of it. However, humans have about 16 billion neurons in the cerebral cortex, but the whole human genome is only about 3 billion base pairs, and only about 30 million of those code for proteins. This seems to me to say that learning is more important than inheritance of "theories", if you must insist on using that word.
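
Marcus's capacity argument pencils out, at least on the back of an envelope. Here's a minimal sketch using the figures from his post, plus one outside assumption (roughly 7,000 synapses per cortical neuron, which is not in the post):

    import math

    # Back-of-envelope check of the capacity argument. The neuron and
    # genome counts are from the post above; the synapses-per-neuron
    # figure (~7,000) is a rough outside estimate, not from the post.

    GENOME_BP = 3e9
    BITS_PER_BP = 2                        # 4 nucleotides -> 2 bits each
    genome_bits = GENOME_BP * BITS_PER_BP  # ~6e9 bits total

    NEURONS = 16e9                         # cerebral cortex
    SYNAPSES_PER_NEURON = 7_000            # assumption, see above
    bits_per_target = math.log2(NEURONS)   # bits to name one postsynaptic cell
    wiring_bits = NEURONS * SYNAPSES_PER_NEURON * bits_per_target

    print(f"genome capacity : {genome_bits:.2e} bits")
    print(f"explicit wiring : {wiring_bits:.2e} bits")
    print(f"shortfall       : {wiring_bits / genome_bits:,.0f}x")

Explicit wiring would need several hundred thousand times the genome's raw capacity, so whatever the genome encodes, it isn't the wiring diagram itself; it has to be something more like growth rules plus learning machinery.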


-- 
↙↙↙ uǝlƃ


