[FRIAM] Memoization & Entropy Estimates
steve smith
sasmyth at swcp.com
Mon Jan 20 15:04:22 EST 2025
If I haven't already beaten this horse into the ground:
The conceit of the program here is that rather than doing a binary
quad-tree decomposition of the 2D "Life" grid, we instead decompose it
into a denser quad-tree hierarchy based on an (N-1)x(N-1) kernel. The
point of this is to capture *all possible* translations at *every
scale*. The idea that drives this for me is that hashing these
patterns (same as in Gosper's original) *should* yield the most compact
state description of the whole system at any time step, and therefore
also across the evolution of the system?
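Concretely, the bookkeeping amounts to something like this (a minimal
sketch in plain JavaScript; the names hashPattern, scaleTables, and
recordAllScales are illustrative, not the literal P5 code):

    // Assumes a 32x32 boolean grid[row][col]; one Set per scale stands in
    // for the hash tables.
    const N = 32;
    const scaleTables = {};
    for (let k = 3; k <= N; k++) scaleTables[k] = new Set();

    // Serialize a kxk sub-block into a string key (a stand-in for a real hash).
    function hashPattern(grid, r0, c0, k) {
      let key = "";
      for (let r = r0; r < r0 + k; r++)
        for (let c = c0; c < c0 + k; c++)
          key += grid[r][c] ? "1" : "0";
      return key;
    }

    // Slide a kxk window over *every* translation at *every* scale and
    // record hits/misses against the per-scale tables.
    function recordAllScales(grid) {
      const hits = {}, misses = {};
      for (let k = 3; k <= N; k++) {
        hits[k] = 0; misses[k] = 0;
        for (let r0 = 0; r0 + k <= N; r0++) {
          for (let c0 = 0; c0 + k <= N; c0++) {
            const key = hashPattern(grid, r0, c0, k);
            if (scaleTables[k].has(key)) hits[k]++;
            else { scaleTables[k].add(key); misses[k]++; }
          }
        }
      }
      return { hits, misses };   // per-scale counts for the bar chart
    }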
The bar chart on the right shows green == number of hash hits and
red == number of hash misses for each scale 3x3 ... NxN.
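In p5.js terms the chart is nothing fancier than roughly the following
(drawBars is my name; it just consumes the per-scale counts from the
sketch above):

    // Draw paired green/red bars per scale, normalized to each scale's total.
    function drawBars(hits, misses) {
      const scales = Object.keys(hits);
      const barW = width / (2 * scales.length);
      scales.forEach((k, i) => {
        const total = (hits[k] + misses[k]) || 1;
        const hitH = height * hits[k] / total;
        const missH = height * misses[k] / total;
        fill(0, 200, 0);                                   // green == hash hits
        rect(2 * i * barW, height - hitH, barW, hitH);
        fill(200, 0, 0);                                   // red == hash misses
        rect((2 * i + 1) * barW, height - missH, barW, missH);
      });
    }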
There is, theoretically, an upper limit on the complexity of such systems
of 2^(NxN), or for my 32x32 grid (anything much larger bogs down badly)
2^1024 (Yuuuge!).
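For scale, a browser-console two-liner with BigInt puts a number on that
bound:

    const states = 2n ** (32n * 32n);        // 2^1024 distinct 32x32 binary grids
    console.log(states.toString().length);   // 309 decimal digits -- "Yuuuge!"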
This code (run in P5.js in my browser because I'm too lazy to update the
editor/runtime environment on my 15-year-old computer?) seems to exhibit
the behaviour I would expect across many runs. (P5 is to me as NetLogo is
to Guerin.)
In the first pass, for example, the NxN table has precisely 1 hash
entry, while the tables at the smaller scales (3x3, 4x4, 5x5, etc.) are
still sparse relative to their exponentially increasing number of
possible entries. In *most* runs I've made, the 3x3 saturates at
2^9 = 512, but sometimes the system achieves quiescence or oscillation
without fully exploring all 512. The majority of larger-scale patterns
don't even approach saturation (the numbers are just too large to reach
in reasonable time, but also the total computational space (32x32x1 bit)
is not large enough to support very much exploration?).
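One way to put numbers on "don't even approach saturation" (saturation()
is my name; scaleTables is the per-scale Set from the sketch above):

    // Fraction of the theoretical ceiling 2^(k*k) that each scale's table
    // has actually seen; BigInt because the ceiling overflows Number fast.
    function saturation(scaleTables, N) {
      const report = {};
      for (let k = 3; k <= N; k++) {
        const ceiling = 2n ** BigInt(k * k);
        const seen = BigInt(scaleTables[k].size);
        report[k] = ceiling <= (1n << 53n)
          ? Number(seen) / Number(ceiling)   // e.g. 512/512 = 1.0 for a saturated 3x3
          : "~0 (ceiling is astronomically large)";
      }
      return report;
    }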
Next steps might include inverting the program to do recursive
computation rather than simply recording/measuring the patterns as they
unfold. A side effect of this would be to allow for arbitrarily large
patterns, not restricting the computation to a pre-defined patch of real
estate (e.g. 32x32)?
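The shape I have in mind for that inversion is just "memoize the
evolution, not merely the occurrence" -- something like the following
toy, which memoizes a whole-grid toroidal step rather than Gosper's
proper recursive assembly (stepMemo/gridKey are my names, and this is
emphatically not HashLife, just the direction of the inversion):

    const stepCache = new Map();   // grid key -> already-computed successor grid

    function gridKey(grid) {
      return grid.map(row => row.map(c => (c ? "1" : "0")).join("")).join("|");
    }

    // One Conway step on a square, wrap-around grid, computed at most once
    // per distinct pattern; repeats (oscillators, quiescence) come for free.
    function stepMemo(grid) {
      const key = gridKey(grid);
      if (stepCache.has(key)) return stepCache.get(key);   // hash hit: reuse
      const n = grid.length;
      const next = grid.map((row, r) => row.map((cell, c) => {
        let live = 0;
        for (let dr = -1; dr <= 1; dr++)
          for (let dc = -1; dc <= 1; dc++)
            if (dr || dc) live += grid[(r + dr + n) % n][(c + dc + n) % n] ? 1 : 0;
        return cell ? (live === 2 || live === 3) : live === 3;   // Conway's rule
      }));
      stepCache.set(key, next);                            // hash miss: compute and store
      return next;
    }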
In spite of being a groupie of CA (and many forms of
computation/simulation), I have not really studied the larger range of
analysis of "Life" that Hashlife facilitates. I believe Guerin turned me
on to an example of a glider-gun-based emulation of a conventional
register/gate computer which then implements hashlife (recursively),
being "hashlife all the way down"? It is possible that my N-1 kernel
concept converges with the N/2 quadtree of Gosper for "most"
computations?
inquiring minds want to know.
(PS) been listening to Judith Rosen as I worked on this... I do hope we
can cajole/support Glen (or another) to engage... I am also hammering
on the recursive version but stalled because multi-tasking with
Rosennian and death/disposal and Fascist v Oligarchical maunderings...
> My weak grasp on the root of the reversibility argument is that a
> great deal of the "information" that is generated in a computation is
> in some sense extraneous? I wish I knew more what I meant by that.
>
> I took this opportunity to ride one of my hobby horses down the road a
> little: a variation on Gosper's HashLife which is designed to support
> a measure of computational complexity. I'll elaborate under separate
> cover.
>
>
>
> On 1/18/25 5:03 PM, steve smith wrote:
>>
>> Pieter -
>>
>> Good find. It led me to Vaire and then to the Sandia/ABQ work of
>> Michael Frank, who left to join/found Vaire this summer? It is
>> possible that my renewed interest in reversible computing might have
>> been triggered subliminally by some reference to both/either?
>>
>> https://techcrunch.com/2024/07/01/vaire-computing-raises-4-5m-for-reversible-computing-moonshot-which-could-drastically-reduce-energy-needs/
>>
>> https://vaire.co/
>>
>> I thought I'd been triggered by the combined demands of AI on data
>> centers (my daughter closed her gym of 10 years to take a job in a
>> data-center development startup a year ago...).
>>
>> My inability to attribute such things parallels that of LLMs (or,
>> more generally, transformer models)?
>>
>> - Steve
>>
>
[Attachments: Screen Shot 2025-01-19 at 9.28.38 AM.png
<http://redfish.com/pipermail/friam_redfish.com/attachments/20250120/57daf9fd/attachment-0002.png>
and Screen Shot 2025-01-19 at 9.31.57 AM.png
<http://redfish.com/pipermail/friam_redfish.com/attachments/20250120/57daf9fd/attachment-0003.png>]