<p class="MsoNormal"><span style="font-size:11.0pt">SFI had one
of these for a while. (As far as I know it just sat there.)<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt"><a
href="http://www.ai.mit.edu/projects/im/cam8/"
moz-do-not-send="true" class="moz-txt-link-freetext">http://www.ai.mit.edu/projects/im/cam8/</a><br>
<br>
Nowadays GPUs are used for Lattice Boltzmann.</span></p>
</div>
</blockquote>

Such a blast from the past, with the awesome '90s-stylized web page and the pics of the SUN (and Apollo?) workstations!

CAM8 is clearly the legacy of Margolus' work at MIT. At the time (1983) I remember hand-wired/soldered breadboards, and I think banks of memory chips wired through logic gates and such... I think this was pre-SUN days (I had an M68000 Wicat Unix box on my desktop, which sported a massive 5 MB hard drive with a pruned-down BSD variant installed on it). In fact, that was where I ran the MT simulations (tuning rules until I got "interesting" activity, running parameter sweeps, etc.).

When GPUs first rose up (SGI), they seemed hyper-appropriate to the purpose, but alas, I had no spare cycles at that point in my career to look into it. Just a few years ago, when I was working on the Micoy omnistereoscopic "camera ball" (you mentioned it looked a bit like a coronavirus particle), I had specced out an FPGA-fabric solution (with a dedicated FPGA wired directly between every adjacent overlapping camera pair - 52 cameras) to do realtime image de-distortion/stitching with the special considerations which stereo vision adds. I never became a VHDL programmer, but I did become familiar with the paradigm... I think I tried to engage Roger at a Wedtech on the topic when he was (also) investigating FPGAs (circa 2016?).
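
For flavor, here's the kind of per-pixel remap each of those camera-pair FPGAs would be pipelining, sketched in Python/numpy rather than VHDL. It assumes a plain Brown-Conrady radial model with made-up coefficients (k1, k2), and the undistort() name is mine - this is not Micoy's actual pipeline, just the shape of the problem:

    import numpy as np

    def undistort(img, k1=-0.25, k2=0.05):
        # For each pixel of the undistorted output (in coords normalized
        # to [-1, 1] around the image center), the radial model says where
        # that ray actually landed on the distorted sensor; sample there.
        h, w = img.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w].astype(float)
        x = (xs - w / 2.0) / (w / 2.0)
        y = (ys - h / 2.0) / (h / 2.0)
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2   # radial terms only
        u = np.clip((x * scale * (w / 2.0) + w / 2.0).astype(int), 0, w - 1)
        v = np.clip((y * scale * (h / 2.0) + h / 2.0).astype(int), 0, h - 1)
        return img[v, u]                        # nearest-neighbor remap

In hardware this sort of remap typically collapses into a precomputed per-pixel lookup table feeding line buffers, which is a big part of why it maps so naturally onto an FPGA.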

At that time, my fascination with CA had evolved into variations on Gosper's Hashlife... so GPU and FPGA fabric didn't seem as apt, though TPUs do seem (more) apt for the implicit data structures (hashed quad-trees).
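
Since the hashed quad-trees came up: here's a minimal sketch of the hash-consing trick at the heart of Hashlife, in plain Python, with a memoized constructor standing in for the hash table. The names Node and join are mine, and the real algorithm also memoizes the recursive evolution step, which I've omitted:

    from collections import namedtuple
    from functools import lru_cache

    # A quadtree node: level-0 leaves are single cells; an interior node
    # holds four children (nw, ne, sw, se) of the next level down.
    Node = namedtuple("Node", "nw ne sw se level population")

    OFF = Node(None, None, None, None, 0, 0)   # dead cell
    ON  = Node(None, None, None, None, 0, 1)   # live cell

    @lru_cache(maxsize=None)
    def join(nw, ne, sw, se):
        # Canonical constructor: the same four children always return the
        # SAME node object, so identical regions of the universe, at any
        # scale, are stored exactly once.
        return Node(nw, ne, sw, se, nw.level + 1,
                    nw.population + ne.population +
                    sw.population + se.population)

    a = join(ON, OFF, OFF, ON)
    b = join(ON, OFF, OFF, ON)
    assert a is b   # structurally equal subtrees share one representation

It's that sharing, plus memoizing the "advance this canonical node 2^k generations" result per node, that lets Hashlife leap over astronomical numbers of generations of regular patterns.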

The new NVIDIA DGX concentrated TPU system for $3k is fascinating, and it triggers my (not very coherent) thoughts about the tradeoffs between power and entropy and "complexity".

A dive down this 1983/4 rabbit hole led me (also) to the Toffoli and Fredkin gates and Reversible Computing. More on that in a few billion more neural/GPT cycles...
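
As a teaser for that thread: both gates fit in a few lines of Python, enough to check the property that makes them interesting - each is a bijection on three bits (and happens to be its own inverse), so no information is ever destroyed:

    from itertools import product

    def toffoli(a, b, c):
        # CCNOT: flip the target c iff both controls a and b are 1.
        return a, b, c ^ (a & b)

    def fredkin(c, x, y):
        # CSWAP: swap x and y iff the control c is 1.
        return (c, y, x) if c else (c, x, y)

    # Each gate undoes itself, so the computation is reversible:
    for bits in product((0, 1), repeat=3):
        assert toffoli(*toffoli(*bits)) == bits
        assert fredkin(*fredkin(*bits)) == bits

Fredkin's gate additionally conserves the number of 1 bits, which is what ties it to the Landauer-style entropy arguments I was gesturing at above.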