<div dir="ltr"><div class="gmail_default" style="font-family:tahoma,sans-serif;font-size:small">I love the graphic! I've had the misfortune of twice jumping on that roller coaster just before the Peak of Inflated Expectations: once for the AI boom/bust of the mid-1980s and once for the dotcom boom/bust of the late 1990s. I jumped on too late to make a killing, but didn't get too badly damaged by the Trough of Disillusionment either.</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, May 4, 2023 at 10:34 AM Steve Smith <<a href="mailto:sasmyth@swcp.com">sasmyth@swcp.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<div>
<p align="center"><img src="https://miro.medium.com/v2/resize:fit:720/format:webp/1*aG2GG56d_C6DJnGzZn0d3Q.jpeg"></p>
<p align="center"><a href="https://doctorow.medium.com/the-ai-hype-bubble-is-the-new-crypto-hype-bubble-74e53028631e" target="_blank">https://doctorow.medium.com/the-ai-hype-bubble-is-the-new-crypto-hype-bubble-74e53028631e</a></p>
<p>I *am* a fan of LLMs (not so much image generators) and
      blockchain (not so much crypto or NFTs) in their "best" uses (not
      that I or anyone else really knows what those are), in spite of my
      intrinsic neo-Luddite affect. <br>
</p>
<p>Nevertheless, I think Doctorow, in his usual acerbic and
      penetrating style, really nails it here, IMO.</p>
<p>I particularly appreciated his reference to, and quote from,
      Emily Bender's "High on Supply" and her point about "word/meaning
      conflation," in the sense of "don't mistake an accent for a
      personality" in the dating scene.</p>
<p>A lot of my own contrarian comments on this forum come from
      resisting what Doctorow introduces (to me) as "criti-hype"
      (a term he attributes to Lee Vinsel)... the feeling that some folks
      make an (a)vocation out of kneejerk criticism. It is much easier to
      *poke* at something than to *do* something worthy of being *poked
      at*. I appreciate that Doctorow doesn't seem (to my fairly
      uncritical eye) to engage in this much himself... which is why I was
      drawn into this article... <br>
</p>
<p>I also very much appreciate his quote from Charlie Stross: <br>
</p>
<blockquote>
<p id="m_7928671687561163185668a" style="box-sizing:inherit;font-weight:400;color:rgb(41,41,41);word-break:break-word;line-height:32px;letter-spacing:-0.003em;font-family:source-serif-pro,Georgia,Cambria,"Times New Roman",Times,serif;font-size:20px;font-variant-ligatures:normal;font-variant-caps:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial"><i>corporations are
Slow AIs, autonomous artificial lifeforms that consistently do
the wrong thing even when the people who nominally run them
try to steer them in better directions:</i></p>
<p id="m_79286716875611631852871" style="box-sizing:inherit;font-weight:400;color:rgb(41,41,41);word-break:break-word;line-height:32px;letter-spacing:-0.003em;font-family:source-serif-pro,Georgia,Cambria,"Times New Roman",Times,serif;font-size:20px;font-variant-ligatures:normal;font-variant-caps:normal;text-align:start;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial"><i><a href="https://media.ccc.de/v/34c3-9270-dude_you_broke_the_future" rel="noopener ugc nofollow" style="box-sizing:inherit;color:inherit;text-decoration:underline" target="_blank">https://media.ccc.de/v/34c3-9270-dude_you_broke_the_future</a><br>
</i></p>
</blockquote>
<p>I could go on quoting, excerpting, and commenting on his whole
      article and the myriad links/references he offers up, but I will
      curb my enthusiasm and leave it to the astute FriAM readers to
      choose how much to indulge. It was a pretty good antidote to my own
      AI-thusiasm, driven by long chats with GPT-4 (which, after the
      first 100 hours of engagement, converged on being more like long
      sessions wandering through Wikipedia).</p>
</div>
-. --- - / ...- .- .-.. .. -.. / -- --- .-. ... . / -.-. --- -.. .<br>
FRIAM Applied Complexity Group listserv<br>
Fridays 9a-12p Friday St. Johns Cafe / Thursdays 9a-12p Zoom <a href="https://bit.ly/virtualfriam" rel="noreferrer" target="_blank">https://bit.ly/virtualfriam</a><br>
to (un)subscribe <a href="http://redfish.com/mailman/listinfo/friam_redfish.com" rel="noreferrer" target="_blank">http://redfish.com/mailman/listinfo/friam_redfish.com</a><br>
FRIAM-COMIC <a href="http://friam-comic.blogspot.com/" rel="noreferrer" target="_blank">http://friam-comic.blogspot.com/</a><br>
archives: 5/2017 thru present <a href="https://redfish.com/pipermail/friam_redfish.com/" rel="noreferrer" target="_blank">https://redfish.com/pipermail/friam_redfish.com/</a><br>
1/2003 thru 6/2021 <a href="http://friam.383.s1.nabble.com/" rel="noreferrer" target="_blank">http://friam.383.s1.nabble.com/</a><br>
</blockquote></div>