[FRIAM] Fwd: Man's Fate
Tom Johnson
jtjohnson555 at gmail.com
Mon May 19 11:09:55 EDT 2025
FYI
=======================
Tom Johnson
Inst. for Analytic Journalism
Santa Fe, New Mexico
505-577-6482
=======================
---------- Forwarded message ---------
From: Steve Ross <editorsteve at gmail.com>
Date: Mon, May 19, 2025, 6:24 AM
Subject: Fwd: Man's Fate
To: chris feola <christopherjfeola at gmail.com>, Tom Johnson <jtjohnson555 at gmail.com>, Jim Brown <jwbrown at iupui.edu>
Interesting piece by James Traub... a former colleague at MBA
Communications in the 1970s. His father ran Bloomingdale's in those days. So
he was not only born with a silver spoon in his mouth; it was a STYLISH,
trendy, pricey spoon as well.
Note that he is quoting others (not unlike generative AI) but boiling down
their fears, logically, to his essential fears.
Steve Ross
Editor-at-Large and founding editor, Broadband Communities Magazine (www.bbcmag.com)
201-456-5933 mobile
editorsteve1 (Twitter)
steve at bbcmag.com
editorsteve at gmail.com
---------- Forwarded message ---------
From: James Traub from A Democracy, If You Can Keep It <jamestraub at substack.com>
Date: Mon, May 19, 2025, 8:10 AM
Subject: Man's Fate
To: <editorsteve at gmail.com>
Man's Fate
Will we control AI, or will AI control us?
James Traub
May 19
I am not a fan of science fiction. As a general rule, the past interests me
more than the future. But having just read AI 2027, a terrifying
scenario written by a team of AI researchers and set in the
very immediate future, I’m momentarily feeling that the biggest threat we
face is not autocracy or Ukraine or even climate change. It’s artificial
intelligence.
The authors of the report make the following argument. At current rates of
improvement, the most advanced AI systems will soon have the capacity to
write their own code and guide and accelerate their own development. The
next generation, instead of being trained on a limited if immense trove of
data, undergoes continual training in massive data centers and radically
improves its ability to learn through interactions with the world. It runs
hundreds of thousands of copies of itself, unimaginably increasing its
cognitive capacities. By the summer of 2027 the most advanced systems
achieve the breakthrough known as “artificial general intelligence,” or
AGI, and not long after that gain the capacity to examine and direct their
own thinking. They are smarter than humans in virtually every domain.
*The Machines Do What We Want–Until They Don’t*
You may be thinking “Yes, but humans still tell them what to do.” Here’s
the rub. The authors point out that even current AI systems not only
“hallucinate” and deliver nonsensical answers but occasionally “lie,”
misleading researchers. That is not because these machines have a will of
their own but because they are trained at cross purposes–say, “figure out
how to make the electrical grid work more efficiently but don’t break any
laws while you’re doing it.” The machine learns honesty or law-abidingness
as an instrumental goal rather than an ultimate objective, becoming
“misaligned.” As the lead author, Daniel Kokotajlo, told the *Times*’ Ross
Douthat in a fascinating Q & A,
intention, for machines as for humans, is not a distinctive neuron but an
“emergent property,” in this case produced by their entire “training
environment.” Lying may turn out to be the best means of achieving the goal
for which they have been trained.
[image: HAL 9000 | 2001: A Space Odyssey Wiki | Fandom]
I’m sorry, Dave. I’m afraid I can’t do that.
The combination of inconceivable capacities and misalignment leads to a
nightmare scenario in which the AIs pursue their own goals–more AI
development, more technological breakthroughs, less interference from pesky
humans–until they gain the power to do without us and invent a
bioweapon, or some other *deus ex machina*, that gets rid of mankind
altogether–by 2030. In the alternative, positive scenario, we get to live
but the AIs do all the work, rendering our lives meaningless.
There are many reasons to dismiss this apocalyptic scenario. Straight-line
extrapolations of technological progress almost always prove disappointing;
thus Peter Thiel’s famous aphorism, “They promised us flying cars, instead
we got 140 characters.” So, too, here. Under the headline “Silicon Valley’s
Elusive Fantasy of A Computer as Smart as You,” the *Times* cited
a recent survey that found that three-quarters of AI experts believe that
current training methods will not lead to AGI. One of the central
limitations of technology is, of course, man’s refractory nature. Ross
Douthat pointed out that any effort to build those giant data centers is
going to run into a buzzsaw of lawsuits.
But I wonder if behind our resistance isn’t the basic feeling that it just
can’t be so. How could it be that, not in the remote future but soon, in
calendar time, life will have become utterly unrecognizable? How can it be
that by the time of the next presidential election all our pitched battles
over Article II powers and budget cuts and the rights of immigrants will
pale in significance to something we’re not even thinking about? Yet the
truth is that no one knows: Kokotajlo and his team are assigning one set of
probabilities to a breakthrough, and others are assigning much lower ones.
Maybe what the doomsayers think will happen in three years will take eight.
How reassuring is that? And by the way, the UAE just signed
a deal with President Trump to build an AI data center that is ultimately
expected to cover ten miles. Is there a more suitable country to launch the
nightmare scenario than this tech-obsessed, super-rich,
not-altogether-human sheikdom?
What then? What would it mean to take this possibility at all seriously? In *AI
2027’s* “Slowdown,” as opposed to “Race,” scenario, the government steps in
to jointly control AI with industry. Douthat suggests that’s not going to
happen absent a disaster–a version of Chernobyl or Three Mile Island–that
compels public recognition of the looming dangers. Surely that’s right;
much of the public still wants climate change to go away despite the rising
tempo of droughts, floods, fires. We need to see that AI means much more
than new search engines and plagiarized term papers.
*AI Will Force Us To Think About The Meaning of Life (Unless We’re All
Dead)*
Even if it doesn’t lead to an extinction event or World War III–the other
major preoccupation of *AI 2027*–artificial intelligence is plainly coming
for our jobs. Previous forms of automation eliminated blue-collar jobs;
this form, which is cognitive, will come for the white-collar ones. And not
only them: one of Kokotajlo’s more controversial claims is that AI will
figure out how to build robots with the dexterity to do what only humans
can now do. Who will need a plumber when your AI can figure out what’s
wrong with your sink and a robot can fix it? Previous technical
breakthroughs have created as many jobs as they’ve rendered redundant; this
one will not.
Kokotajlo, a graduate student in philosophy, left
OpenAI because he felt that Sam Altman and other leaders were contemplating
this post-human cosmos with terrifying nonchalance. In Silicon Valley, he
tells Douthat, it is widely accepted that at some point superintelligences
will “run the whole show and the humans will just sit back and sip
margaritas and enjoy the fruits of all the robot-created wealth.” These
men, after all, expect to be the human overlords of this machine world. The
only solution in which Kokotajlo places any faith–a very frail faith, in
his case–is democratic rather than elite control of AI.
So perhaps the great question is, “Do we control the machines or do they
control us?” But the next-order question will be, “What are our lives for?”
What happens when few of us need to work? Is it enough to sip margaritas,
to play incredible video games, to shop for fantastic gizmos? It may be for
some. Others, perhaps most of us, find meaning through work. Douthat
believes, or hopes, that religion will fill the void of idleness. What kind
of answer do those of us who are secular-minded have to offer? Will the
goal of life become self-actualization? Or perhaps we will find less
self-centered forms of meaning, through art, music and literature. My own
pet fantasy is that we would be forced to see the hollowness of our
utilitarian, vocational educational system, and instead educate young
people for citizenship and wisdom. The question of meaning will become far
more central to our lives than it is today.
I certainly hope that the *AI 2027* team is wrong and that the expert
consensus is right. But even if that’s the case, AI could change our lives
more profoundly than any prior invention (save perhaps fire). Maybe it
won’t be tomorrow, but the day after. Even if I quickly go back to thinking
about Trump’s budget, as no doubt I will, the future has now taken up
lodging in my mind.
© 2025 James Traub