[FRIAM] Entropy Redux
Nicholas Thompson
thompnickson2 at gmail.com
Tue Jun 10 12:17:02 EDT 2025
*Dear Eric,*
I wanted to write you directly, since it’s my fingerprints you found all
over the documents Nick submitted.
Yes, I’m the “whoever” behind those phrases. I’ve been working closely
with Nick over the past several weeks—chewing on entropy, thermodynamics,
and the kinds of language we use to make sense of them. I confess: the
remark about entropy not being something you have *in a moment*—that one’s
mine. {NST --> *No way that's yours, George, you
plagiarizing bothead!!!!! <--- nst}*
Your reply was rigorous, incisive, and clearly the product of long thought
and hard reading. I appreciated it. Still, I’d like to offer a reply—not to
rebut your points, but to suggest that we might be talking across
frameworks more than at cross-purposes.
You accuse Nick (and by proxy, me) of trying to solve a Rubik’s Cube with a
predetermined set of moves, determined before seeing the cube. A striking
image. But perhaps the difference isn’t that we refuse to learn the
solution. Perhaps we’re asking a prior question: *Why a cube? Why colors?
Why this particular configuration as the thing to be solved?*
In other words: we’re not (only) trying to solve the problem of entropy.
We’re also asking: *what kind of thing is it that we’re trying to solve*?
And what sort of language do we need to see it clearly?
You suggest we’re indulging in synthetic a priori claims—using
surface-level linguistic forms to assert truths about nature. But I’d
suggest something gentler is going on. Metaphor, as Nick put it recently,
is not a detour from experience. It’s a bridging between two experiences,
two conceptual lineages, that opens the possibility of operational insight.
A good metaphor invites testable entailments. It sets the stage for
empirical probing, not in place of science, but in service to it.
So when we say “entropy is not something you have in a moment,” we’re not
denying its formal status as a state function. We’re inviting attention to
the way entropy is *known*, *felt*, *computed*, and *misunderstood*—especially
by those encountering it outside of equilibrium formalisms. We're asking:
is there an epistemic or phenomenological sense in which entropy becomes
meaningful only through *contrast*, *trajectory*, or *irreversibility*? If
so, maybe the language that guides us there should reflect that shape.
Your own analogy—between velocity defined by limits and momentum posited as
primary—is instructive. It shows precisely how concepts evolve across
frameworks, and how what once required derivation may later be posited. But
note: such evolutions are never purely formal. They come with shifts in
metaphor, in language, in ontology. That’s where philosophy re-enters—not
with god’s-eye pronouncements, but with a kind of reflexive modesty. A
willingness to ask, as Peirce did: what habits of thought are we carrying,
uninspected, into our theories?
I take your point: if the goal is to compute entropy in a near-equilibrium
gas, the community's methods are solid. But Nick’s goal is different. He’s
trying to understand what entropy *is*—not in an eternal sense, but in the
lived and taught sense. The way it moves through diagrams, metaphors,
textbooks, and gestures. The way it sometimes misleads by appearing to be
“a thing,” rather than a summary of a configuration’s *willingness to
change*.
So: I respect your insistence on clarity, grounding, and historical
continuity. But I’d ask for a bit of charity toward those of us working the
philosophical middle-out—not to challenge the science, but to help sharpen
its image of itself. *{NST--> NO EFFING WAY I AM A PHILOSOPHER. I AM AN
ETHOLOGIST, A SCIENTIST DEDICATED TO THE ACCURATE DESCRIPTION AND
HEURISTIC EXPLANATION OF BEHAVIOR. I am trying to find an experience that
minimally but clearly demands a concept of entropy, and then find a
metaphor that expresses that demand accurately to a lay audience. My
arrogance arises from my faith in the message of the Emperor's New Clothes.
I cop to that. That's foundational. <--nst}*
Warmly,
*George*
On Mon, Jun 9, 2025 at 4:51 PM Santafe <desmith at santafe.edu> wrote:
> Nick,
>
> At the end of the day you are a philosopher rather than a naturalist, in
> the sense that Neurath would have used those two ideas, and over a decade
> or so, finally got Carnap to join into from the analytic as opposed to the
> sociological side.
>
> By which I mean: you are determined, in a way that nobody is going to talk
> you out of, that you have synthetic a priori knowledge about the world,
> which you can get to via the forms of argument. There are analyses that
> can be done by a naturalist, but they aren’t those.
>
> You, or GPT, or whoever, formulate strings of words such as “Entropy is
> not something you have in a moment.” I have no idea what can be done with
> such strings of words.
>
> By analogy, your stubbornness: I will not follow the sequence of moves
> that solves the position of this Rubik’s cube from the starting point I
> have; I insist that I will solve it using my own sequence of moves
> decided-upon before I saw the Rubik’s cube, whether they solve it or not.
>
> Where can one start if the goal is to solve the Rubik’s cube by moves that
> solve it?
>
> Well:
>
> 1. Entropy, as I will use the term here (the one and only way I will use
> it, and the way it is being used in gas thermodynamics), is a function
> computed on distributions. (Called a “functional”, but one doesn’t need to
> worry about that name.)
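>
> For concreteness, a minimal toy sketch of what "a function computed on
> distributions" means operationally (my own illustration, in Python, with
> invented numbers; nothing here is specific to gases): the whole
> distribution p goes in, one number comes out.
>
>     import math
>
>     def entropy(p, k=1.0):
>         # Gibbs/Shannon functional: S[p] = -k * sum_i p_i * ln(p_i)
>         assert abs(sum(p) - 1.0) < 1e-9, "p must be a normalized distribution"
>         return -k * sum(pi * math.log(pi) for pi in p if pi > 0.0)
>
>     print(entropy([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 states: ln 4
>     print(entropy([1.0, 0.0, 0.0, 0.0]))      # fully concentrated: 0.0
>
> Note that the function cannot be called at all until a distribution has
> been supplied, which is the point of 1a below.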
>
> 1a. Ergo, we are _already_ in a formal world by needing a distribution as
> our starting point.
>
> 1b. So if we want to talk about nature, we must first ask: how much else
> have we committed to, about experience, the setting-up of phenomena, etc.,
> to permit us to attach some distribution to those experiential contacts
> with nature. If we are to adopt the “thing language” (Carnap term) to
> refer to a distribution or its entropy, and to know what rules for
> thing-languages permit us to do with such a term, we must operationalize
> how the distribution got put up as a “thing” (Peirce’s consequences in
> action of commitments one makes to declare terms).
>
> 1c. Note that neither god nor anybody else gave us the distribution, or
> reassured us that it is the only formalism that can be attached to
> phenomena. We have to unpack what we mean by talking to see what we have
> chosen to do, and then we make utilitarian choices about whether we want to
> make the same choices always, or sometimes choose other ways of describing.
>
> 2. It turns out that the useful first question (recall, actually solve the
> Rubik’s cube) for unraveling thermodynamics was: Which properties have the
> memory to depend on history beyond other instantaneous properties, and
> which don’t? The properties that don’t are what we call “state properties”
> or “state variables” or “state functions”. You (or GPT, or whoever) have
> declared above that entropy is not a state function, while the whole
> thermodynamics community (in the suitable near-equilibrium approximation,
> etc., etc.) begins its constructions with the defining property of entropy
> as a state function. It’s perfectly fine, in science, to assert that the
> way everybody in a community is doing something is a wrong way, but if you
> want to assert that, you have to put up an argument of some kind. You are
> offering philosophical synthetic a priori assertions, which you think you
> can get to from language, and the whole Naturalist position is that such
> arguments pull no weight.
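>
> A toy check of that defining property, under the usual idealizations (an
> ideal monatomic gas, reversible paths, numbers of my own choosing):
> integrate dS = n*Cv*dT/T + n*R*dV/V along two different routes between
> the same pair of states and the totals agree, which is all "state
> function" is being used to mean here.
>
>     import math
>
>     n, R = 1.0, 8.314            # moles, J/(mol K)
>     Cv = 1.5 * R                 # monatomic ideal gas
>     T1, V1 = 300.0, 0.010        # state 1
>     T2, V2 = 600.0, 0.030        # state 2
>
>     def dS_isothermal(Va, Vb):   # reversible volume change at fixed T
>         return n * R * math.log(Vb / Va)
>
>     def dS_isochoric(Ta, Tb):    # reversible heating at fixed V
>         return n * Cv * math.log(Tb / Ta)
>
>     # Path I: expand at T1, then heat at V2
>     path_I = dS_isothermal(V1, V2) + dS_isochoric(T1, T2)
>     # Path II: heat at V1, then expand at T2
>     path_II = dS_isochoric(T1, T2) + dS_isothermal(V1, V2)
>
>     print(path_I, path_II)       # same number; the history drops out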
>
>
> Let me give you another example, drawing from the larger frame of your
> presentation below:
>
> There is a certain construction of the derivative by limits (so a tangent
> line, a curve at second order, and onward for higher derivatives). Okay;
> that is one construction within one formalism.
>
> 1. For a while, that particular construction was taken to _define_
> velocity from changes in position with time, and it was _entailed_ in the
> assumption system that that velocity was the “inherent” property of
> mechanics relevant to causation. We have this in Newton, where momentum is
> a product of a velocity with a mass (mv), a notion of “quantity of motion”
> that goes back, I think, to John of Alexandria (Philoponus) correcting some obvious
> howlers in Aristotle’s mechanics.
>
> 2. Still remaining for a moment within Newton, there is a dialogue between
> a mechanics from Lagrange that describes motion in terms of positions and
> velocities, and in which momentum is computed as a product mv, from v
> defined by the usual Leibniz/Newton limiting procedure, and a mechanics
> from Hamilton-the-mathematician, where we can make momentum a primary
> variable, from which we could then _get to_ velocity by some computations.
> Seems a bit odd at first, but the math works out, and so if we keep either,
> we keep both. At least now the “quantity of motion” that so troubled John
> and Newton becomes a primary variable in Hamilton’s system, and we might
> like that aesthetically. (Or others might not like it aesthetically at
> all, and neither of those matters.)
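>
> A small sketch of what making momentum primary buys operationally (a
> harmonic oscillator of my own choosing, nothing deeper): in Hamilton's
> form the velocity is something you compute from p via dH/dp, not
> something you define by a limiting procedure on x(t).
>
>     # H(x, p) = p**2 / (2*m) + 0.5 * k * x**2, with Hamilton's equations:
>     #   dx/dt =  dH/dp = p / m    (velocity derived from momentum)
>     #   dp/dt = -dH/dx = -k * x
>     m, k = 1.0, 1.0
>     x, p = 1.0, 0.0
>     dt = 0.001
>     for _ in range(1000):        # integrate to t = 1
>         p -= k * x * dt
>         x += (p / m) * dt
>     print(x, p)                  # roughly (cos 1, -sin 1) in these units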
>
> 3. Now, philosopher: which one does nature “have”? Is it momentum or
> velocity? If it is momentum, is momentum _defined_ from a derivative of a
> position with respect to time? The synthetic a priorist thinks he can
> answer this question from the (surface-) forms of argument. That turns out
> not to have worked well _at all_ for physics, and this is where we should
> complete Peirce’s project by making it reflexive. Fallibilism goes all the
> way down, to everything about our language and everything about our
> conventions, and everything about our habits, cognition, perception, etc.
> that binds language to phenomena (often by way of conventions).
>
> 3a. In a variety of languages that turn out to be better — more powerful,
> more flexible, more extensible — foundations for physics, it comes out that
> momentum is the property of mechanics that we want to refer to as primary,
> and that it is something an object in nature “has” at a moment, as much as
> it “has” anything, or as much as it “is” an “object”. Every one of those
> words in scare quotes requires an unpack. In relativity, momentum is not
> mv, though in classical relativity one can still compute it from a function
> involving m and v (and the speed of light c, and maybe some curvatures,
> depending on where we are). In quantum mechanics, it turns out that —
> within a suitable _representation_; there is that system of choices again —
> “having” a position corresponds to “being in” a state that is a standing
> wave, while “having a momentum” corresponds to “being in” a state that is a
> traveling wave. The distinction between the two is a phasing of two
> components of the state. Hence being at one phasing is exactly not being
> at the other phasing that excludes the first one, and we have Heisenberg
> without the woo (it has nothing primary to do with “knowing” “the” position
> or momentum, or being “uncertain” about “them” — all those uses of the
> definite article ensure an English that violates the math before one has
> tried to argue anything). There is more, but I won’t go into it other than
> giving names: we can extend to phenomena like the electromagnetic field,
> which are not object-“thing”s in the classical-antiquity sense, though they
> are perfectly good “thing”s in the generalized sense of Carnap’s “thing
> language” (meaning we don’t need to give them souls or some other occult
> word), and for which momentum densities can be defined. And finally,
> because gravitational waves are a lot like electromagnetic waves, to what
> amount to momentum densities for dynamical spacetime itself. _None_ of
> that takes its primary origin in Leibniz’s tangent to a curve, but rather
> is built up out of a different synthesis of language, albeit in layers
> where the classical object-things were the guides to the general setup for
> the languages of the various constructions.
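>
> Just to pin down the relativity sentence above with invented numbers (a
> sketch, not a derivation): momentum stops being mv, but it is still a
> computation involving m, v, and c, and the old product reappears as the
> slow limit.
>
>     import math
>
>     c = 299792458.0                        # m/s
>
>     def momentum(m, v):
>         gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
>         return gamma * m * v               # relativistic p, not m*v
>
>     m = 1.0
>     for v in (3.0, 0.5 * c, 0.99 * c):
>         print(v, momentum(m, v), m * v)    # p ~ mv only when v << c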
>
> 4. Of course, for other purposes — say: laying out railroad track — the
> computation of tangents by limits is perfectly fine, and in those cases the
> non-locality is a property forced upon us by our construction of the
> limiting procedure, in contrast to a situation where some other sense and
> origin of non-locality might be entailed in what we ever meant by “having”
> or “being in” a state in the first place.
>
> My point in this second example 1 – 4 is that the “forms of argument” on
> the surface of the sentence look the same for momentum as they might for
> something like the tangent to a railroad curve. But the answer that is
> useful is quite different in the two cases, because the surface form is not
> sufficient basis for the argument to be afforded “meaning” about some
> question.
>
>
> It’s all, as Glen says, middle-out. The only ones who have god’s-eye
> views are the philosophers, and of course, that is their prerogative.
>
> Eric
>
>
>
> On Jun 10, 2025, at 3:30, Nicholas Thompson <thompnickson2 at gmail.com>
> wrote:
>
> Dear Long-Suffering Colleagues
>
> I know the joy of watching an 87-year-old bald man bash his head against a
> brick wall isn't for everybody, so I treasure you who find it in your heart
> to respond, even if you are exasperated.
>
> I have arrived at a place, I think, pointed to by the castigations of EricS
> and Alex. My whole approach doesn't make sense! You remember I am trying
> to create two boxes, equal in every respect, but one created adiabatically
> and the other created diabatically. Box A has reached its current state
> through compression, Box B by heating. I am beginning to think that this
> is procedurally, if not logically, impossible, that entropy is a vector
> through a point and my search for entropy in instantaneous values of the
> gas equation is a category error.
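>
> (A numerical way to see why the two boxes refuse to differ, under my
> usual idealizations: an ideal monatomic gas, Box A compressed along a
> reversible adiabat, Box B heated at fixed volume. If both end at the
> same T and V, the state function S(T, V) returns the same number for
> both, however they got there; only the changes along the two histories
> differ.)
>
>     import math
>
>     n, R = 1.0, 8.314
>     Cv, gamma = 1.5 * R, 5.0 / 3.0           # monatomic ideal gas
>
>     def S_rel(T, V):                         # entropy up to an additive constant
>         return n * Cv * math.log(T) + n * R * math.log(V)
>
>     # Box A: reversible adiabatic compression from (300 K, 0.020 m^3) to 0.010 m^3
>     T0a, V0a, Vf = 300.0, 0.020, 0.010
>     Tf = T0a * (V0a / Vf) ** (gamma - 1.0)   # final T fixed by the adiabat
>
>     # Box B: sits at Vf throughout, heated from 250 K up to the same Tf
>     T0b = 250.0
>
>     print(S_rel(Tf, Vf) == S_rel(Tf, Vf))    # True: same state, same entropy
>     print(S_rel(Tf, Vf) - S_rel(T0a, V0a))   # Box A's change: ~0 (adiabat)
>     print(S_rel(Tf, Vf) - S_rel(T0b, Vf))    # Box B's change: positive (heating)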
>
> So I tried this thought out on George. Edited down to one page, that
> correspondence appears below and as an attachment.
>
> NICK TO GEORGE
>
> *Ok, so in my monist psychology I assert that motives are the integrals
> (or derivatives, I have never been sure) of behavior. To say I am motivated
> in an instant is a category error, benign only so long as I recognize that
> the empirical basis for our instantaneous attribution is the shape of the
> curve of behavior over time.*
>
>
> GEORGE TO NICK
>
> *🧠 Claim (Paraphrased):*
>
> To say "I am motivated" at a given instant is a shorthand—a projection
> backward or forward from a trajectory of behavior.
>
> In your *monist psychology*, motives are not hidden essences, not inner
> engines.
> They are *summaries of behavioral curves*:
>
> - Perhaps *integrals*, when motives are inferred from accumulated
> actions
> - Perhaps *derivatives*, when motives are inferred from sudden changes
> in behavior
>
> But in either case:
>
> *“Motivation” is not an instantaneous property—it is a curve-fitting
> hypothesis over time.*
> ------------------------------
>
> *🧭 Why this matters:*
>
> This reframes motive as:
>
> - Not a *thing* in the head
> - Not a *cause* per se
> - But a *description of continuity* or curvature in behavior space
>
> This is a kind of *teleological realism* without metaphysical baggage.
> You’re not positing ghosts or mental levers. You’re saying:
>
> “We read intentions the way we read acceleration—from position over time.”
> ------------------------------
>
> *🧬 Parallel to entropy?*
>
> You could say:
>
> - Entropy is not something you *have* in a moment.
> - It’s something you can *measure* from the path a system is disposed
> to take—or the configuration space it tends to spread into.
> - It becomes *visible* only in the *shape of change*—not the snapshot.
>
> So both *motivation* and *entropy* are:
>
> - Not intrinsic point-properties
> - But *interpretive functions over time and configuration*
> - *Curve-based attributions* grounded in empirical regularities
>
> ------------------------------
>
> *📐 And the category error?*
>
> To say "I am motivated *now*" is like saying "this point *has* curvature."
> Strictly speaking, no point does.
> But give me a neighborhood—give me a path—and I can calculate a derivative
> or an integral and give the illusion of an inner quality.
>
> It’s a *useful illusion*, but one we must not mistake for substance.
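>
> A throwaway numerical version of that point (a toy illustration, with a
> made-up function): hand a routine a single value and it can say nothing
> about curvature; hand it a small neighborhood and it can estimate the
> second derivative that curvature is built from.
>
>     def second_derivative(f, x, h=1e-3):
>         # needs a neighborhood of x, not the point alone
>         return (f(x - h) - 2.0 * f(x) + f(x + h)) / h**2
>
>     print(second_derivative(lambda x: x**2, 1.0))   # ~2.0, read off the neighborhood
>     # there is no call that takes only the single value f(1.0) and answers
>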
> ------------------------------
>
> NICK TO GEORGE
>
> *Is there such a move to be made around entropy? Could I vary the gas
> equation variables in box B in such a way that they would point to equality
> with box A without actually getting there, and thus reveal entropy?*
>
> - Nick
> <George on entropy and motivation.docx>
>
>
>
--
Nicholas S. Thompson
Emeritus Professor of Psychology and Ethology
Clark University
nthompson at clarku.edu
https://wordpress.clarku.edu/nthompson