[FRIAM] Eternal questions

Frank Wimberly wimberly3 at gmail.com
Tue Aug 24 13:39:08 EDT 2021


Glen

I feel moments of depression from time to time.  They feel to me like
heaviness in my chest more than anything attached to reality, although
sometimes it's stimulated by a thought like "that dusty lightbulb may never
be cleaned" or some other trivial and irrelevant detail.  It doesn't feel
like the sadness I feel about the twin babies who were drowned in the flood
in Tennessee.  When I tell my PCP about this, he suggests medication.  I can
believe that it's a biochemical phenomenon, so I'm taking a tiny (5 mg) dose
of Citalopram once a day.  My doc says that's a homeopathic dose.

It didn't help when EricS pointed out that the eighties are the age of
dying 😐

Frank
---
Frank C. Wimberly
140 Calle Ojo Feliz,
Santa Fe, NM 87505

505 670-9918
Santa Fe, NM

On Tue, Aug 24, 2021, 10:54 AM uǝlƃ ☤>$ <gepropella at gmail.com> wrote:

> I suppose my problem is that I *do* think we can find "the sadness" inside
> the brain ... well, not inside the brain, exactly, but inside the *body*
> ... well, not *inside* the body but peri-body, localized *at* the body, but
> extending out into the body's context a bit.
>
> Just like with one's thumb, sadness comprises a dynamic mesh of
> interlinked feedback loops. And that dynamic mesh of feedback is *part* of
> a larger mesh of such loops. (Re: "putting it in a robot" - cf types of
> attention: soft, hard, self, etc.) Some of those loops *look at* the
> sadness cluster, register that cluster as "other" in some sense ... just
> like how I can imagine chopping off my thumb and tossing that object into
> the woods.
>
> Because I do this all the time, it would blow my mind if others did not
> also do it. I particularly do it with fear. Wake up startled. Think maybe
> there's someone in the house. Grab the bat. As I'm walking up the stairs, I
> *reflect* on my registered, now objectified fear. And, in doing so, wiggle
> my way out of the grip of that fear and think more tactically about how to
> behave if there's a human-shaped shadow in the living room.
>
> I do the same thing with pain, particularly my chronic back pain, but also
> with things like stubbed toes. It's fscking hilarious how much that hurts
> ... ridiculously over-emphasized pain for what has happened. To not step
> outside the pain and laugh out loud would be weird. It's just way too funny
> how much that hurts.
>
> I welcome a lesson in how misguided I am, here.
>
> On 8/24/21 8:36 AM, Eric Charles wrote:
> > So.... This is JUST a question of whether we are having a casual
> conversation or a technical one, right? Certainly, in a casual,
> English-language conversation, talk of "having" emotions is well understood
> and just fine, for example "Nick is /having/ a fit, just let him be." (I
> can't speak for other languages, but I assume there are many others where
> that would be true.)
> >
> > If we were, for some reason, having a technical conversation about how
> the /Science of Psychology/ should use technical language, then we /might/
> all come to agree that isn't the best way to talk about it.
> >
> > In any case, the risk with "have" is that it reifies whatever we are
> talking about. To talk about someone /having/ sadness leads naturally ---
> linguistically naturally --- in English --- to thinking that sadness is /a
> thing/ that I could find if I looked hard enough. That is why people used to
> think (and many, many, still do) that if we just looked hard enough at
> someone's brain, we would find /the sadness/ inside there, somewhere. And
> that is why it is dangerous in a technical conversation regarding
> psychology: the implication is wrong-headed in a way that repeatedly leads
> large swaths of the field down deep rabbit holes that they can't seem to
> get out of.
> >
> > On the one hand, I /have/ a large ice mocha waiting for me in the
> fridge. On the other hand, this past summer I /had/ a two-week-long trip to
> California. One is a straightforward object; the other was an extended
> activity I engaged in. When the robot-designers assert that their robot
> "has" emotions, which do they mean? Honestly, I think they don't mean
> either one; it is a marketing tool, and not part of a conversation at all.
> As such, it doesn't really fit into the dichotomy above, and is trying to
> play one off of the other. They are using the terms "emotions and
> instincts" to mean something even less than whatever Tesla means when they
> say they have an autodrive that for sure still isn't good enough to
> autodrive.
> >
> > What the robot-makers mean is simply to indicate that the robot will be
> a bit more responsive to certain things than other models on the market,
> and /hopefully/ that's what most consumers understand it to mean. But not
> all will... at least some of the people being exposed to the marketing will
> take it to mean that emotion has been successfully put somewhere inside the
> robot. (The latter is a straightforward empirical claim, and if you think
> I'm wrong about that, you have way too much faith in how savvy 100% of
> people are.) As such, the marketing should be annoying to anti-dualist
> psychologists, who see it buttressing /at least some/ people's tendency to
> jump down that rabbit hole mentioned above.
>
> --
> ☤>$ uǝlƃ
>
> - .... . -..-. . -. -.. -..-. .. ... -..-. .... . .-. .
> FRIAM Applied Complexity Group listserv
> Zoom Fridays 9:30a-12p Mtn GMT-6  bit.ly/virtualfriam
> un/subscribe http://redfish.com/mailman/listinfo/friam_redfish.com
> FRIAM-COMIC http://friam-comic.blogspot.com/
> archives: http://friam.471366.n2.nabble.com/
>

