<html>
  <head>

    <meta http-equiv="content-type" content="text/html; charset=UTF-8">
  </head>
  <body>
    <p>This popular-press article came through my Google News feed
      recently; I thought it might be useful to the
      Journalists/English-Majors on the list to help understand how
      LLMs work, etc. When I read it in detail (forwarded from my TS
      (TinyScreenPhone) to my LS (Large Screen Laptop)) I found it a
      bit more detailed and technical than I'd expected, but
      nevertheless rewarding, and possibly offering some traction to
      Journalism/English majors as well as to those with a larger
      investment in the CS/Math implied.<br>
    </p>
    <blockquote>
      <p><a href="https://www.anthropic.com/index/decomposing-language-models-into-understandable-components">Decomposing
          Language Models into Understandable Components<br>
        </a></p>
      <blockquote><img
src="https://efficient-manatee.transforms.svdcdn.com/production/images/Untitled-Artwork-11.png?w=2880&h=1620&auto=compress%2Cformat&fit=crop&dm=1696477668&s=d32264d5f5e32c79026b8e310e415c74"
          alt="" width="237" height="133"></blockquote>
    </blockquote>
    <p>and the (more) technical paper behind the article:</p>
    <blockquote>
      <p><a href="https://transformer-circuits.pub/2023/monosemantic-features/index.html">https://transformer-circuits.pub/2023/monosemantic-features/index.html<br>
        </a></p>
    </blockquote>
    <p>Despite having sent a few dogs into vaguely similar scuffles in
      my careen(r):<br>
    </p>
    <blockquote><a href="https://apps.dtic.mil/sti/tr/pdf/ADA588086.pdf">Faceted
        Ontologies for Pre-Incident Indicator Analysis</a><br>
      <a href="https://www.ehu.eus/ccwintco/uploads/c/c6/HAIS2010_925.pdf">SpindleViz</a><br>
      ...<br>
    </blockquote>
    <p>... I admit to finding this both intriguing and well over my
      head on casual inspection... the (metaphorical?) keywords that
      drew me in most strongly included <i>Superposition</i> and <i>Thought
        Vectors</i>, though they are (nod to Glen) probably riddled
      (heaped, overflowing, bursting, bloated ...) with excess
      meaning.<br>
    </p>
    <p><a href="https://gabgoh.github.io/ThoughtVectors/">https://gabgoh.github.io/ThoughtVectors/</a></p>
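    <p>For the curious, here is a minimal toy sketch of the
      superposition idea (my own illustration in Python/NumPy, not code
      from the paper): a vector space can pack in more sparse features
      than it has dimensions, at the cost of a little interference
      between them.</p>
    <pre>
# Toy sketch of "superposition" (my own illustration, not the paper's code):
# pack more sparse features than dimensions into one vector space.
import numpy as np

rng = np.random.default_rng(0)
n_features, n_dims = 200, 50        # four times more features than dimensions

# Random unit vectors as feature directions: nearly orthogonal on average.
W = rng.normal(size=(n_features, n_dims))
W /= np.linalg.norm(W, axis=1, keepdims=True)

# A sparse activation pattern: only two features "on" at once.
x = np.zeros(n_features)
x[3], x[11] = 1.0, 0.9

v = x @ W                           # the superposed 50-dim representation

# Dot-product readout recovers the active features, plus small
# interference noise contributed by all the inactive ones.
readout = W @ v
print(np.argsort(readout)[-2:])     # the two strongest: features 11 and 3
    </pre>
    <p>The paper's move, as I read it, is roughly the inverse: train a
      sparse autoencoder to un-mix the superposed activations back into
      individually interpretable features.</p>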
    <p>This leads me (surprise!) to an open-ended, discursive series of
      thoughts probably better left for a separate posting (perhaps
      rendered in a semasiographic language like <a
        href="https://en.wikipedia.org/wiki/Heptapod_languages#Orthography">Heptapod
        B</a>).<br>
    </p>
    <p>&lt;must... stop... now...&gt;</p>
    <p>- Steve<br>
    </p>
  </body>
</html>