<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
<p>I asked Alexa, "What is GPT-3?"</p>
<p>She quoted from the beginning of the Wikipedia article:</p>
<blockquote>
<p style="margin: 0.5em 0px; color: rgb(32, 33, 34); font-family:
sans-serif; font-size: 14px; font-variant-ligatures: normal;
font-variant-caps: normal; font-weight: 400; letter-spacing:
normal; text-align: start; text-indent: 0px; text-transform:
none; white-space: normal; word-spacing: 0px;
-webkit-text-stroke-width: 0px; background-color: rgb(255, 255,
255); text-decoration-thickness: initial; text-decoration-style:
initial; text-decoration-color: initial;"><i><b>Generative
Pre-trained Transformer 3</b></i><i><span> </span></i><i>(</i><i><b>GPT-3</b></i><i>;
stylized</i><i><span> </span></i><i><b>GPT·3</b></i><i>) is an</i><i><span> </span></i><i><a
href="https://en.wikipedia.org/wiki/Autoregressive_model"
title="Autoregressive model" style="text-decoration: none;
color: rgb(6, 69, 173); background: none;">autoregressive</a></i><i><span> </span></i><i><a
href="https://en.wikipedia.org/wiki/Language_model"
title="Language model" style="text-decoration: none; color:
rgb(6, 69, 173); background: none;">language model</a></i><i><span> </span></i><i>that
uses</i><i><span> </span></i><i><a
href="https://en.wikipedia.org/wiki/Deep_learning"
title="Deep learning" style="text-decoration: none; color:
rgb(6, 69, 173); background: none;">deep learning</a></i><i><span> </span></i><i>to
produce human-like text.</i></p>
<p style="margin: 0.5em 0px; color: rgb(32, 33, 34); font-family:
sans-serif; font-size: 14px; font-style: normal;
font-variant-ligatures: normal; font-variant-caps: normal;
font-weight: 400; letter-spacing: normal; orphans: 2;
text-align: start; text-indent: 0px; text-transform: none;
white-space: normal; widows: 2; word-spacing: 0px;
-webkit-text-stroke-width: 0px; background-color: rgb(255, 255,
255); text-decoration-thickness: initial; text-decoration-style:
initial; text-decoration-color: initial;"><i>The architecture is
a standard</i><i><span> </span></i><i><a
href="https://en.wikipedia.org/wiki/Transformer_(machine_learning_model)"
title="Transformer (machine learning model)"
style="text-decoration: none; color: rgb(6, 69, 173);
background: none;">transformer network</a></i><i><span> </span></i><i>(with
a few engineering tweaks) with the unprecedented size of
2048-token-long context and 175 billion parameters (requiring
800 GB of storage). The training method is "generative
pretraining", meaning that it is trained to predict what the
next token is.</i><span><i> </i><br>
</span></p>
</blockquote>
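<p>An aside on the "generative pretraining" mentioned above: the entire
training objective is just next-token prediction. A minimal sketch of that
loss in PyTorch, with a toy stand-in for the transformer stack (the sizes
here are illustrative, nothing like GPT-3's actual network):</p>
<pre>
import torch
import torch.nn as nn

vocab_size, d_model = 50257, 64   # GPT-3's vocabulary size; toy model width

# Stand-in for the transformer: embed tokens, map back to vocabulary logits.
model = nn.Sequential(
    nn.Embedding(vocab_size, d_model),
    nn.Linear(d_model, vocab_size),
)

tokens = torch.randint(0, vocab_size, (1, 16))  # a toy token sequence
logits = model(tokens[:, :-1])                  # predict from each prefix...
loss = nn.functional.cross_entropy(             # ...scored against the next token
    logits.reshape(-1, vocab_size), tokens[:, 1:].reshape(-1))
loss.backward()  # pretraining just minimizes this loss over a huge corpus
</pre>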
<p style="margin: 0.5em 0px; color: rgb(32, 33, 34); font-family:
sans-serif; font-size: 14px; font-style: normal;
font-variant-ligatures: normal; font-variant-caps: normal;
font-weight: 400; letter-spacing: normal; orphans: 2; text-align:
start; text-indent: 0px; text-transform: none; white-space:
normal; widows: 2; word-spacing: 0px; -webkit-text-stroke-width:
0px; background-color: rgb(255, 255, 255);
text-decoration-thickness: initial; text-decoration-style:
initial; text-decoration-color: initial;"><span>That wasn't very
inspired, yet was probably as useful as anything (to defer to a
presumed authority)... what I can't guess yet is how it decided
to snip the quote one sentence short of the first full
paragraph? Maybe Wikipedia has some markup for that...</span></p>
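<p>One hedged guess at the snipping: Wikimedia's public REST API exposes a
per-article "summary" endpoint whose plain-text extract is a trimmed version
of the lead section, and an assistant quoting that field would naturally stop
short of the full paragraph. Whatever Alexa actually calls isn't public, but
the endpoint itself is easy to poke at:</p>
<pre>
import requests

# Wikimedia REST API: a machine-readable summary of an article's lead.
url = "https://en.wikipedia.org/api/rest_v1/page/summary/GPT-3"
resp = requests.get(url, headers={"User-Agent": "friam-example/0.1"})
print(resp.json()["extract"])  # plain-text extract of the lead section
</pre>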
<p style="margin: 0.5em 0px; color: rgb(32, 33, 34); font-family:
sans-serif; font-size: 14px; font-style: normal;
font-variant-ligatures: normal; font-variant-caps: normal;
font-weight: 400; letter-spacing: normal; orphans: 2; text-align:
start; text-indent: 0px; text-transform: none; white-space:
normal; widows: 2; word-spacing: 0px; -webkit-text-stroke-width:
0px; background-color: rgb(255, 255, 255);
text-decoration-thickness: initial; text-decoration-style:
initial; text-decoration-color: initial;"><span>Alexa, when mashed
up with it's myriad "authoritative sources" is a very effective
*hybrid* intelligent assistant... I'm impressed with how
*balanced* the GPT-3 answers can be, even if they can also be a
little milquetoasty as George Duncan's example exemplifies.</span></p>
<p style="margin: 0.5em 0px; color: rgb(32, 33, 34); font-family:
sans-serif; font-size: 14px; font-style: normal;
font-variant-ligatures: normal; font-variant-caps: normal;
font-weight: 400; letter-spacing: normal; orphans: 2; text-align:
start; text-indent: 0px; text-transform: none; white-space:
normal; widows: 2; word-spacing: 0px; -webkit-text-stroke-width:
0px; background-color: rgb(255, 255, 255);
text-decoration-thickness: initial; text-decoration-style:
initial; text-decoration-color: initial;"><span><br>
</span></p>
<div class="moz-cite-prefix">On 9/14/22 2:49 PM, Jochen Fromm wrote:<br>
</div>
<blockquote type="cite"
cite="mid:202209142050.28EKo60a079046@ame3.swcp.com">
<div dir="auto">I have asked GPT-3 at the OpenAI playground about
it:</div>
<div dir="auto"><br>
</div>
<div dir="auto">Q: What is God?<br>
GPT-3: There is no one answer to this question since there are
so many different religious beliefs around the world. From a
scientific standpoint, however, we can say that God is an unseen
force or concept that governs or oversees everything in the
universe.<br>
<br>
Q: Does God exist? <br>
GPT-3: There is no conclusive evidence either way on whether or
not God exists. Many people believe in a higher power, while
others don't. There is no right or wrong answer to this
question, it is a matter of personal belief.<br>
<br>
Q: Is there a life after death?<br>
GPT-3: There is no one answer to this question since it is a
matter of religious belief. Some people believe in
reincarnation, while others believe in an afterlife in which we
are reunited with our loved ones. Ultimately, it is up to each
individual to decide what they believe.<br>
</div>
<div dir="auto"><br>
</div>
<div dir="auto"><br>
</div>
<div dir="auto">It is not bad for a language model, isn't it? On
the other hand it costs millions of Dollars to build it and
consumes Terabyte of data. Try it yourself at</div>
<div dir="auto"><a class="moz-txt-link-freetext" href="https://beta.openai.com/playground">https://beta.openai.com/playground</a></div>
<div dir="auto"><br>
</div>
<div dir="auto">-J.</div>
<div dir="auto"><br>
</div>
<div><br>
</div>
<div dir="auto" style="font-size:100%;color:#000000" align="left">
<div>-------- Original message --------</div>
<div>From: Jon Zingale <a class="moz-txt-link-rfc2396E" href="mailto:jonzingale@gmail.com"><jonzingale@gmail.com></a> </div>
<div>Date: 9/14/22 9:11 PM (GMT+01:00) </div>
<div>To: <a class="moz-txt-link-abbreviated" href="mailto:friam@redfish.com">friam@redfish.com</a> </div>
<div>Subject: [FRIAM] God is not a Good Theory </div>
<div><br>
</div>
</div>
<div dir="ltr">
<div
style="font-family:verdana,sans-serif;font-size:small;color:#333333"
class="gmail_default"><a
href="https://www.youtube.com/watch?v=ew_cNONhhKI&ab_channel=PhilosophyCosmology"
moz-do-not-send="true">https://www.youtube.com/watch?v=ew_cNONhhKI&ab_channel=PhilosophyCosmology</a><br>
</div>
</div>
<br>
</blockquote>
</body>
</html>