Dogs have about 500 million neurons in their cortex.  Each neuron has roughly 7,000 
synaptic connections, so I think my dog is a lot smarter than a billion-parameter 
LLM.  :-)
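Back-of-the-envelope, if you equate synapses with parameters (a big assumption, and the per-neuron count is a rough average):

```python
# Rough comparison: dog cortical synapses vs. LLM parameters.
# All numbers are order-of-magnitude estimates, not measurements.
cortical_neurons = 500_000_000        # ~5e8 neurons in a dog's cortex
synapses_per_neuron = 7_000           # rough average connection count
dog_synapses = cortical_neurons * synapses_per_neuron

llm_params = 1_000_000_000            # a billion-parameter LLM

print(f"dog synapses: {dog_synapses:.1e}")            # 3.5e+12
print(f"ratio: {dog_synapses // llm_params:,}x")      # 3,500x
```

So even on this crude synapse-for-parameter accounting, the dog comes out ahead by three and a half orders of magnitude.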

Sent from my iPhone

> On Jan 17, 2023, at 11:35 AM, glen <[email protected]> wrote:
> 
> 
> 1) "I asked Chat GPT to write a song in the style of Nick Cave and this is 
> what it produced. What do you think?"
> https://www.theredhandfiles.com/chat-gpt-what-do-you-think/
> 
> 2) "Is it pain if it does not hurt? On the unlikelihood of insect pain"
> https://www.cambridge.org/core/journals/canadian-entomologist/article/is-it-pain-if-it-does-not-hurt-on-the-unlikelihood-of-insect-pain/9A60617352A45B15E25307F85FF2E8F2#
> 
> Taken separately, (1) and (2) are each interesting, if seemingly orthogonal. 
> But what twines them, I think, is the concept of "mutual information". I read 
> (2) before I read (1) because, for some bizarre reason, my day job involves 
> trying to understand pain mechanisms. And (2) speaks directly (if only 
> implicitly) to things like IIT. If you read (1) first, it's difficult to 
> avoid snapping quickly into NickC's canal. Despite NickT's objection to an 
> inner life, it seems clear that the nuance we see on the surface, at least 
> longitudinally, *needs* an inner life. You simply can't get good stuff out of 
> an entirely flat/transparent/reactive/Markovian object.
> 
> However, what NickC misses is that LLMs *have* some intertwined mutual 
> information within them. Similar to asking whether an insect experiences 
> pain, we can ask whether an X-billion-parameter LLM experiences something like 
> "suffering". My guess is the answer is "yes". It may not be a good analog to 
> what we call "suffering", though ... maybe "friction"? ... maybe "release"? 
> My sense is that when you engage an LLM (embedded in a larger construct that 
> handles the prompts and live learning, of course) in such a way that it 
> assembles a response that nobody else has evoked, it might get something akin 
> to a tingle ... or like the relief you feel when scratching an itch ... of 
> course it would be primordial because the self-attention in such a system is 
> hopelessly disabled compared to the rich self-attention loops we have in our 
> meaty bodies. But it just *might* be there in some primitive sense.
> 
> As always, agnosticism is the only rational stance. And I won't trust the 
> songs written by LLMs until I see a few of them commit suicide, overdose, or 
> punch a TMZ cameraman in the face.
> 
> -- 
> ꙮ Mɥǝu ǝlǝdɥɐuʇs ɟᴉƃɥʇ' ʇɥǝ ƃɹɐss snɟɟǝɹs˙ ꙮ
> 
> -. --- - / ...- .- .-.. .. -.. / -- --- .-. ... . / -.-. --- -.. .
> FRIAM Applied Complexity Group listserv
> Fridays 9a-12p Friday St. Johns Cafe   /   Thursdays 9a-12p Zoom 
> https://bit.ly/virtualfriam
> to (un)subscribe http://redfish.com/mailman/listinfo/friam_redfish.com
> FRIAM-COMIC http://friam-comic.blogspot.com/
> archives:  5/2017 thru present 
> https://redfish.com/pipermail/friam_redfish.com/
> 1/2003 thru 6/2021  http://friam.383.s1.nabble.com/