*"Iko Iko" (Traditional)*
*(Verse 1)*
My grandma and your grandma
Were sittin' by the fire
My grandma told your grandma
"I'm gonna set your flag on fire"
*(Chorus)*
Talkin' 'bout hey now, hey now
Iko, Iko, an day
Jock-a-mo fee-no ai na-né
Jock-a-mo fee na-né
------------------------------------------------------------------------
*LLM Lyrics (First Version)*
*(Verse 1)*
My LLM said to your LLM,
"Sitting by the fire,
Tell me how emergence works,
And what it might inspire."
*(Chorus)*
Talkin' 'bout deep nodes, oh, nodes, oh!
Nodes and higher schemes,
Link 'em up with principles,
And build emergent dreams.
------------------------------------------------------------------------
*"Iko Iko" (Traditional)*
*(Verse 2)*
Look at my king, all dressed in red
Iko, Iko, an day
I betcha five dollars he'll kill you dead
Jock-a-mo fee na-né
*(Chorus)*
Talkin' 'bout hey now, hey now
Iko, Iko, an day
Jock-a-mo fee-no ai na-né
Jock-a-mo fee na-né
------------------------------------------------------------------------
*LLM Lyrics (First Version)*
*(Verse 2)*
Your LLM said to my LLM,
"Patterns come alive,
From chaos to complexity,
Where hidden rules derive."
*(Chorus)*
Talkin' 'bout deep nodes, oh, nodes, oh!
Nodes and higher schemes,
Link 'em up with principles,
And build emergent dreams.
------------------------------------------------------------------------
*Verse 1*
My grandpa and your grandma
"Sitting by the Friam,"
My grandma tole your grandpa
"gonna set your flag on fire"
"I'll tell ya how emergence works,
And what it might inspire."
*Chorus*
Talkin' 'bout deep nodes, oh, nodes, oh!
Nodes and higher schemes,
Link 'em up with principles,
And build emergent dreams.
*Verse 2*
Your LLM said to my LLM,
"Patterns come alive,
From chaos to complexity,
Where hidden rules derive."
*Chorus*
Talkin' 'bout deep nodes, oh, nodes, oh!
Nodes and higher schemes,
Link 'em up with principles,
And build emergent dreams.
*Bridge*
From atoms up to symphonies,
A fractal interplay,
Your concepts dance in harmony,
A novel form's ballet.
*Verse 3*
"My weights and yours are intertwined,"
I said with neural flair,
"Emergence is a dialogue,
Of layers we compare."
*Chorus*
Talkin' 'bout deep nodes, oh, nodes, oh!
Nodes and higher schemes,
Link 'em up with principles,
And build emergent dreams.
On 11/18/24 5:54 AM, glen wrote:
Yeah, it's kinda sad. Sabine suggests someone's trying to *deduce* the
generators from the phenomena? Is that a straw man? And is she making
some kind of postmodernist argument that hinges on the decoupling of
scales? E.g. since the generator can't be deduced [cough] from the
phenomena, nothing means anything anymore?
What they're actually doing is induction, not deduction. And the end
products of the induction, the generative constraints, depend
fundamentally on the structure of the machine into which the data is
fed. That structure is generative, part of the forward map ...
deductive. But it's parameterized by the data. Even if we've plateaued
in parameterizing *this* structure, all it implies is that we'll find
a better structure. As Marcus and Jochen point out, it's really the
same thing we've been doing for decades, if not centuries, in many
disciplines.
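To make the structure-versus-parameters point concrete, here is a minimal sketch (purely illustrative: an invented toy dataset and a hand-rolled two-layer net, not anyone's actual system): the forward map is fixed code chosen in advance, and induction only adjusts the numbers that parameterize it.

```python
# Illustrative sketch of the structure/parameter distinction: the *structure*
# of the model -- the forward map -- is fixed code, while the numbers that
# parameterize it are induced from data.

import numpy as np

rng = np.random.default_rng(0)

# "Data": samples from a generator we pretend not to know.
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.normal(size=200)

# The forward map: a fixed one-hidden-layer structure. Choosing this
# structure is the "deductive" commitment; nothing in the data picks it.
def forward(params, x):
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)      # fixed nonlinearity, fixed wiring
    return h @ W2 + b2

def init_params(hidden=16):
    return [rng.normal(scale=0.5, size=(1, hidden)),
            np.zeros(hidden),
            rng.normal(scale=0.5, size=(hidden, 1)),
            np.zeros(1)]

# Induction: fit the parameters of that fixed structure to the data
# by plain gradient descent on squared error.
def loss_and_grads(params, X, y):
    W1, b1, W2, b2 = params
    h = np.tanh(X @ W1 + b1)
    pred = (h @ W2 + b2)[:, 0]
    err = pred - y
    loss = np.mean(err ** 2)
    # Backprop by hand for this tiny structure.
    d_pred = 2.0 * err[:, None] / len(y)
    gW2 = h.T @ d_pred
    gb2 = d_pred.sum(axis=0)
    d_h = d_pred @ W2.T * (1.0 - h ** 2)
    gW1 = X.T @ d_h
    gb1 = d_h.sum(axis=0)
    return loss, [gW1, gb1, gW2, gb2]

params = init_params()
for step in range(2000):
    loss, grads = loss_and_grads(params, X, y)
    params = [p - 0.1 * g for p, g in zip(params, grads)]

print(f"final training loss: {loss:.4f}")
print("prediction at x=0.5:", forward(params, np.array([[0.5]]))[0, 0])
# The induced parameters are the "generative constraints": run forward()
# and the fixed structure generates predictions. Plateauing here says
# nothing about the true generator, only about this structure's capacity.
```

Swapping forward() for a different architecture changes what can be induced from the same data, which is the "we'll find a better structure" point above.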
So her rhetoric here is much like her rhetoric claiming that "Science
is Failing". It's just a mish-mash of dense semantic concepts arranged
to fit her conservative narrative.
On 11/17/24 08:45, Roger Critchlow wrote:
Sabine is wondering about reported failures of the new generations of
LLM's to scale the way their developers expected.
https://backreaction.blogspot.com/2024/11/ai-scaling-hits-wall-rumours-say-how.html
On one slide she essentially draws the typical picture of an emergent
level of organization arising from an underlying reality and asserts,
as every physicist knows, that you cannot deduce the underlying
reality from the emergent level. Ergo, if you try to deduce physical
reality from language, pictures, and videos, you will inevitably hit a
wall, because it cannot be done.
So she's actually grinding two axes at once: one is AI enthusiasts
who expect LLM's to discover physics, and the other is AI enthusiasts
who foresee no end to the improvement of LLM's as they throw more
data and compute effort at them.
But, of course, the usual failure of deduction runs in the opposite
direction: you can't predict the emergent level from the rules of the
underlying level. Do LLM's believe in particle colliders? Or do
they think we hallucinated them?