Glen - Great riff of introspection. I don't know that my understanding of how well we all communicate, or where it breaks down, aligns with yours, but I do feel that mine has been significantly /informed by/ (apologies to Nick) what you have shared with us.
Your reflections on "success in business" and "pitching," which have strong parallels from within an institution and without (the sci/tech business perhaps being the larger institution), are very familiar to me. I functioned well enough in both worlds (pre- and post-institutionalization, as an entrepreneur bushwhacking at "business"), but I'd say it was acutely evident that I would never exceed or rise above most of the (other) wankers going at it in those contexts. Maybe it was my weak ambition, or my weak discipline, or maybe my weak imagination that limited me... because I'd watch others rise meteorically past me in various situations, and many (not all), objectively (from my weak POV), were pretty un(der)informed about the technical issues involved... though they were pretty buzzword compliant/enhanced.
Trump's appeal to the MAGA-masses (and adjacent?) seems to reinforce some kind of truism about how we collect/concentrate personal power? I've found that the most successful used-car-salesmen (literal and figurative) seem to excel because they will give positive responses to anything their "mark" might ask or assert. (Sadly, GPT and other LLM chatbot interfaces seem to apply this effectively as well?)
What you reference re: the Jobs/Musk guru-over-slave phenomenon is a slightly softer-power version of what all colonization is/has been about?
Crazy to think that one of my escape-fantasies from the avalanche trajectory we might be on is for AI to become a "benevolent colonizer" to us. Possibly many indigenous folks fell into that seduction as well, becoming early/obvious collaborators with those who would ultimately prove to be oppressors/enslavers/genociders?
- Steve

On 5/27/25 7:40 AM, glen wrote:
One of the things I often fail to fold into my plans or expectations is the extent to which others fail to understand what I'm thinking/saying. E.g. a one-time boss of mine was adamant that the responsibility for getting your message across to the audience lies with you. His adamantinous (?) position belied his naïve understanding/model of humans. It always felt like he didn't understand that humans are biological animals, firmly grounded in a soup of flora and fauna. It was almost religious ... like the way a priest might appeal to some ridiculous ideal, a lossy reduction to some ungrounded World 3.

It dovetailed perfectly with his desire to be one of the cool kids in the startup/VC/Angel world. I never understood that world (obviously, else I'd be rich). But it always seemed to me that it was very pitch-driven. You must be able to a) choose an audience, b) craft a slide deck, and c) deliver the pitch in such a way as to hook them. [⛧] I also did quite a bit of poking around in how to run a company. Running a company for a long period of time is a bushy, complicated thing. Tying the pitch that hooks the investors to the hairball of activity of a company is mysterious. The school teachers give lip service to things like business plans and whatnot. But I think it's mostly *luck*, both good and bad.

And the ones that luck out and learn to give pitches, get funding, and manage the bramble bush well enough for the Exit will be *biased* toward the belief that they are good at it, a false merit. And part of it is self-reinforcing. Due diligence softens and fuzzifies when being pitched by those who've lucked out in the past, engineering the world to canalize luck into "a good bet". Then there's also the guru factor. Musk and Jobs were great cult leaders, telling their slaves to reify some vision, then taking credit for that slave labor when the slaves make it happen (and equivocating when the slaves fail). Fake it till you make it.

[⛧] FWIW, I doubt the academy is all that different. Grantsmanship is a thing.

On 5/22/25 12:10 PM, steve smith wrote:

I believe that for both of them, money is merely a means or constraint to achieving much more abstract, idiosyncratic, hallucinatory, utopian visions. Musk's is clearly rooted in a-good-old-fashioned-sci-fi-future. Altman's openly expressed vision seems to be one of an (overly naive) incremental improvement of the "human condition" on an arc qualitatively similar to the one we've been on since the ramp-up of industry a century or two ago? They both seem to expect technological phase shifts but don't seem to understand that sociological/cultural/spiritual ones would seem to follow inevitably? They seem to only see Jetson-like visions of self-flying cars and robots? I see cyberpunkesque post/transhuman utopian/dystopian jackpots.

I don't trust either of them, mainly because of the outscale leverage they wield. Musk's current $$ wealth is 200x that of Altman, and his industry-dominance has a much broader base. If Altman approaches AGI (asymptotically?) more quickly, he might catch up in terms of /net effect in the world/, but not directly through financial wealth?

I was acutely embarrassed for Musk recently when I watched a clip of him talking about his Grok and Colossus, using his usual "schtick" around "first principles" and "physics based", and I'd swear he didn't understand (or mean) a thing he was saying? It was very buzzword compliant to his stories about Tesla and SpaceX. It sounded like hollow rhetoric aimed at Fox News, Donald Trump (and his allies), and high-school techbro wannabes. Nothing he said sounded the least bit grounded in anything truly technical?

The more I listen to Altman, the more *naive* he seems to me... he is much smoother than Musk, and his narcissism is possibly much less significant and much more well disguised.

On 5/22/25 11:42 AM, Prof David West wrote:

Unification. None implied. I agree that such would be unlikely.

To the extent that some people confidently express that "this is how we use language," "this is how scientists think," they are pretty much wrong to some significant extent. Reconciliation/unification would always be elusive or non-existent. You cite logicians, but my favorite example would be Whitehead's Process Philosophy contrasted with his earlier work with Russell.

Even if possible, I don't immediately see the value in "unification."

Cynical aside to Pieter: I think Altman's only vision is a personal net worth greater than his old partner Elon.
