A bit OT, but I met John Brunner once, years ago (I used to hang out with some of the SF crowd...)
J^n

On Tuesday, June 10, 2025 at 3:23:48 PM UTC+1 Edward K. Ream wrote:

> Here <https://gemini.google.com/gem/ba2fb3843c39/1f68d964dc8b42ea> is my first real conversation with an AI. I am astounded by its capabilities.
>
> Note how gemini gently corrects some poorly-worded questions. It also ignores without comment my saying "information" instead of the intended "misinformation". I doubt any human could answer my questions as cogently and completely.
>
> *An important trick*
>
> I used a hidden technique in this conversation: asking "What would it take to accomplish an objective?" This technique comes from the Hugo-Award-Winning SciFi novel Stand on Zanzibar <https://en.wikipedia.org/wiki/Stand_on_Zanzibar>.
>
> I read this novel decades ago. For me, the pivotal moment comes when the protagonist "unfreezes" Shalmanizar, an almost all-powerful supercomputer. As I vaguely remember, the computer starts rejecting inputs about Beninia, a (fictional?) region in Africa. The solution starts with this dialog:
>
> QQQ
> Evaluate this, then: Postulate that the data given you about Beninia are true.
> Cue: what would be necessary to reconcile them with everything else you know?
> QQQ
>
> After lengthy computation, Shalmanizar replies that it needs to accept the possibility of an unknown factor influencing the Beninians' actions.
>
> The protagonist then instructs the computer to accept this unknown factor as fact. "I tell you three times" :-)
>
> *Summary*
>
> You will see this (tactical? strategic?) trick in various places in my dialog. The most important use of this trick is this question:
>
> What would it take to convince Pushmeet Kohli to use Gemini to improve public policy?
>
> I then followed up the answer with additional questions, the first being:
>
> "So, would combating misinformation have (in your words) "Clear and Measurable Impact on Complex Societal Challenges"?"
>
> I like this approach. It lets the AI do the arguing for me. What do you think?
>
> Edward
>
> P.S. I asked gemini to "polish" this letter. I despise the results. What you are hearing in this email is *my* voice, not some way-too-suave imitation.
>
> EKR