Oops, that is Unrestricted Warfare, NOT Unconventional, and it is on
Amazon.com

On Thu, Dec 31, 2020 at 10:22 AM Nanograte Knowledge Technologies <
nano...@live.com> wrote:

> Nothing wrong with a contemporary perspective either.
>
> Point being, it does not matter whether or not anyone thinks the 'CCP as
> AGI' archetype is a bad idea, or a very bad idea.
>
> I think what matters is understanding what the CCP most closely
> resembles, in terms of AGI. Here I'm assuming Ray Kurzweil's singularity to
> be very near indeed, for China today, but the world tomorrow.
>
> The COVID justification and the hack of the century combined have probably
> provided a mighty push towards "new" pervasive technologies. And if I were a
> betting man, I'd wager that more resistance would see a 3rd wave of
> "conviction". Conspiracy? Nope. Strategy? Yes.
>
> That's the point of contemporary argument. It's in one's face and
> infinitely more measurable than war games might be.
>
> I've been looking for that book for close on 15 years. Did you purchase it
> off Amazon?
>
> ------------------------------
> *From:* Steve Richfield <steve.richfi...@gmail.com>
> *Sent:* Thursday, 31 December 2020 17:45
>
> *To:* AGI <agi@agi.topicbox.com>
> *Subject:* Re: [agi] CCP as a model for AGI
>
> I can see that some people here are talking past each other, so to clarify
> some important points...
> 1. Go read the book *Unconventional Warfare*. It could best be described
> as instructions on how to be a REALLY bad AGI with NO consideration of
> potential social blowback. COVID-19 perfectly fits into this book's
> discussions. This book's authors are now high-level PRC officers.
> 2. There are various types of intelligence. On one project, I was very
> intentionally paired with the very best-trained computer expert
> from Taiwan, to add my creativity to her training. She knew everything
> about the history of computers, but she was unable to synthesize anything
> but the most trivial of applications. Creativity is VERY different from
> intelligence, and from what I have seen, Eastern education kills creativity.
> 3. I advanced this thread to challenge the idea that AGIs wouldn't
> necessarily turn out to be BAD. Indeed, a government attempting to function
> like an AGI, as the CCP appears to be trying to do, seems to be a
> reasonable test. From what I have seen, the CCP is literally living proof
> that AGI is a REALLY bad idea.
> 4. I have seen a sort of uniformly worn set of blinders in this group. I have
> repeatedly suggested that we hold a reverse Turing competition (where
> groups pretend to be AGIs) to see where limitless intelligence might lead,
> but so far NO ONE has shown any interest. I expect such a competition would
> produce some eye-opening results and would be a LOT of fun, as groups
> compete to save the world, take over the world, etc.
>
> This having been said, please continue your conversation. James basically
> appeared to grok what I was saying, but everyone else appeared to be
> picking at unrelated (at least to me) details.
>
> *Steve Richfield.*
>
> On Thu, Dec 31, 2020 at 6:14 AM James Bowery <jabow...@gmail.com> wrote:
>
> Ben, I really hate it when people interject "go read this book" into a
> conversation but you're a voracious enough reader that I hope you'll
> forgive me when I request that you read E. O. Wilson's "The Social Conquest
> of Earth" to get a handle on where I'm coming from in my use of the word
> "agency" in the context of incorporations like "nation states" or, more
> accurately, "cultures".  But in the likely event that you won't do so, here
> is the tl;dr:
>
> Since at least the CHLCA (the chimpanzee-human last common ancestor), our
> primate line has been utilizing its higher cognitive capacity in ways that
> promise/threaten to cross the abyss from individual selection to eusocial
> selection.  Human civilization now stands at the precipice of full-blown
> eusocial organization, and that is why it is wiping out biodiversity:
> Eusocial species tend to dominate their ecologies, and unless given time to
> coevolve, as with eusocial insects and the one fully eusocial mammalian
> species, the naked mole rat, biodiversity collapses.  Human eusociality is
> characterized by explosive change wrought by ideational (technological)
> evolution, so the biosphere is threatened like never before.  That's not my
> assertion; that's Wilson's whole career summed up.
>
> Certainly you are correct that human "societies" have nowhere near the
> group integrity that fully eusocial organisms possess.  But be _very_
> careful here:  Eusocial organisms (which merely _seem_ to be multiple
> "individuals" with their own "agency" but are, in fact, single organisms we
> call "hives" or "colonies", etc.) do possess agency expressing the genetic
> interest of the reproductive caste.  The sterile worker caste does _not_
> possess agency.  Reproductive specialization -- the sine qua non of
> eusociety -- is already apparent in the West in the form of the most
> intelligent sacrificing the reproductive years of its most economically
> valuable females on the altar of what is properly characterized as "Mammon
> Worship".  This ruthless destruction is characteristic of all human
> civilizations at some stage as they begin to collapse, but the older the
> culture the more likely it is to have learned to mitigate the damage done
> by this stage.  I suspect this is at the root of why the Chinese and the
> Jews are more intelligent:  A long collective memory of the damage done by
> civilizational cycles.
>
> This destructive tendency can be enhanced by an adversarial culture and
> clearly is being enhanced in the West -- transhumanism's
> two-birds-in-the-bush notwithstanding.  In this respect transhumanism
> strikes me as a classic con.
>
> On Thu, Dec 31, 2020 at 3:22 AM Ben Goertzel <b...@goertzel.org> wrote:
>
>
> I don't think it's very useful to model complex systems like major nations
> as one-dimensional utility-maximizers.    Asking "whose utility function"
> about a complex system of that nature -- which has a large number of
> shiftingly-weighted, imprecisely-and-shiftingly-defined "objectives" and
> also largely self-organizes in a non-goal-directed way -- is probably the
> wrong framing....  But asking who will exert a more major influence (e.g.
> the West versus China, or corporate shareholders vs. the scientific
> community) certainly has meaning....
>
> And I don't currently see evidence that China will exert more influence on
> AGI than the West.   Things could evolve that way.  But I note there is not
> yet a Chinese analogue of DeepMind or OpenAI, let alone say OpenCog or
> SingularityNET or whatever.    OpenNARS was founded by Pei Wang, who is
> originally from mainland China, but the project is centered in the West, etc.
>
> I truly don't understand why folks believe the Chinese gov't is going to
> be able to assimilate the US to its goals and thus achieve a dominant role
> in shaping AGI ....  China does have a larger population than the US and
> has an extraordinary capability for mass-manufacture of electronics, and
> plenty of other interesting advantages, but the AGI advantage seems clearly
> to lie with the US/UK ...
>
> I'd like to understand if there are better arguments though...
>
> ben
>
> On Wed, Dec 30, 2020 at 8:58 AM James Bowery <jabow...@gmail.com> wrote:
>
>
>
> On Wed, Dec 30, 2020 at 12:17 PM Ben Goertzel <b...@goertzel.org> wrote:
>
>
> Regarding the CCP as a general intelligence(*), I would say all societies
> and large corporations can be viewed that way, but I don't see evidence
> that the corporate-government complex of China is more generally
> intelligent than the corporate-government complexes of US or Western
> Europe.   What is the evidence or argument in that regard?...
>
>
> If the CCP is more capable of assimilating ("Turking") the US to the CCP's
> utility function than vice versa, then any claims as to the US being "more
> generally intelligent" become superfluous.  That's what I meant when I said:
>
>  > The CCP-as-AGI is more capable of "Turking" the US-as-AGI than is the
> US-as-AGI of "Turking" the CCP-as-AGI.
>
>
> (*) to me calling a country or corporation an "AGI" feels needlessly
> confusing, since these are systems largely composed of humans, and not
> engineered from human parts but evolved from human social interactions.
>  But whatever, I understand what is meant.
>
>
> The Future of Humanity Institute <https://www.fhi.ox.ac.uk/> is an
> exemplar for why the question of "Whose utility function?" cannot be swept
> under the rug with regards to "systems largely composed of humans...evolved
> from human social interactions".  Indeed "artificial" means humans had
> agency in the creation of the artifact.  The concern of "Friendly
> Artificial General Intelligence", hence "The Future of Humanity", is all
> about the proper application of that agency in selecting the utility
> function of said artifact.  What future is there for "humanity" under the
> wrong utility function of _any_ notion of AGI?
>
>
>
> On Wed, Dec 30, 2020 at 6:54 AM James Bowery <jabow...@gmail.com> wrote:
>
> As with "AI debates" in general, people can easily talk past each other by
> failing to acknowledge they are addressing different questions.  Ben
> Goertzel is addressing China's in/ability to create an "AGI" in the sense
> of Legg, Hutter, et al.  Steve Richfield is positing the CCP _is_ an "AGI"
> in a more vague sense that might, if "black boxed", also fit with "AGI" in
> the sense of Legg, Hutter, et al.  Now, it may certainly be argued that
> _if_ Steve is right, _then_ it is capable of _creating_ an AGI:  "The
> Singularity" occurs when some AI achieves the ability to create a more
> intelligent AI, and this threshold of "AI" is the most general notion of
> AGI.
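>
> (For concreteness, here is a rough sketch of the Legg-Hutter formalization,
> from memory, so treat the notation as an approximation: an agent \pi is
> scored by its universal intelligence
>
>   \Upsilon(\pi) \;=\; \sum_{\mu \in E} 2^{-K(\mu)} \, V_\mu^\pi ,
>
> i.e. its expected reward V_\mu^\pi in each computable environment \mu of
> the class E, weighted by 2^{-K(\mu)} with K the Kolmogorov complexity, so
> that simpler environments count the most.  "AGI in the sense of Legg,
> Hutter" means scoring well under something like this measure.)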
>
> My approach, respecting Steve's original question, is from a position that
> what we call "The Global Economy" _is_ an AGI that is already operating
> with an "unfriendly" utility function, seeing individual human beings as
> raw materials in its environment to refine into "Mechanical Turks".  The
> only extent to which human quality of life, or even the quality of the
> biosphere, is relevant to this AGI is the extent to which they can provide
> resources to replicate its incorporations (corporations/NGOs, governments,
> etc.), which wield hive-like power over, and ultimately disintermediate,
> life in seeking access to energy and matter.  The CCP is merely among the more
> conspicuous cases of evolution toward such an incipient AGI hive
> incorporation.
>
> Now, having clarified the question I am addressing (Steve's in the OP):
>
> Hive specialization in eusocial species recapitulates, in a less effective
> way, the clone-army specialization seen in sexual organism stem-cell
> differentiation (modulating SC clone gene expression) into various organs
> of the organism.  The brain is an organ. The CCP constructs its "brain" not
> so much by altering gene expression of clones but by utilizing its long
> history of civil service examination to mine the population for "neurons".
> THAT is where the math comes in to compare the CCP to the US government's
> intelligence agencies.  Having said that, Ben is correct that the CCP's
> structure is more amenable to this mining operation, and one should see the
> "private sector" coddled by the CCP as an updated form of its civil service
> examination tradition.  While it may be true that the resulting "brain" is
> not going to be as capable of producing a silicon AGI as the US, this
> misses Steve's point, or at least mine:
>
> The CCP-as-AGI is more capable of "Turking" the US-as-AGI than is the
> US-as-AGI of "Turking" the CCP-as-AGI.
>
> Why do I say this?
>
> See my prior post describing all the ways the US has inhibited its own
> intelligence agencies from mining the population for intelligence that
> those intelligence agencies can "Turk".  Indeed, it is my working
> hypothesis that this inhibition was the result of the CCP engaging in the
> _real_ "Unrestricted Warfare" that the document by that name represents as
> something far more benign.
>
> On Wed, Dec 30, 2020 at 5:03 AM Ben Goertzel <b...@goertzel.org> wrote:
>
>
> I don't think China's slightly higher average IQ is a big advantage for
> them...
>
> However, their governmental organization obviously has some practical
> advantages.   As one example, they can get their intel/military work done
> directly within their big internet tech companies, rather than via sluggish
> military contractors and limited-scope, awkward, back-channel-ish alliances
> with big internet tech companies, as happens in the US.    This means they
> are getting, on average, cleverer and harder-working folks working on their
> gov't-oriented tech, not due to IQ issues but due to organizational
> issues...
>
> On the other hand they continue to have deep problems with radical
> technical innovation due to a persistent culture of mistrust, and this will
> cause them real issues, because there are significant differences between US
> and China contexts and copying/adapting Western innovations will probably
> not allow them to overtake the West technologically...
>
> I predict AGI will emerge first via organizations that are centered in the
> West, and China will then attempt to copy it, but will not be fast enough
> ... because the org that first creates AGI will be very fast-moving and
> agile and not that easy for creativity-phobic Chinese institutions to catch
> up with
>
> Note I lived in HK for 9 yrs and made many dozens of trips to Beijing,
> Shanghai, Xiamen etc. etc. ... I have met w/ folks at the highest levels in
> Chinese tech companies and SOEs and fairly high up in gov't.   There is a
> lot to admire and a lot to fear there, but I don't think China is really in
> the race as regards AGI, nor do they have the capacity to play catch-up
> extremely rapidly
>
> Of course all this could change in 10 yrs, so these comments are most
> relevant if AGI is achieved in the next say 7 yrs...
>
> ben
>
> On Tue, Dec 29, 2020 at 4:22 PM James Bowery <jabow...@gmail.com> wrote:
>
> It's "Unrestricted Warfare
> <https://archive.org/stream/Unrestricted_Warfare_Qiao_Liang_and_Wang_Xiangsui/Unrestricted_Warfare_Qiao_Liang_and_Wang_Xiangsui_djvu.txt>"
> and as I've pointed out on numerous occasions, that document strikes me as
> a limited hangout disinformation.   Keep in mind the Chinese have a higher
> average IQ than Europeans, their population is several times larger and
> they have a _very_ long history of civil service examinations.
> Extrapolate that mean advantage out to the high IQ tail where the ratios
> explode and it's hard to imagine how great an advantage they have when it
> comes to "peacetime" strategy.  Add to that the belly-full of the West with
> Sassoon's steamships delivering opium and Mao calling it "a century of
> humiliation"...
>
> On Tue, Dec 29, 2020 at 6:48 PM Steve Richfield <steve.richfi...@gmail.com>
> wrote:
>
> As you are reading this, doing the best you can to survive the Pandemic,
> consider...
>
> The Chinese Communist Party (CCP) is a pretty good model for AGI, as there
> are ~500 people working together to provide the best possible management
> for China as it attempts to interact as well as possible with the rest of
> the world. A rising tide usually floats all boats, but China perceived an
> advantage in restricting information about COVID-19 in order to inflict it
> on the rest of the world, which is consistent with their internal manual
> *Unconventional Warfare*, detailing LOTS of dirty tricks you might expect
> an AGI to
> employ as it seeks its goals. This manual is a REALLY scary read.
>
> Why would anyone expect an AGI to be any "friendlier" than the CCP? Why
> wouldn't anyone expect an AGI to be even nastier?
>
> This dirty deed WILL work for the CCP - unless worldwide revulsion costs
> the CCP even more. I doubt whether an AGI would greatly consider feelings
> that run counter to profit. We may all be paying dearly for not reining in
> the CCP long ago - and we might end up paying more if we turn an AGI loose
> on the world - for exactly the SAME reasons.
>
> Thoughts?
>
> *Steve Richfield*
>
>
>
>
>
>
> --
> Ben Goertzel, PhD
> http://goertzel.org
>
> “Words exist because of meaning; once you've got the meaning you can
> forget the words.  How can we build an AGI who will forget words so I can
> have a word with him?” -- Zhuangzhi++
>
>
>
>
> --
> Full employment can be had with the stroke of a pen. Simply institute a six
> hour workday. That will easily create enough new jobs to bring back full
> employment.
>
>
-- 
Full employment can be had with the stroke of a pen. Simply institute a six
hour workday. That will easily create enough new jobs to bring back full
employment.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf065676fd779dd5c-M4f4b0fd12fef34ea13dd1086
Delivery options: https://agi.topicbox.com/groups/agi/subscription
