Zamyatin, Huxley, and Orwell described future dystopias where humans were doing the surveillance to enforce uniformity. Nobody at the time foresaw computers, the internet, smartphones, or AI doing the surveillance for our benefit. Huxley predicted babies being made in factories, not a world where babies aren't being made at all. He predicted using drugs to control people, Orwell predicted torture and propaganda, and Zamyatin predicted brain surgery. Nobody foresaw the possibility of controlling people by giving them more choices.
The future will be the opposite of uniformity. In dystopian fiction, you need people to maintain power, and those who oppose it are crushed. With AI, you don't need people for anything because you will have everything you want. People won't want to organize resistance, and even if they did, they wouldn't be able to, because they will be socially isolated in their virtual worlds with their own private languages.

On Tue, Mar 5, 2024, 2:24 PM James Bowery <[email protected]> wrote:

> In the event you misunderstood the scare-quotes around "We", it was not in order that you propose a world more like "Brave New World" (1932), in which willing compliance is an accomplished fact, than "1984" (1949), in which its accomplishment is only apparent at the end, with Winston worshiping Big Brother. This is all very old territory, which I attempted to point out by citing "We" (1921), which portrays a world much more like the one you, in 2024, envision.
>
> Yes, it is far from perfected, that vision from 1921, but Huxley did a pretty good job of taking your part if only you would bother reading the original. From the moment of zygote on, development is guided toward "willing" compliance.
>
> And what of this evolutionary direction of this identity you invoke with "We", but the reversion to the individual organism once again, except absent the diversity that now exists in many individual organisms? You've often talked in the past about the degree of intelligence embodied by energy flux through vast numbers of individual organisms, each exploring the quasi-Hamming space of DNA's embodied intelligence.
>
> What is your replacement for this diversity?
>
> On Tue, Mar 5, 2024 at 11:53 AM Matt Mahoney <[email protected]> wrote:
>
>> On Sun, Mar 3, 2024, 8:12 PM James Bowery <[email protected]> wrote:
>>
>>> On Sun, Mar 3, 2024 at 10:01 AM Matt Mahoney <[email protected]> wrote:
>>>
>>>> .... We want to be controlled. We are spending trillions on making it happen.
>>>
>>> "We"
>>>
>>> https://youtu.be/BVLvQcO7JGk
>>
>> I didn't read "We", but I did read "1984", the book it inspired. The part it got right was the surveillance. The part it got wrong was how it would be used to control people. We want AI to watch us because it works better that way. We let banks track our spending because credit cards and online shopping are more convenient than cash. We let Google track our movements in return for driving directions that avoid traffic. We let Amazon listen to everything we say so we can turn on lights in another room and play music.
>>
>> The illusions of qualia, consciousness, and free will are the result of internal positive reinforcement of perception, thinking, and action, respectively. These illusions evolved so you would have a reason to live, thus producing more offspring.
>>
>> When you are controlled by external positive reinforcement, it strengthens the illusion of free will. Wolpert's law says that a computer cannot predict its own output (the special case of two computers being unable to model each other when they are identical). Just because you can't predict your own actions doesn't mean an AI that knows more about you than you do can't predict them. You will reliably choose the action you believe will result in the greatest reward, because it was rewarded in the past.
>>
>> Just like video recognition and video generation are inverse functions of each other, so are prediction and control.
>> Prediction is a function that inputs the past and outputs the future. Control is a function that inputs the future and outputs the past.
>>
>> We will have our utopia. We don't want to stop it.
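To make the Wolpert point quoted above concrete, here is a toy Python sketch (my own illustration, not Wolpert's construction): hand any predictor a program that consults that predictor about itself and then does the opposite, and the prediction is wrong by construction.

def contrarian(predict):
    # Build a program that defeats the given predictor.
    def program():
        guess = predict(program)   # ask the predictor what this program will return
        return not guess           # then return the opposite
    return program

def some_predictor(prog):
    # Stand-in predictor; any fixed guess it makes is defeated the same way.
    return True

p = contrarian(some_predictor)
print(some_predictor(p), p())  # prints "True False": prediction and actual output disagree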
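And for the last quoted paragraph, a minimal sketch of the sense in which prediction and control invert each other, assuming a one-dimensional linear system x_next = a*x + b*u (the system and its parameters are purely illustrative):

a, b = 0.9, 0.5   # assumed dynamics: x_next = a*x + b*u

def predict(x, u):
    # Prediction: given the present state and action, output the future state.
    return a * x + b * u

def control(x, x_desired):
    # Control: given the desired future state, output the action that produces it,
    # i.e. the inverse of predict in its action argument.
    return (x_desired - a * x) / b

x = 1.0
u = control(x, x_desired=2.0)
print(predict(x, u))   # 2.0: the controlled action yields the desired future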
