Doesn't surprise me, you have friends like Mentifex too.

On Thu, Mar 7, 2019 at 4:24 PM Steve Richfield <[email protected]>
wrote:

> Boris,
>
> I would like to introduce your AGI to a magician friend of mine.
>
> Steve
>
>
> On Thu, Mar 7, 2019, 12:05 Boris Kazachenko <[email protected]> wrote:
>
>> "But why would you think that AGI would not hallucinate?"
>>
>> Your "AGI" may hallucinate, because it is designed to feed on that
>> incoherent second-hand natural-language data.
>> Mine won't; it is designed to be integral and self-sufficient. It will
>> believe what it sees, not what a bunch of nuts on the net say.
>>
>>
>> On Tue, Mar 5, 2019 at 2:52 PM Linas Vepstas <[email protected]>
>> wrote:
>>
>>>
>>>
>>> On Tue, Mar 5, 2019 at 12:37 PM Matt Mahoney <[email protected]>
>>> wrote:
>>>
>>>> Steve, good luck ending the political debate over climate change. But
>>>> you have a few obstacles.
>>>>
>>>> 1. Overwhelming evidence does not end political debate. Just ask the
>>>> creationists, anti-vaxxers, moon landing hoaxers, and 9/11 conspiracy
>>>> theorists. The whole purpose of the flat earth society is not to
>>>> convince you that the earth is flat, but to show that what you think
>>>> are logical, sound, and obvious arguments supported by undisputed
>>>> facts are actually useless.
>>>>
>>>
>>> More precisely: computers have driven the cost of publishing so low
>>> that anyone can publish: you no longer run a gauntlet of editors, printers,
>>> and proof-readers who tell you that your ideas are stupid. Facebook and
>>> Twitter have taken this to a new level. As a result, we can now all hear
>>> each other's brains thinking, and, it turns out, they are incoherent,
>>> contradictory, and insane. Worse: this high-connectivity, high-bandwidth
>>> (YouTube), low-latency interconnect allows for the spread and amplification
>>> of "memes" disconnected from any basis in reality.
>>>
>>> In short: social media has wired our individual, singular brains into
>>> a big "global brain", and we are hearing that global brain think, and it is
>>> hallucinating a lot of the time. The old-school social sciences have already
>>> studied this: propaganda (Hitler studied propaganda), cults, brainwashing,
>>> Stockholm syndrome, and also plenty of less harmful things: everything from
>>> pop music to Puerto-Rican low-rider automotive clubs. Uplifting things,
>>> too: from scientists to medical doctors to humanitarian activists.
>>>
>>> If you want a preview of what a mildly super-human intelligence, viz.
>>> AGI, might think, then the hallucinatory beliefs of various memetic tribes
>>> are a good sample. The memeplex of creationists, anti-vaxxers, moon-landing
>>> hoaxers, and 9/11 conspiracy theorists is sufficiently self-consistent to
>>> be stable, and more: sufficient to be invasive, to occupy the
>>> thought-space, the noosphere, of many human brains. It spreads.
>>>
>>> Perhaps you think that an AGI will be purely "rational" (whatever that
>>> means!) and that no AGI could ever be a creationist, anti-vaxxer, or
>>> moon-landing denialist. But why would you think that AGI would not
>>> hallucinate?
>>>
>>> Until very recently, Reality, i.e. the universe, entrained thinking
>>> minds in such a way that you starve and die if you cannot think clearly
>>> enough to obtain food and procreate. Squirrels who neglect nuts die sooner
>>> rather than later. Our modern economy is sufficiently robust that you can
>>> hallucinate all day long, or watch soap operas on TV, or do whatever it is
>>> you do, and mostly not starve to death. Mostly; we seem to have a problem
>>> with Amazon employees in LA and SF. But whatever.
>>>
>>> If you want a safe, non-existential-threatening AGI in the future,
>>> attempting to understand and control the root causes of hallucinatory
>>> thinking today is a good place to start.
>>>
>>> -- Linas
>>> --
>>> cassette tapes - analog TV - film cameras - you
>>>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tbefabf50a1da4070-Mf6524887a37eee5d7f392f83
Delivery options: https://agi.topicbox.com/groups/agi/subscription