On Tue, Jan 23, 2024 at 4:37 PM Brent Meeker <[email protected]> wrote:

>
>
> On 1/23/2024 12:52 PM, John Clark wrote:
>
> On Tue, Jan 23, 2024 at 3:38 PM Brent Meeker <[email protected]>
> wrote:
>
> > Who wrote this? You, JC?
>
> No, Scott Alexander did; he's a pretty smart guy, but I think he got some
> things wrong. I did write this in the comments section:
>
> "You say "If we’re lucky, consciousness is a basic feature of information
> processing and anything smart enough to outcompete us will be at least as
> conscious as we are" and I agree with you about that because there is
> evidence that it is true. I know for a fact that random mutation and
> natural selection managed to produce consciousness at least once (me) and
> probably many billions of times, but Evolution can't directly detect
> consciousness any better than I can, except in myself, and it can't select
> for something it can't see, but evolution can detect intelligent behavior.
> I could not function if I really believed that solipsism was true,
> therefore I must take it as an axiom, as a brute fact, that consciousness
> is the way data feels when it is being processed intelligently.
>
>
> You've written this before, but I slightly disagree with it. I think
> Evolution can detect consciousness as directly or indirectly as
> intelligence.
>

I agree: Evolution can detect intelligence, so it can only detect
consciousness if consciousness is an inevitable byproduct of intelligent
data processing.

  John K Clark    See what's on my new list at  Extropolis
<https://groups.google.com/g/extropolis>

> Consciousness is imagining the world with you as an actor within it. It's
> a kind of thinking necessary for planning, i.e. for an advanced form of
> intelligence.  The consciousness you talk about is just awareness,
> perception; that's processing data.
>
>
> You also say "consciousness seems very closely linked to brain waves in
> humans" but how was that fact determined? It was observed that when people
> behave intelligently their brain waves take a certain form and when they
> don't behave intelligently their brain waves take a different form. I'm
> sure you don't think that other people are conscious when they are
> sleeping, under anesthesia, or dead, because when they are in those
> conditions they are not behaving very intelligently.
>
> As for the fear of paperclip maximizers, I think that's kind of silly. It
> assumes the possibility of an intelligent entity having an absolutely fixed
> goal they can never change, but such a thing is impossible. In the 1930s
> Kurt Gödel proved that there are some things that are true but have no proof
> and Alan Turing proved that there is no way to know for certain if a given
> task is even possible. For example, is it possible to prove or disprove
> that every even number greater than two is the sum of two prime numbers?
> Nobody knows. If an intelligent being were able to have goals that could
> never change, it would soon be caught in an infinite loop, because sooner
> or later it would attempt a task that was impossible; that's why Evolution
> invented the very important emotion of boredom. Certainly human beings
> don't have fixed goals, not even the goal of self-preservation, and I don't
> see how an AI could either."
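
(A concrete, hypothetical illustration of the trap described above: the
Python sketch below searches for a counterexample to the Goldbach
conjecture just mentioned. The search is a perfectly well-defined task,
yet nobody knows whether this particular loop ever halts, and Turing
showed there is no general procedure that could tell us in advance:

    def is_prime(n):
        # Trial division: slow but correct.
        if n < 2:
            return False
        for d in range(2, int(n ** 0.5) + 1):
            if n % d == 0:
                return False
        return True

    def goldbach_holds(n):
        # True if the even number n is a sum of two primes.
        return any(is_prime(p) and is_prime(n - p) for p in range(2, n - 1))

    n = 4
    while True:  # halts only if a Goldbach counterexample exists
        if not goldbach_holds(n):
            print("Goldbach counterexample found:", n)
            break
        n += 2

An agent with the single unchangeable goal of completing this search could
grind away forever; boredom, or the ability to revise goals, is the escape
hatch.)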
>
>
> Good point.
>
> Brent
>
> --
>
