I think it would be a purely academic exercise (as in, disconnected from
any practical consequences) to argue about the kinds of AGIs that could
have access to infinite resources.

Rejecting Yudkowsky's argument on the basis that reality *might* be
infinite seems like an odd move to me. If you feel, as Yudkowsky does, that
the fate of humanity rests on our ability to produce a friendly AI before
someone else produces an unfriendly one, then such esoteric objections miss
the point entirely. Even if resources were infinite, it doesn't follow that
we'd be safe from a paperclip maximizer, and anyway, we have no good reason
to suppose resources are infinite in any way that bears on the potential
realities of AGI.


On Fri, Sep 5, 2014 at 11:26 AM, Stephen Paul King <
stephe...@provensecure.com> wrote:

> Hi Terren,
>
>   Ah, nice link. Thank you. Does the assumption of a finite and fixed set
> of resources necessarily match the real world?
>
>    If an AGI's computation can occur on any active and evolving
> network of sufficient complexity, would the paperclip argument hold?
>
> ISTM that overall resources are finite, bounded and fixed only within
> snapshots of patches of the universe. Given eternal inflation and the
> potential for endless forms of resources, I find the paperclip argument
> unconvincing.
>
>
>
> On Fri, Sep 5, 2014 at 11:15 AM, Terren Suydam <terren.suy...@gmail.com>
> wrote:
>
>> http://wiki.lesswrong.com/wiki/Paperclip_maximizer
>>
>>
>> On Fri, Sep 5, 2014 at 11:13 AM, Stephen Paul King <
>> stephe...@provensecure.com> wrote:
>>
>>> AFAIK, if the AGI and humanity are not competing for the same resources,
>>> no conflict need arise...
>>>
>>>
>>> On Fri, Sep 5, 2014 at 11:08 AM, Terren Suydam <terren.suy...@gmail.com>
>>> wrote:
>>>
>>>>
>>>> On Fri, Sep 5, 2014 at 10:57 AM, John Clark <johnkcl...@gmail.com>
>>>> wrote:
>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Thu, Sep 4, 2014 at 6:09 AM, Telmo Menezes <te...@telmomenezes.com>
>>>>> wrote:
>>>>>
>>>>>>
>>>>>> > Intelligence is clearly a process that can be bootstrapped -- we
>>>>>> know this from biology.
>>>>>>
>>>>>
>>>>> Yes, adults tend to be smarter than infants and infants are smarter
>>>>> than one-celled zygotes.
>>>>>
>>>>>
>>>>>> > What I don't understand is how people expect to have a human-level
>>>>>> AI (many degrees of freedom) and then also be able to micro-manage it.
>>>>>>
>>>>>
>>>>> I also don't understand the people who talk about a "friendly AI"
>>>>> when what they really mean is an AI that will happily remain our slave and
>>>>> place our interests at a higher level than its own. It's just not possible
>>>>> to consistently outsmart something that is vastly more intelligent than
>>>>> you are.
>>>>>
>>>>>   John K Clark
>>>>>
>>>>>
>>>> You're presupposing that an AGI must necessarily have interests that
>>>> conflict with ours.
>>>>
>>>> It's obviously a really difficult problem, if for no other reason than
>>>> that you'd have to have faith that a much smarter AI was acting in our
>>>> interests. Even if you could mathematically prove beforehand that an AGI
>>>> would be friendly (which I doubt is possible), something far smarter than
>>>> us would behave in unpredictable ways and make decisions that seem contrary
>>>> to our interests, simply because we wouldn't be smart enough to follow the
>>>> reasoning (which might take hundreds of years to explain to mere humans).
>>>>
>>>> Terren
>>>>
>>>> --
>>>> You received this message because you are subscribed to a topic in the
>>>> Google Groups "Everything List" group.
>>>> To unsubscribe from this topic, visit
>>>> https://groups.google.com/d/topic/everything-list/YJeHJO5dNqQ/unsubscribe
>>>> .
>>>> To unsubscribe from this group and all its topics, send an email to
>>>> everything-list+unsubscr...@googlegroups.com.
>>>> To post to this group, send email to everything-list@googlegroups.com.
>>>> Visit this group at http://groups.google.com/group/everything-list.
>>>> For more options, visit https://groups.google.com/d/optout.
>>>>
>>>
>>>
>>>
>>> --
>>>
>>> Kindest Regards,
>>>
>>> Stephen Paul King
>>>
>>> Senior Researcher
>>>
>>> Mobile: (864) 567-3099
>>>
>>> stephe...@provensecure.com
>>>
>>>  http://www.provensecure.us/
>>>
>>>
>>> “This message (including any attachments) is intended only for the use
>>> of the individual or entity to which it is addressed, and may contain
>>> information that is non-public, proprietary, privileged, confidential and
>>> exempt from disclosure under applicable law or may be constituted as
>>> attorney work product. If you are not the intended recipient, you are
>>> hereby notified that any use, dissemination, distribution, or copying of
>>> this communication is strictly prohibited. If you have received this
>>> message in error, notify sender immediately and delete this message
>>> immediately.”
>>>
>>
>
>
>
