I think the main reason we struggle on is that we believe we will one
day transcend our humanity.  Isn't that the real reason to create an AGI,
or a God for that matter?

A wise man (Voltaire) once said: "If God did not exist, it would be
necessary to invent him."


On Sat, Sep 6, 2014 at 5:52 PM, Piaget Modeler via AGI <[email protected]>
wrote:

> The real question is not whether you trust an AGI or an Alien being, but
> whether you trust people in general, because people lie, steal, and kill.
> If not, why have children? Why continue to exist or survive at all?
>
> There must be something to be gained by participating in life.  Some
> raison d'etre.  Some cause.
> And that may just be the answer.
>
>
> ~PM
>
> ------------------------------
> Date: Sat, 6 Sep 2014 13:03:10 -0700
> Subject: [agi] The AGI Hypothesis...
> From: [email protected]
> To: [email protected]
>
>
> Hi all,
>
> Please correct and edit this as appropriate:
>
> The AGI hypothesis is that an infinitely intelligent machine will do
> VERY well in our world. Yet we survive and thrive through our social and
> economic interactions, which most here seem to think are less important
> than raw intelligence.
>
> I have discussed in the past that there may be an optimal intelligence,
> beyond which a human or machine would be seen as too dangerous to deal
> with, just as some people are. This is not so much because of
> AGI-specific concerns, but because of the usual mundane social
> competition for goods, women, status, etc. Why play with someone who
> always wins?
>
> My dad frequently played checkers with me - and he always won. At about
> age 12 I tired of this, so I read three books on checkers strategy, and
> he never won another game. After a few more games, he refused to play me
> anymore.
>
> To examine one edge of this effect: I was once part of a small company
> that was negotiating with Microsoft to develop one of their products.
> Having seen the way Microsoft rose to the top, I suspected that we would
> be ripped off, so I insisted on certain provisions in the contract that
> would have been no problem had Microsoft not intended a ripoff.
> Microsoft accepted some of the provisions but refused others. The rest
> of the company accepted; I walked. After a long and expensive
> development effort, Microsoft ripped them off just as I had expected,
> except that Microsoft got entangled in one of the provisions it had
> caved on, which it eventually settled for a substantial sum - but not
> enough to pay for the development effort.
>
> Perhaps there is an "optimizing" process going on here: a "shark" (like
> Microsoft in the above example) could adjust its aggressiveness to
> maximize its return, because different people have different thresholds
> of "refusal to play" - my threshold was lower than that of the rest of
> the small company, who saw the pot of gold at the end of the rainbow
> without seeing the leprechaun waiting there to grab it first. Steve
> Ballmer probably has a good answer to this optimization question, but I
> doubt he would ever choose to share it.
>
> So, just what is the distinction between an AGI and an Alien? Is there
> any difference beyond our having built one, while the other just landed
> here? Would an AGI fare any better than an Alien in our society? If so,
> why?
>
> Consider the following video. I don't think such a thing could ever
> happen, mostly because everyone would be EXPECTING it to happen.
> Nonetheless, there would doubtless be many willing victims, like the
> small company discussed above.
>
> http://www.hulu.com/watch/440883
>
> Would YOU trust an AGI to be acting in YOUR best interests, any more than
> you might trust an Alien?
>
> So, why work on something that apparently lacks a success path?
>
> Steve
>
>    *AGI* | Archives <https://www.listbox.com/member/archive/303/=now>
> <https://www.listbox.com/member/archive/rss/303/19999924-4a978ccc> |
> Modify <https://www.listbox.com/member/?&;> Your Subscription
> <http://www.listbox.com>
>


