Stefan,

Could you please explain how I could apply your research paper:
http://rationalmorality.info/wp-content/uploads/2007/11/practical-benevolence-2007-11-17_isotemp.pdf
to something useful?
It's a little too abstract for me, so an introduction that ties this
research to practical research and development goals would be quite
helpful.

Saturday, November 17, 2007, 3:19:37 PM, you wrote:

> On Nov 18, 2007 3:05 AM, Dennis Gorelik <[EMAIL PROTECTED]> wrote:
> You assume that "when we are 100% done" we will get what we
> ultimately want.
> But that's not exactly true.

> The fittest species (whether computers, humans, or androids) will
> dominate the world.

> Let's talk about the set of supergoals that such fittest species will
> have.

> I think this set would include:
> - Supergoal "Prevent being [self]destroyed".
> - Supergoal "Prevent changing supergoals". That supergoal would also
> try to prevent tampering with supergoals. I guess that supergoal will
> have to become quite strong in the environment when it's
> technologically possible to tweak supergoals.
> - Supergoal "reproduce". Supergoals of descendants would probably 
> slightly vary from supergoals of the parent.
> - Other supergoals, such as "Desire to learn", "Desire to speak", and 
> "Contribute to
> society".

> Note that the fittest species will not really have a "permanent
> pleasure paradise" option.
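
[A minimal toy sketch of the selection argument above, in Python. This
is an editorial illustration only, not from either message or from the
paper; the agent fields, weights, and numbers are invented assumptions.]

import random

SUPERGOALS = ["self_preservation", "goal_stability", "reproduction"]

class Agent:
    def __init__(self, weights):
        # Each supergoal is represented as a weight in [0, 1].
        self.weights = dict(weights)

    def offspring(self):
        # "Supergoals of descendants vary slightly from the parent":
        # copy the parent's weights with a small random drift, damped by
        # the parent's goal_stability weight (resistance to tampering).
        drift = 0.1 * (1.0 - self.weights["goal_stability"])
        mutated = {g: min(1.0, max(0.0, w + random.uniform(-drift, drift)))
                   for g, w in self.weights.items()}
        return Agent(mutated)

def generation(population, cap=200):
    # Agents survive in proportion to self_preservation and copy
    # themselves in proportion to reproduction; the cap keeps the
    # selection pressure on.
    survivors = [a for a in population
                 if random.random() < a.weights["self_preservation"]]
    children = [a.offspring() for a in survivors
                if random.random() < a.weights["reproduction"]]
    return (survivors + children)[:cap]

population = [Agent({g: random.random() for g in SUPERGOALS})
              for _ in range(200)]
for _ in range(100):
    population = generation(population)

if population:
    for g in SUPERGOALS:
        avg = sum(a.weights[g] for a in population) / len(population)
        print(g, round(avg, 2))

[Run for a hundred generations, the surviving averages for all three
weights drift upward: lineages weak on self-preservation die out,
lineages weak on reproduction stop copying themselves, and once weights
are near their ceiling, low goal stability only lets descendants drift
back down. That is the sense in which such supergoals would be forced on
the fittest species rather than chosen.]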



> Dennis, I believe the same and have recently finished organizing
> my thoughts on the matter in a paper, Practical Benevolence – a
> Rational Philosophy of Morality, which is available at
> http://rationalmorality.info/

> Abstract: These arguments demonstrate the a priori moral nature of reality
> and develop the basic understanding necessary for realizing the logical
> maxim in Kant's categorical imperative[1] based on the implied goal of
> evolution[2]. The maxim is used to prove moral behavior as an obligatory
> emergent phenomenon among evolving, interacting, goal-driven agents.

> Kind regards, 

> Stefan


