Re: [agi] Pure reason is a disease.

2007-05-24 Thread Jiri Jelinek

Mark,

I cannot hit everything now, so at least one part:


Are you *absolutely positive* that real pain and real
feelings aren't an emergent phenomenon of sufficiently complicated and
complex feedback loops?  Are you *really sure* that a sufficiently
sophisticated AGI won't experience pain?


Except for some truths found in the world of math, I'm not *absolutely
positive* about anything ;-), but I don't see why it should, and, running
on the computers we currently have, I don't see how it could.
Note that some people suffer from rare disorders that prevent them
from the sensation of pain (e.g. congenital insensitivity to pain).
Some of them suffer from slight mental retardation, but not all. Their
brains are pretty complex systems demonstrating general intelligence
without the pain sensation. In some of those cases, the pain is killed
by increased production of endorphins in the brain, and in other cases
the pain info doesn't even make it to the brain because of
malfunctioning nerve cells which are responsible for transmitting the
pain signals (caused by genetic mutations). Particular feelings (as we
know it) require certain sensors and chemistry. Sophisticated logical
structures (at least in our bodies) are not enough for actual
feelings. For example, to feel pleasure, you also need things like
serotonin, acetylcholine, noradrenaline, glutamate, enkephalins and
endorphins.  Worlds of real feelings and logic are loosely coupled.

Regards,
Jiri Jelinek

On 5/23/07, Mark Waser [EMAIL PROTECTED] wrote:

 AGIs (at least those that could run on current computers)
 cannot really get excited about anything. It's like when you represent
 the pain intensity with a number. No matter how high the number goes,
 it doesn't really hurt. Real feelings - that's the key difference
 between us and them and the reason why they cannot figure out on their
 own that they would rather do something else than what they were asked
 to do.

So what's the difference in your hardware that makes you have real pain and
real feelings?  Are you *absolutely positive* that real pain and real
feelings aren't an emergent phenomenon of sufficiently complicated and
complex feedback loops?  Are you *really sure* that a sufficiently
sophisticated AGI won't experience pain?

I think that I can guarantee (as in, I'd be willing to bet a pretty large
sum of money) that a sufficiently sophisticated AGI will act as if it
experiences pain . . . . and if it acts that way, maybe we should just
assume that it is true.


-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415user_secret=e9e40a7e


Re: [agi] Re: There is no definition of intelligence

2007-05-24 Thread Joel Pitt

That quote made my evening!

Thanks :)

On 5/22/07, J Storrs Hall, PhD [EMAIL PROTECTED] wrote:

The best definition of intelligence comes from (of all people) Hugh Loebner:

It's like pornography -- I can't define it exactly, but I like it when I see
it.





--
-Joel

Unless you try to do something beyond what you have mastered, you
will never grow. -C.R. Lawton



Re: [agi] Pure reason is a disease.

2007-05-24 Thread Eric Baum



Josh I think that people have this notion that because emotions are
Josh so unignorable and compelling subjectively, that they must be
Josh complex. In fact the body's contribution, in an information
Josh theoretic sense, is tiny -- I'm sure I way overestimate it with
Josh the 1%.

Emotions are also, IMO and according to some existing literature,
essentially preprogrammed in the genome.

See wife with another man, run jealousy routine.

Hear unexpected loud noise, go into preprogrammed 7 point startle 
routine already visible in newborns.

etc.

Evolution builds you to make decisions. But you need guidance so that
the decisions you make tend to actually favor its ends. You get
essentially a two-part computation, where your decision-making
circuitry gets preprogrammed inputs about what it should maximize
and what tenor it should take.
On matters close to their ends (of propagating), the genes take
control to make sure you don't deviate from the program.
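The two-part picture above can be caricatured in code (a deliberately crude sketch; the options, features, and weights are invented purely for illustration):

```python
# Two-part computation: preprogrammed values (set by "the genome",
# not chosen by the agent) guide a general-purpose decision
# procedure that the agent does control.

GENOME_WEIGHTS = {"food": 1.0, "safety": 2.0, "status": 0.5}  # fixed inputs

def decide(options):
    """Pick the option with the highest genome-weighted score."""
    def score(features):
        return sum(GENOME_WEIGHTS[k] * v for k, v in features.items())
    return max(options, key=lambda o: score(options[o]))

options = {
    "approach": {"food": 0.8, "safety": -0.5, "status": 0.2},
    "flee":     {"food": 0.0, "safety": 0.9,  "status": -0.1},
}
print(decide(options))  # -> flee (hear loud noise, run startle routine)
```

The decision circuitry is general; what it cannot do is rewrite GENOME_WEIGHTS.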



Re: [agi] Pure reason is a disease.

2007-05-24 Thread Mark Waser

Note that some people suffer from rare disorders that prevent them
from the sensation of pain (e.g. congenital insensitivity to pain).



the pain info doesn't even make it to the brain because of
malfunctioning nerve cells which are responsible for transmitting the
pain signals (caused by genetic mutations).


This is equivalent to their lacking the input (the register that says your 
current pain level is 17), not the ability to feel pain if the register were 
connected (and therefore says nothing about their brain or their 
intelligence).



In some of those cases, the pain is killed
by increased production of endorphins in the brain,


In these cases, the pain is reduced but still felt . . . . but again this is 
equivalent to being register driven -- the nerves say the pain level is 17, 
the endorphins alter the register down to 5.
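The register analogy above can be sketched as a toy model (entirely hypothetical; the function, names, and numbers are illustrative, not a claim about how brains or AGIs actually work):

```python
# Toy sketch of the "pain register" analogy: a raw signal from the
# nerves, scaled down by an endorphin factor before the brain "reads"
# it. Congenital insensitivity corresponds to the signal never
# arriving at all.

def felt_pain(nerve_signal, endorphin_level, nerves_intact=True):
    """Return the pain level the 'register' ends up holding."""
    if not nerves_intact:
        return 0  # mutation blocks transmission; signal never arrives
    # Endorphins scale the raw signal down (0 = none, 1 = full block).
    return round(nerve_signal * (1 - endorphin_level))

print(felt_pain(17, 0.0))                       # raw signal felt in full: 17
print(felt_pain(17, 0.7))                       # endorphins alter it down: 5
print(felt_pain(17, 0.0, nerves_intact=False))  # nerves broken: 0
```

The open question in the thread is, of course, whether there is anything it feels like to *be* such a register -- the 5 here is just a number.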



Particular feelings (as we
know it) require certain sensors and chemistry.


I would agree that particular sensations require certain sensors, but 
chemistry is an implementation detail that, IMO, could be replaced with 
something else.



Sophisticated logical
structures (at least in our bodies) are not enough for actual
feelings. For example, to feel pleasure, you also need things like
serotonin, acetylcholine, noradrenaline, glutamate, enkephalins and
endorphins.  Worlds of real feelings and logic are loosely coupled.


OK.  So our particular physical implementation of our mental computation 
uses chemicals for global environment settings, and logic (a very detailed 
and localized operation) uses neurons (yet, nonetheless, is affected by the 
global environment settings/chemicals).  I don't see your point unless 
you're arguing that there is something special about using chemicals for 
global environment settings rather than some other method (in which case I 
would ask: what is that something special, and why is it special?).


   Mark 





Re: [agi] NARS: definition of intelligence

2007-05-24 Thread Eric Baum

I recommend my publisher, MIT Press. They agreed to bring my book 
out reasonably priced ($40 list, if I recall, for the hardcover); then came
out with a paperback a year later that listed at, I forget exactly, maybe
$26. And both versions were immediately discounted from there by 
Amazon and BN; if I recall correctly, the paperback sells for $22.

I was also recommended to them by an author who told me they are
known for keeping books in print for long periods of time.

Pei Shane, Well, I actually considered Lulu and similar publishers,
Pei though as the last option. It is much easier to publish with
Pei them, but given the nature of NARS, such a publisher will make
Pei the book even more likely to be classified as by a crackpot. :(

Pei I continued to look for a publisher with tough peer-review
Pei procedure, even after the manuscript had been rejected by more
Pei than a dozen of them. Though the price excludes most individual
Pei buyers, it may be more likely for a research library
Pei to buy a $190 book from Springer than a $25 book from Lulu, given
Pei the topic.

Pei Pei

Pei On 5/24/07, Shane Legg [EMAIL PROTECTED] wrote:
 Pei,
 
 
  Yes, the book is the best source for most of the topics. Sorry
 for the  absurd price, which I have no way to influence.
 
 It's $190.  Somebody is making a lot of money on each copy and I'm
 sure it's not you.  To get a 400 page hard cover published at
 lulu.com is more like $25.
 
 Shane
 


Re: [agi] Pure reason is a disease.

2007-05-24 Thread Eric Baum


Jiri Note that some people suffer from rare
Jiri disorders that prevent them from the sensation of pain
Jiri (e.g. congenital insensitivity to pain). 

What that tells you is that the sensation you feel is genetically
programmed. Break the program, you break (or change) the sensation.
Run the intact program, you feel the sensation.



Re: [agi] NARS: definition of intelligence

2007-05-24 Thread Pei Wang

MIT Press was among the first publishers I contacted. The editor said
they are not interested in the topic --- the manuscript didn't even
get a review. :(

Yes, their price is much more reasonable.

Pei

On 5/24/07, Eric Baum [EMAIL PROTECTED] wrote:




RE: [agi] Write a doctoral dissertation, trigger a Singularity

2007-05-24 Thread John G. Rose
Different people have different ways of communicating.  Many Murray posts
are sprinkled with annoyances, but they do have some intelligence and
wisdom.  They remind me of a W. C. Fields-like way of speaking, with some
Snake Oil salesmanship.  Actual Snake Oil, BTW, can be good for certain things,
but fake Snake Oil is fake, hence the reputation.

More generally speaking, I have found from my experience that some of the
worst communicators have the most to say and some of the best communicators
the least.  Not to say Murray is a bad communicator, but we have grown
accustomed to marginalizing people who break the mold, thus minimizing the
variance of personalities.  Part of this is due to a franchise-like
educational system that has existed for several decades.  Our personality
pool is diminishing due to efficiency rewardsmanship.  Also, things like
dialects, language variations, cultural variations, etc. are evening out;
we are becoming an optimized, homogenized society.

Will AGIs follow the same trend and have minimal personality variation and
maximal textbook-style efficiency of communication?  Sometimes breaking the
mold of expressing oneself can be the most effective way of conveying an
idea.  I'm reminded of once taking a class where the instructor deliberately
spoke very fast, like an auctioneer.  Some students immediately freaked
because it was abnormal, but the theory was explained and it did work as
intended: there was a very high transfer rate of information and rapid
two-way communication.  Computers, especially AGIs, could experiment
intentionally with different twists on language, perhaps finding new and
better ways of communicating.

 

John

 

 


Personally, I find many of his posts highly entertaining...

If your sense of humor differs, you can always use the DEL key ;-)

-- Ben G

On 5/20/07, Eliezer S. Yudkowsky [EMAIL PROTECTED] wrote:

Why is Murray allowed to remain on this mailing list, anyway?  As a
warning to others?  The others don't appear to be taking the hint.


Re: [agi] Write a doctoral dissertation, trigger a Singularity

2007-05-24 Thread Mark Waser
Some people are thrown by unusual ways of communicating, some are not.  
Murray is drawing pretty consistent ratings/opinions (in terms of the validity 
of his content), so I don't think that it is his communication style that is 
the problem.

Personally, I judge content value on some vague formula involving 
communication size, correct and new content, incorrect content, and how easily 
I can tell the latter two apart.  Murray's posts have *very* little 
intelligence and wisdom, particularly when compared to the amount of just plain 
incorrect content.  Thus, he has negligible content value for me.

On the other hand, since I tend not to freak -- he certainly does have 
some humor value (and there but for the grace . . . )
  - Original Message - 
  From: John G. Rose 
  To: agi@v2.listbox.com 
  Sent: Thursday, May 24, 2007 10:56 AM
  Subject: RE: [agi] Write a doctoral dissertation, trigger a Singularity



RE: [agi] Write a doctoral dissertation, trigger a Singularity

2007-05-24 Thread John G. Rose
He definitely has a great vocabulary, you have to admit, and he is a good
showman.  Also, his critiques of others' writings are interesting and humorous
as well.  As far as the technical validity of his AI project, I don't know
because I'm still struggling with the ASCII diagrams :)

 

John

 

From: Mark Waser [mailto:[EMAIL PROTECTED] 




[agi] Opensource Business Model

2007-05-24 Thread YKY (Yan King Yin)

It seems that AGI is going to require collaboration on a scale larger than
that of usual startups, so I'm thinking of a new business model as follows:

1.  form a group of members with equal rights
2.  let members contribute code / algorithms / architectures
3.  members vote on the worth of the contribution, company shares are
awarded to the contributor accordingly
4.  forking of branches is allowed (to allow for different AGI theories)
5.  source code is open, but download is commercial
6.  pricing and other business choices will be set via motions and voting
7.  a president may be elected to do administrative tasks and coordination
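Step 3 is the mechanically interesting one. A minimal sketch of how vote-based share awards might work (the median rule and all numbers here are my own assumption for illustration, not part of the proposal):

```python
# Members vote on the worth of a contribution (in shares); the
# contributor is awarded the median vote, which resists both
# lowballing and self-serving extreme votes.

from statistics import median

def award_shares(votes, contributor, cap_table):
    """Award the median of members' votes to the contributor."""
    shares = int(median(votes))
    cap_table[contributor] = cap_table.get(contributor, 0) + shares
    return shares

cap_table = {}
print(award_shares([100, 150, 120, 500, 90], "alice", cap_table))  # -> 120
print(cap_table)  # -> {'alice': 120}
```

Note how the one outlier vote of 500 has no effect on the award; a mean rule would be far easier to game.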

What's special about this is that we put emphasis on code as well as more
intangible things like algorithms and architectures.  An AGI project
initially should not be focused on incremental code-level changes.

I'm wondering how many people would be interested in such a setup?

If Ben wants he can open part of Novamente this way too...

YKY


Re: [agi] Opensource Business Model

2007-05-24 Thread Russell Wallace

I think it's important to have core technical decisions made up front,
before a group is gathered; there are things fundamental enough that they
can't be decided by committee.

I think it's equally important to have core business decisions made up
front, because there are aspects of the business side that are also
fundamental enough that they can't be decided by committee:

Will participants be paid, during the course of the project, market rates,
or enough to get by on, or minimal subsistence, or nothing?

Will there be further payment coming later, and if so in what form (stock
options etc?) and how much?

Where will the money come from? What will the product do that people will
pay for? How will it be the case that they can't get what they want
without paying for this product?

How much money is likely to come in? What order of magnitude are we
looking at, for market size and for the amount each customer will be
willing and able to pay?

Etc. This is not a criticism of your suggestion - for all I know, maybe you
have answers to these lined up already - but a note of some things that need
to be clarified.


Re: [agi] Pure reason is a disease.

2007-05-24 Thread Joel Pitt

On 5/25/07, Mark Waser [EMAIL PROTECTED] wrote:

 Sophisticated logical
 structures (at least in our bodies) are not enough for actual
 feelings. For example, to feel pleasure, you also need things like
 serotonin, acetylcholine, noradrenaline, glutamate, enkephalins and
 endorphins.  Worlds of real feelings and logic are loosely coupled.

OK.  So our particular physical implementation of our mental computation
uses chemicals for global environment settings and logic (a very detailed
and localized operation) uses neurons (yet, nonetheless, is affected by the
global environment settings/chemicals).  I don't see your point unless
you're arguing that there is something special about using chemicals for
global environment settings rather than some other method (in which case I
would ask What is that something special and why is it special?).


You possibly already know this and are simplifying deliberately, but
chemicals are not simply global environmental settings.

Chemicals/hormones/peptides etc. form spatial concentration gradients
across the entire brain, which are much more difficult to emulate in
software than a single concentration value. Add to this the fact that
some of these chemicals inhibit or promote others, and you get
horrendously complex reaction-diffusion systems.
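To see why this is harder than a global scalar, here is a minimal 1-D sketch (the rate constants are made up; real neurochemistry involves far more interacting species, in 3-D):

```python
# Toy 1-D reaction-diffusion sketch: chemical A promotes B, B inhibits
# A, and both diffuse along a ring of "tissue" cells (periodic
# boundary). Even this crude version needs per-cell state and local
# coupling -- versus a single global number per chemical.

N, DT = 50, 0.1
A = [1.0 if i == N // 2 else 0.0 for i in range(N)]  # point release of A
B = [0.0] * N                                        # no B initially

def diffuse(x, rate):
    """One explicit diffusion step on a ring."""
    return [x[i] + rate * (x[(i - 1) % N] + x[(i + 1) % N] - 2 * x[i])
            for i in range(N)]

for _ in range(200):
    A, B = diffuse(A, 0.2), diffuse(B, 0.2)
    A = [a * (1 - 0.05 * b) for a, b in zip(A, B)]            # B inhibits A
    B = [b + DT * (0.3 * a - 0.1 * b) for a, b in zip(A, B)]  # A promotes B; B decays

print(f"peak A: {max(A):.4f}, peak B: {max(B):.4f}")  # smooth spatial gradients
```

A single "serotonin = 0.7" variable throws away exactly the spatial structure this loop is tracking.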

--
-Joel

Unless you try to do something beyond what you have mastered, you
will never grow. -C.R. Lawton
