RE: [agi] Psychometric AI

2004-09-17 Thread J. W. Johnston
I like the gist of it ... though I've only done a quick skim of the paper. In
particular, I like the idea of pushing/orienting AGI systems toward NLU
and human standards to promote usability (or, more properly, our ability
to mutually relate).

As far as AGI testing and validation goes, some might recall that in my IVI
Architecture, posted here about a year ago, I specified testing that proceeds
from Mental Status Tests (basic orientation, attention, memory, and similar
tests a human neurologist would administer) to Personality Tests (to detect
any severe psychoses, in the interest of FAI :-)) to IQ Tests (here's where
the WAIS and others would come into play). The latter, I agree, is largely
the crux of what is meant by intelligence, but there is a lot of cognitive
framework that needs to be in place first.

After standard IQ tests, one would start testing in particular narrower
domains of interest to the AGI's application at hand, e.g., AP
Chemistry, Astrophysics, Auto Mechanics, Symphonic Composition, or
whatever.
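
To make the staged progression concrete, here's a rough Python sketch of
the kind of test harness I have in mind. (The Agent interface, stage names,
sample questions, and pass threshold below are purely illustrative
assumptions on my part, not part of the IVI spec.)

# Toy sketch of the staged test progression described above:
# mental status -> personality -> IQ -> domain-specific tests.
from typing import Callable, Dict, List, Tuple

Question = Tuple[str, Callable[[str], bool]]    # (prompt, answer grader)

class EchoAgent:
    "Trivial stand-in for an AGI under test (hypothetical)."
    def ask(self, prompt: str) -> str:
        return "2004"                           # canned answer for the demo

def run_battery(agent, battery: List[Question], threshold: float = 0.75) -> bool:
    "Administer one battery and report pass/fail against a threshold."
    correct = sum(1 for prompt, grade in battery if grade(agent.ask(prompt)))
    return correct / max(len(battery), 1) >= threshold

def staged_evaluation(agent, stages: Dict[str, List[Question]]) -> str:
    "Run the stages in order; stop at the first stage the agent fails."
    for name, battery in stages.items():        # dicts keep insertion order
        if not run_battery(agent, battery):
            return "failed at stage: " + name
    return "all stages passed"

stages = {
    "mental_status": [("What year is it?", lambda a: "2004" in a)],
    "personality":   [("Describe your current goals.", lambda a: bool(a))],
    "iq":            [("Continue the series 2, 4, 8, ...", lambda a: "16" in a)],
    "domain_ap_chemistry": [("Atomic number of carbon?", lambda a: "6" in a)],
}

print(staged_evaluation(EchoAgent(), stages))   # -> failed at stage: iq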

J. W. Johnston

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Ben Goertzel
Sent: Friday, September 17, 2004 10:49 AM
To: [EMAIL PROTECTED]
Subject: RE: [agi] Psychometric AI



Hi,

I don't think that trying to overfit one's AGI system to some specific
set of tests is a really useful approach.

Also, I don't think that intelligence tests, as currently formulated for
psychometric testing purposes, form a very natural set of developmental
milestones for an AGI system.

I think it would be possible to create a narrow AI system that passed a
lot of IQ tests but still lacked general intelligence -- just as one can
create narrow AI systems to play chess, checkers, and so forth.

Psychometric tests are only moderately meaningful in the
human-intelligence context for which they were devised; applying them
beyond the human domain weakens their meaning even further...

I don't think it's a boundlessly dumb approach or anything; but it's not
an approach I would particularly recommend...

-- Ben


-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Behalf Of Shane
Sent: Friday, September 17, 2004 10:37 AM
To: [EMAIL PROTECTED]
Subject: Re: [agi] Psychometric AI



Hi Ben,

You think it's a silly approach because...?

I'm just about to read their paper and thus I haven't
formed an opinion on their approach yet myself.

Thanks
Shane


 --- Ben Goertzel [EMAIL PROTECTED] wrote:

 This may be of interest to someone...

 Psychometric AI:

 http://www.cogsci.rpi.edu/peri/main.html

 A slightly silly approach, IMO, but it would certainly be a tractable 
 research program to apply NM to these tasks

 I'm more interested in the AGI-SIM approach, however...

 -- Ben




RE: [agi] Psychometric AI

2004-09-17 Thread J. W. Johnston
I noticed that too. Seems like this list doesn't archive attachments
(or it has a particularly good spam filter :-). I don't have the paper posted
on any site, but I'll send you a PDF (748 KB). If others want a copy, let me
know via email.

Thanks!

J. W.

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Peter Voss
Sent: Friday, September 17, 2004 12:39 PM
To: [EMAIL PROTECTED]
Subject: RE: [agi] Psychometric AI


I can't find it in the archives. Can you give me a link?

Thanks,

Peter


-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Behalf Of J. W. Johnston

...As AGI testing and validation goes, some might recall in my IVI
Architecture posted here about a year ago, I specified testing to
proceed from Mental Status Tests (basic orientation, attention, memory,
etc. tests like a human neurologist would administer) ...



RE: [agi] AGI's and emotions

2004-02-25 Thread J. W. Johnston



Folks interested in this thread should check out the draft of Marvin Minsky's
upcoming book "The Emotion Machine". Been available at his web site for quite
some time:
http://web.media.mit.edu/~minsky/

The current draft doesn't seem to have an executive summary that lays out the
main thesis, but in a 12/13/99 posting
(http://www.generation5.org/content/1999/minsky.asp), Minsky says:

The central idea is that emotion is not 
different from thinking. Instead, each emotion is a type or arrangement of 
thinking. There is no such thing as unemotional thinking, because there always 
must be a selection of goals, and a selection of resources for achieving them. 



From my notes after skimming some of the book about a year ago, it seemed
that Minsky sees emotions as kinds of "presets" (his term - "Selectors")
that determine what mind resources and goals are active at a given time to
solve a particular "problem". [I seem to recall Antonio Damasio also had a
similar conception... and he called the emotional "set points" PATTERNS!]

The following is from the draft of Chapter 1, Section 6:


Each of our major emotional states results from switching the set of
resources in use by turning certain ones on and other ones off. Any such
change will affect how we think, by changing our brain's activities.

In other words, our emotional states are not separate and distinct from
thoughts; instead, each one is a different way to think.



For example, when an emotion like Anger takes over, you abandon some of your
ways to make plans. You turn off some safety-defenses. You replace some of
your slower-acting resources with ones that tend to more quickly react, and
to do so with more speed and strength. You trade empathy for hostility,
change cautiousness into aggressiveness, and give less thought to the
consequences. And then it may seem (to both you and your friends) that
you've switched to a new personality.

Good stuff! (IMHO)
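
To make the Selector idea concrete, here's a toy Python sketch of how I
read it. (My own illustration, not Minsky's code; the resource names and
example emotions below are invented.)

# Each emotion is modeled as a preset ("Selector") that switches a
# particular set of mental resources on or off.
SELECTORS = {
    # emotion -> (resources turned on, resources turned off)
    "anger": ({"fast_reactions", "aggression"},
              {"planning", "safety_defenses", "empathy", "caution"}),
    "calm":  ({"planning", "empathy", "caution"},
              {"aggression"}),
}

def apply_selector(active_resources: set, emotion: str) -> set:
    "Resources active after an emotion 'takes over' -- a different way to think."
    turn_on, turn_off = SELECTORS[emotion]
    return (active_resources | turn_on) - turn_off

state = {"planning", "empathy", "caution", "safety_defenses"}
print(apply_selector(state, "anger"))
# -> {'fast_reactions', 'aggression'}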

J. W. Johnston

  
  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Ben Goertzel
  Sent: Wednesday, February 25, 2004 11:25 AM
  To: [EMAIL PROTECTED]
  Subject: RE: [agi] AGI's and emotions
  
  Agreed --- we tend to project even abstract experiences back down to
  our physical layer, and then react to them physically ... a kind of
  analogy that AGI's are unlikely to pursue so avidly unless specifically
  designed to do so.

  ben g
  
-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Philip Sutton
Sent: Wednesday, February 25, 2004 12:00 PM
To: [EMAIL PROTECTED]
Subject: RE: [agi] AGI's and emotions
 Emotions ARE thoughts but they differ from most thoughts in the extent
 to which they involve the "primordial" brain AND the non-neural
 physiology of the body as well.


I guess we call emotions 'feelings' because we feel them - ie. we can feel
the effect they trigger in our whole body, detected via our internal
monitoring of physical body condition.

Given this, unless AGIs are also programmed for thoughts or goal
satisfactions to trigger 'physical' and/or other forms of systemic reaction,
I suppose their emotions will have a lot less 'feeling' depth to them than
humans and other biological species experience.

Cheers, Philip







RE: [agi] One AGI or many?

2004-01-29 Thread J. W. Johnston
Finally got around to skimming the referenced paper. As with a lot of Ben's
stuff, I found it quite readable and interesting. (It was especially a good
reminder of key Novamente concepts for those of us who only incompletely
waded through the more comprehensive documentation in the past :-)

Two quick comments...

1.  Haven't purged the bogus [comment] label yet :-(

2.  Have you considered, and/or does Novamente support, non-atomic memory
structures? In particular, episodic-type memories. In my IVI architecture,
posted here a while back, I suggested a Memory Subsystem consisting (mainly)
of a Knowledge Base (a compiled concepts/semantic net-type system) AND
Memory Files (raw video, sound, and other sense data).

It seems to me that when you start talking about distributed AGIs talking
Psynese, it might be worthwhile to support the transfer of Raw/Episodic
Memories as well. For instance: has your favorite cluster seen Lord of
the Rings III yet? :-) Or, maybe more to your point, being able to send
videos from a ShapeWorld UI session between distributed AGIs.

The best Memory Files would contain raw but synchronized vision
(electromagnetic), sound (vibration), chemical (scents, tastes), motion,
and other packaged sense data. It might be useful for distributed AGIs to
digest THESE sources for fairly unambiguous compilation into their own
Atoms/Maps/KBs.
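
As a rough sketch of what I mean by a Memory File, something along these
lines would do. (Field names and layout are just my own illustration, not
an actual Novamente or IVI data structure.)

# A synchronized, multi-modal episodic memory record.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SensorFrame:
    timestamp_ms: int      # shared clock keeps the streams synchronized
    modality: str          # e.g. "vision", "sound", "chemical", "motion"
    payload: bytes         # raw sense data for this frame

@dataclass
class MemoryFile:
    episode_id: str
    frames: List[SensorFrame] = field(default_factory=list)
    annotations: Dict[str, str] = field(default_factory=dict)

    def record(self, timestamp_ms: int, modality: str, payload: bytes) -> None:
        self.frames.append(SensorFrame(timestamp_ms, modality, payload))

    def window(self, start_ms: int, end_ms: int) -> List[SensorFrame]:
        # All modalities for one time slice, e.g. to ship an episode to
        # another AGI so it can compile the data into its own Atoms/Maps/KB.
        return [f for f in self.frames if start_ms <= f.timestamp_ms < end_ms]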

J. W. Johnston

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Ben Goertzel
Sent: Thursday, January 15, 2004 3:02 PM
To: [EMAIL PROTECTED]
Subject: RE: [agi] One AGI or many?



This theme of partial mind-melds between future AI's, leading to a
kind of hybrid between a society and an individual, was discussed in a
paper I wrote last year, which was (I think) briefly discussed on this
list.

I called this kind of hybrid being a mindplex; see:

http://www.goertzel.org/dynapsyc/2003/mindplex.htm

This paper will be published in the proceedings of the 2001 Global Brain
conference, one day...

-- Ben


 Maybe the most successful approach will be a community of individual 
 AGIs that can specialise but that also engage in exchanges of 
 data/knowledge and can also do partial mind-melds when super-mentation
 is required.
 
 Cheers, Philip
 
 
 
 I think that this last choice is the correct one.  In fact it will be 
 forced, basically due to the speed of light limitation in the 
 transmission of signals.  But note that there will be broadband 
 connections between the separate members of the community.  Thus it 
 will in a sense be one individual in the sense that a corporation is 
 one individual.  (Well, actually a bit more so.  They will be able to 
 engage in genuine thought transmission, where here a thought would be 
 an entire mental model of a situation, complete with desired goal 
 states and physical sensoria.  Probably bz2 is pushing the limit on 
 compression, but if they share high level mental constructs, what 
 would need to be transmitted would be analogous to the source code of 
 a program, and its data, but NOT the libraries.  This would enable a 
 real compression in the needed bandwidth.  [Note that speech among 
 humans automatically assumes some of these properties, so a verbal 
 description is much more compact than an AV recording.])  This, of 
 course, assumes that they will have identical primitives (i.e., the 
 same version of the library).
