Re: [agi] general patterns induction

2002-12-10 Thread Pablo Carbonell
Oh, thanks a lot! Would you please email it to me?

 
 Pablo,
 
 If you are interested in Solomonoff induction and don't want to
 spend all the money to buy the book just yet, then you might be
 interested in a paper I wrote a few years back that starts from
 basic computation theory and proves all the key results.
 It's about 25 pages of mathematics in PDF format; I can email it
 to you if you like.
 
 The key points are:
 
 Solomonoff induction will learn anything that is computationally
 expressible (i.e. anything useful) with an error rate that falls
 to zero faster than 1/n, where n is the number of bits of input
 data that the system has been given.
 
 Solomonoff induction is not computable and it is difficult to
 approximate well.
 
 Most learning methods in statistics and machine learning can in
 some sense be proven to be computable approximations to Solomonoff
 induction.
 
 
 So, in short, it's an interesting theoretical model that's amazingly
 powerful, but it's not something you can directly use in practice.
 The prize for an amazingly powerful and practical system is still
 very much up for grabs :)
 
 If you're serious about data compression, also check out a book
 called Text Compression by Bell, Cleary and Witten.
 
 Cheers
 Shane
 
 



RE: [agi] Grounding

2002-12-10 Thread Ben Goertzel




 True. The more fundamental point is that symbols representing entities and
 concepts need to be grounded with (scalar) attributes of some sort.

 How this is *implemented* is a practical matter. One important
 consideration for AGI is that data is easily retrievable by vector
 distance (similarity) and that new patterns can be learned (unlearned)
 incrementally.

 Peter

Again, I agree with your general point, but I'll observe that *vector
distance* is only one among many ways of measuring similarity!

We do use vector distance for some things in Novamente, but our more
fundamental distance measure is based on what we call the inference
metric... a different way of measuring distances that still obeys the
metric space axioms, but cooperates more nicely with probabilistic
inference.
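
As a small illustration of that point, the Python sketch below compares plain
Euclidean vector distance with the Soergel (weighted Jaccard) distance, another
function that satisfies the metric-space axioms for non-negative attribute
vectors. The concept vectors and names here are made up, and the sketch does
not attempt to reproduce Novamente's inference metric.

import math

def euclidean(a, b):
    # plain vector distance: one common similarity measure, not the only one
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def soergel(a, b):
    # weighted-Jaccard (Soergel) distance: a different function that still
    # satisfies the metric-space axioms for non-negative vectors
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(max(x, y) for x, y in zip(a, b))
    return num / den if den else 0.0

# hypothetical scalar groundings for two concepts
cat = [0.9, 0.1, 0.8]
dog = [0.8, 0.2, 0.7]
print(euclidean(cat, dog), soergel(cat, dog))

Which of the many admissible metrics cooperates best with probabilistic
inference is, as noted above, a separate design question.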

Somewhere in the future, there lies a general theory of AGI of which all our
current attempts will be comprehensible as special cases ;)

-- Ben G




Re: [agi] introduction

2002-12-10 Thread Michael Roy Ames
Damien Sullivan wrote:
 Hi!  I joined this list recently, figured I'd say who I am.  Well,
 some of you may know already, from extropians, where I used to post a
 fair bit :) or from my Vernor Vinge page.  But now I'm a first year
 comp sci/cog sci PhD student at Indiana University, hoping to work on
 extending Jim Marshall's Metacat in Hofstadter's lab.  Nothing much
 has really happened beyond hope and a few meetings and taking his
 group theory class.  I've been reading Eliezer's _Levels_ pages, and
 having Andy Clark's _Being There_ around, but mostly my life has been
 classes.  Mostly the OS class, actually.  Sigh.

 -xx- Damien X-)



Damien,

Hi.  I am quite interested in Jim's Metacat also.  It's on my To-Do list
to get it running under Linux... but the way my workload is going, I think
Jim will get his planned re-write done first. :)  It would be interesting
to hear about what new directions Metacat is going in.  Welcome to the list.

Michael Roy Ames
