Mike,
On 5/28/08, Mike Tintner [EMAIL PROTECTED] wrote:
Steve: I have been advocating fixing the brain shorts that lead to
problems, rather than jerking the entire world around to make brain shorted
people happy.
Which brain shorts? IMO the brain's capacity for shorts in one situation is
From: Matt Mahoney [mailto:[EMAIL PROTECTED]
--- John G. Rose [EMAIL PROTECTED] wrote:
Consciousness with minimal intelligence may be easier to build than general
intelligence. General intelligence is the one that takes the resources.
A general consciousness algorithm, one that creates
On Thu, May 29, 2008 at 6:41 PM, John G. Rose [EMAIL PROTECTED] wrote:
How can the two terms be equivalent? Some may think that they are
inseparable, or that one cannot exist without the other, I can understand
that perspective. But there is a quantitative relationship between the two.
When
Brad Paulsen wrote:
Fellow AGI-ers,
At the risk of being labeled the list's newsboy...
U.S. Plan for 'Thinking Machines' Repository
Posted by samzenpus on Wednesday May 28, @07:19PM
from the save-those-ideas-for-later dept.
An anonymous reader writes Information scientists organized by the
I have higher hopes for the project than Richard, failing to see the
circular causality alluded to... First, human intellect is quickly
overwhelmed when trying to build logic structures with complex relationships,
or even many simple relationships strung together (we max out at four or five
recursions
Richard: Interesting, but I am afraid that whenever I see someone report a
project to collect all the world's knowledge in a nice, centralized format
(Cyc, and Daughters-of-Cyc) I cannot help but think of one of the early
chapters in Neal Stephenson's Quicksilver, where Wilkins, Leibnitz and
others
--- Tudor Boloni [EMAIL PROTECTED] wrote:
as a side note, does anyone else feel that intelligence and compression
(or less formally the ability to summarize) are identical?
Yes, http://cs.fit.edu/~mmahoney/compression/rationale.html
See also Hutter's work on AIXI, which proves the equivalence.
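The compression-prediction link behind Mahoney's rationale page can be sketched numerically. This is a minimal illustration, not from the thread itself: it assumes an ideal coder whose cost per symbol is -log2 p bits (the Shannon / arithmetic-coding limit), and shows that a model which predicts the data better yields a shorter code, i.e. better compression tracks better prediction.

```python
import math
from collections import Counter

def code_length_bits(text, model):
    """Ideal code length in bits when each character c is coded with
    -log2(model[c]) bits, the Shannon / arithmetic-coding limit."""
    return sum(-math.log2(model[c]) for c in text)

text = "abracadabra"
alphabet = sorted(set(text))  # 5 distinct characters

# Uniform model: knows the alphabet but nothing about the data's statistics.
uniform = {c: 1.0 / len(alphabet) for c in alphabet}

# Empirical model: predicts each character with its true frequency.
counts = Counter(text)
empirical = {c: counts[c] / len(text) for c in alphabet}

# The better predictor compresses better:
# uniform costs 11 * log2(5) ~ 25.5 bits; the empirical model costs less.
print(code_length_bits(text, uniform))
print(code_length_bits(text, empirical))
```

The same comparison is what compression benchmarks measure in practice: ranking models by compressed size is ranking them by predictive accuracy on the data.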