Benjamin Goertzel wrote:
I think that if it were dumb enough that it could be treated as a tool,
then it would have to not be able to understand that it was being used as
a tool.
And if it could not understand that, it would simply have no hope of
being generally intelligent.
You seem to be
Edward W. Porter wrote:
In response to Richard Loosemore’s Post of Sun 11/4/2007 12:15 PM
responding to my prior message of Sat 11/3/2007 3:28 PM
ED’s prior msg For example, humans might for short sighted personal
gain (such as when using them in weapon systems)
RL Whoaa! You assume that
Richard Loosemore wrote:
Charles D Hixson wrote:
Richard Loosemore wrote:
Edward W. Porter wrote:
Richard in your November 02, 2007 11:15 AM post you stated:
...
I think you should read some stories from the 1930's by John W.
Campbell, Jr. Specifically the three stories collectively
Charles D Hixson wrote:
Richard Loosemore wrote:
Charles D Hixson wrote:
Richard Loosemore wrote:
Edward W. Porter wrote:
Richard in your November 02, 2007 11:15 AM post you stated:
...
I think you should read some stories from the 1930's by John W.
Campbell, Jr. Specifically the three
On Sat, Nov 03, 2007 at 01:17:03PM -0400, Richard Loosemore wrote:
Isn't there a fundamental contradiction in the idea of something that
can be a tool and also be intelligent? What I mean is, is the word
tool usable in this context?
In the 1960's, there was an expression you're just a
Hi,
On Sat, Nov 03, 2007 at 01:41:30AM -0400, Philip Goetz wrote:
Why don't you describe what you've done in more detail, e.g., what
parser you're using, and how you hooked it up to Cyc?
I randomly selected the link grammar parser
http://www.link.cs.cmu.edu/link/ for the parser, although
--- Linas Vepstas [EMAIL PROTECTED] wrote:
I randomly selected the link grammar parser
http://www.link.cs.cmu.edu/link/ for the parser,
It still has a few bugs.
(S (NP I)
(VP ate pizza
(PP with
(NP pepperoni)))
.)
(S (NP I)
(VP ate pizza
(PP with
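The quoted parse illustrates the classic PP-attachment ambiguity: the parser hangs the PP "with pepperoni" off the verb phrase (as if pepperoni were a dining companion) rather than off "pizza" (pepperoni as a topping), which seems to be the bug being pointed out. A minimal sketch of the two readings, using plain Python tuples as an illustrative stand-in for the parser's output format (this is not the Link Grammar API):

```python
# Constituent trees as nested tuples: (label, child, child, ...),
# where a child is either a word (str) or a subtree (tuple).

def show(tree):
    """Render a tuple tree in the bracketed (S (NP ...) ...) style."""
    if isinstance(tree, str):
        return tree
    label, *children = tree
    return "(" + label + " " + " ".join(show(c) for c in children) + ")"

# VP attachment (what the parser produced): "with pepperoni"
# modifies the eating, as if pepperoni were a dining companion.
vp_attach = ("S", ("NP", "I"),
             ("VP", "ate", "pizza",
              ("PP", "with", ("NP", "pepperoni"))))

# NP attachment (the intended reading): "with pepperoni"
# modifies "pizza", i.e. a pizza topped with pepperoni.
np_attach = ("S", ("NP", "I"),
             ("VP", "ate",
              ("NP", "pizza", ("PP", "with", ("NP", "pepperoni")))))

print(show(vp_attach))  # (S (NP I) (VP ate pizza (PP with (NP pepperoni))))
print(show(np_attach))  # (S (NP I) (VP ate (NP pizza (PP with (NP pepperoni)))))
```

Both structures use the same words; only the bracketing differs, which is why "I ate pizza with Bob" later in the thread is funny when given the pepperoni-style bracketing.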
Jiri Jelinek wrote:
Richard,
Question: do you believe it will really be possible to build something
that is completely intelligent -- smart enough to understand humans in
such a way as to have conversations on the subtlest of subjects, and
being able to understand the functions of things in
On Mon, Nov 05, 2007 at 11:11:41AM -0800, Matt Mahoney wrote:
--- Linas Vepstas [EMAIL PROTECTED] wrote:
I randomly selected the link grammar parser
http://www.link.cs.cmu.edu/link/ for the parser,
It still has a few bugs.
(S (NP I)
(VP ate pizza
(PP with
(NP
Matt,
We can compute behavior, but nothing indicates we can compute
feelings. Qualia research needed to figure out new platforms for
uploading.
Regards,
Jiri Jelinek
On Nov 4, 2007 1:15 PM, Matt Mahoney [EMAIL PROTECTED] wrote:
--- Jiri Jelinek [EMAIL PROTECTED] wrote:
Matt,
Create a
On Mon, Nov 05, 2007 at 03:17:13PM -0600, Linas Vepstas wrote:
On Mon, Nov 05, 2007 at 11:11:41AM -0800, Matt Mahoney wrote:
--- Linas Vepstas [EMAIL PROTECTED] wrote:
I randomly selected the link grammar parser
http://www.link.cs.cmu.edu/link/ for the parser,
It still has a few
On Nov 4, 2007 12:40 PM, Matt Mahoney [EMAIL PROTECTED] wrote:
--- Jiri Jelinek [EMAIL PROTECTED] wrote:
If you can't get meaning from a clean input format, then what makes you
think you can handle NL?
Humans seem to get meaning more easily from ambiguous statements than from
mathematical
Monika Krishan wrote:
2. Would it be a worthwhile exercise to explore what Human General
Intelligence, in its present state, is capable of?
Nah.
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
Richard Loosemore wrote:
Charles D Hixson wrote:
Richard Loosemore wrote:
Charles D Hixson wrote:
Richard Loosemore wrote:
Edward W. Porter wrote:
Richard in your November 02, 2007 11:15 AM post you stated:
...
In parents, sure, those motives exist.
But in an AGI there is no earthly
Matt Mahoney wrote:
--- Linas Vepstas [EMAIL PROTECTED] wrote:
...
It still has a few bugs.
...
(S (NP I)
(VP ate pizza
(PP with
(NP Bob)))
.)
My name is Hannibal Lecter.
...
-- Matt Mahoney, [EMAIL PROTECTED]
(Hannibal Lecter was a movie cannibal)
--- Monika Krishan [EMAIL PROTECTED] wrote:
Hi All,
I'm new to the list, so I'm not sure if these issues have already been
raised.
1. Do you think AGIs will eventually reach a point in their evolution when
self improvement might come to mean attempting to solve previously solved