Mike Tintner, et al,
After failing to get ANY response to what I thought was an important point
(*Paradigm Shifting regarding Consciousness*), I went back through my AGI
inbox to see what other postings by others weren't getting any responses.
Mike Tintner was way ahead of me in no-response
Hi Steve,
I'm thinking about the Texai bootstrap dialog system, and in particular about
adding grammar rules and vocabulary for the utterance "Compile a class".
Cheers.
-Steve
Stephen L. Reed
Artificial Intelligence Researcher
http://texai.org/blog
http://texai.org
3008 Oak Crest Ave.
Austin,
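The grammar-rule idea in Steve's message can be illustrated with a toy sketch. This is not Texai's actual rule format (the message does not show it); it is a hypothetical imperative Verb-Determiner-Noun pattern with a small made-up vocabulary, just to make the "grammar rules and vocabulary" notion concrete:

```python
# Toy illustration of a grammar rule for the utterance "Compile a class".
# NOT Texai's real rule format -- a hypothetical Verb Det Noun pattern.

VERBS = {"compile"}
DETERMINERS = {"a", "an", "the"}
NOUNS = {"class", "method", "file"}

def parse_imperative(utterance):
    """Return (verb, noun) if the utterance matches 'Verb Det Noun', else None."""
    words = utterance.lower().rstrip(".").split()
    if (len(words) == 3 and words[0] in VERBS
            and words[1] in DETERMINERS and words[2] in NOUNS):
        return (words[0], words[2])
    return None

print(parse_imperative("Compile a class"))  # ('compile', 'class')
```

A real dialog system would of course carry far richer rules and a lexicon; the point is only that recognizing one new utterance means adding both a syntactic pattern and the vocabulary items it mentions.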
Mike Tintner [mailto:[EMAIL PROTECTED]] wrote:
And that's the same mistake people are making with AGI generally - no one
has a model of what general intelligence involves, or of the kind of
problems it must solve - what it actually DOES - and everyone has left that
till later, and is instead
Hi Steve,
I'm thinking about the solution to the Friendliness problem, and in
particular desperately need to finish my paper on it for the AAAI Fall
Symposium that is due by next Sunday.
What I would suggest, however, is that quickly formatted e-mail postings are
exactly the wrong method for
Steve,
Those of us w/ experience in the field have heard the objections you
and Tintner are making hundreds or thousands of times before. We have
already processed the arguments you're making and found them wanting.
And we have already gotten tired of arguing those same points, back in
our
Steve,
A quick response for now. I was going to reply to an earlier post of yours, in
which you made the most important point for me:
The difficulties in proceeding in both neuroscience and AI/AGI are NOT a lack
of technology or clever people to apply it, but rather a lack of
understanding
The truth is, one of the big problems in
the field is that nearly everyone working on a concrete AI system has
**their own** particular idea of how to do it, and wants to proceed
independently rather than compromising with others on various design
points. It's hardly a herd mentality -- the
2008/6/8 Ben Goertzel [EMAIL PROTECTED]:
Those of us w/ experience in the field have heard the objections you
and Tintner are making hundreds or thousands of times before. We have
already processed the arguments you're making and found them wanting.
I entirely agree with this response. To
The abnormalis sapiens Herr Doktor Steve Richfield wrote:
Hey you guys with some gray hair and/or bald spots,
WHAT THE HECK ARE YOU THINKING?
prin Goertzel genesthai, ego eimi
(Before Goertzel came to be, I am.)
http://www.scn.org/~mentifex/mentifex_faq.html
My hair is graying so much and such a Glatze [bald patch] is beginning,
that I
Steve Richfield wrote
In short, most people on this
list appear to be interested only in HOW to straight-line program an AGI
(with the implicit assumption that we operate anything at all like we
appear
to operate), but not in WHAT to program, and most especially not in any
apparent
Steve Richfield asked:
Hey you guys with some gray hair and/or bald spots, WHAT THE HECK ARE YOU
THINKING?
We're thinking "Don't feed the Trolls!"
Ben and Mike,
WOW, two WONDERFUL in-your-face postings that CLEARLY delimit a central AGI
issue. Since my original posting ended with a question and Ben took a shot
at the question, I would like to know a little more...
On 6/8/08, Ben Goertzel [EMAIL PROTECTED] wrote:
Those of us w/ experience
Gary Miller writes:
We're thinking "Don't feed the Trolls!"
Yeah, typical trollish behavior -- upon failing to stir the pot with one
approach, start adding blanket insults. I put Steve Richfield in my killfile a
week ago or so, but I went back to the archive to read the message in question.
From: Dr. Matthias Heger [mailto:[EMAIL PROTECTED]
The problem of consciousness is not only a hard problem because of unknown
mechanisms in the brain but it is a problem of finding the DEFINITION of
necessary conditions for consciousness.
I think, consciousness without intelligence is not
From: A. T. Murray [mailto:[EMAIL PROTECTED]
The abnormalis sapiens Herr Doktor Steve Richfield wrote:
Hey you guys with some gray hair and/or bald spots,
WHAT THE HECK ARE YOU THINKING?
prin Goertzel genesthai, ego eimi
(Before Goertzel came to be, I am.)
http://www.scn.org/~mentifex/mentifex_faq.html
My hair
John G. Rose [mailto:[EMAIL PROTECTED]] wrote:
For general intelligence some components and sub-components of consciousness
need to be there and some don't. And some could be replaced with a human
operator as in an augmentation-like system. Also some components could be
designed drastically
While the details vary widely, Mike and I were addressing the very concept
of writing code to perform functions (e.g. thinking) that apparently
develop on their own as emergent properties, and in the process foreclosing
on many opportunities, e.g. developing in variant ways to address problems
From: Dr. Matthias Heger [mailto:[EMAIL PROTECTED]
For general intelligence some components and sub-components of consciousness
need to be there and some don't. And some could be replaced with a human
operator as in an augmentation-like system. Also some components could be
designed
Ben: No one knows which brain functions rely on emergence to which extents ...
we're still puzzling this out even in relatively well-understood brain regions
like visual cortex. ... But, the neural structures that carry out
object-recognition may well emerge as a result of complex nonlinear
- Original Message
From: Mike Tintner [EMAIL PROTECTED]
My approach is: first you look at the problem of crossing domains in its own
terms - work out an ideal way to solve it - which will probably be close to
the way the mind does solve it - then think about how to implement your
Nothing will ever be attempted if all possible objections must be
first overcome - Dr Samuel Johnson
-- Ben G
On Mon, Jun 9, 2008 at 7:41 AM, Jim Bromer [EMAIL PROTECTED] wrote:
- Original Message
From: Mike Tintner [EMAIL PROTECTED]
My approach is: first you look at the problem
J. Andrew Rogers wrote:
On Jun 7, 2008, at 5:06 PM, Richard Loosemore wrote:
But that is a world away from the idea that neurons, as they are, are
as simple as transistors. I do not believe this was a simple
misunderstanding on my part: the claim that neurons are as simple as
transistors
On Jun 8, 2008, at 7:27 PM, Richard Loosemore wrote:
I directly and exactly *quoted* several passages that you wrote.
And completely ignored both the context and intended semantics. Hence
why I might be under the impression that there is a reading
comprehension issue.
But enough of
Steve Richfield wrote:
Mike Tintner, et al,
After failing to get ANY response to what I thought was an important
point (*Paradigm Shifting regarding Consciousness*), I went back through
my AGI inbox to see what other postings by others weren't getting any
responses. Mike Tintner was way
Regarding how much of the complexity of real neurons we would need to
put into a computational neural net model in order to make a model
displaying a realistic emulation of neural behavior -- the truth is
we JUST DON'T KNOW.
Izhikevich, for instance,
But enough of that, let's get to the meat of it: Are you arguing that the
function that is a neuron is not an elementary operator for whatever
computational model describes the brain?
We don't know which function that describes a neuron we need to use --
are Izhikevich's nonlinear dynamics
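For readers unfamiliar with the model under debate, here is a minimal sketch of the Izhikevich (2003) spiking-neuron equations that the exchange refers to. The parameter values (a, b, c, d) are the standard "regular spiking" set from Izhikevich's paper, not anything specified in this thread, and the simple Euler integration is an assumption for illustration:

```python
# Minimal sketch of the Izhikevich spiking-neuron model:
#   dv/dt = 0.04*v^2 + 5*v + 140 - u + I
#   du/dt = a*(b*v - u)
#   if v >= 30 mV: v <- c, u <- u + d   (spike and reset)
# Parameters are the standard "regular spiking" set from the 2003 paper.

def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5, steps=2000):
    """Euler-integrate the model for `steps` steps; return the spike count."""
    v, u = c, b * c          # start at the reset potential
    spikes = 0
    for _ in range(steps):
        v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:        # spike threshold reached: reset and count
            v, u = c, u + d
            spikes += 1
    return spikes

print(izhikevich())  # a constant input current I=10 produces repeated spikes
```

Even this two-variable caricature reproduces many firing patterns of real neurons, which is exactly why it figures in the argument over how much neuronal complexity an AGI-relevant model actually needs.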
John G. Rose wrote:
[...]
Hey you guys with some gray hair and/or bald spots,
WHAT THE HECK ARE YOU THINKING?
prin Goertzel genesthai, ego eimi
Before Goertzel came to be, I am. (a Biblical allusion in Greek :-)
http://www.scn.org/~mentifex/mentifex_faq.html
The above link is an
John G. Rose wrote:
Does this mean that now maybe you can afford to integrate
some AJAX into that JavaScript AI mind of yours?
John
No, because I remain largely ignorant of Ajax.
http://mind.sourceforge.net/Mind.html
and the JavaScript Mind User Manual (JMUM) at
I don't think anyone anywhere on this list ever suggested time sequential
was required for consciousness. Now as data streams in from sensory
receptors that initially is time sequential. But as it is processed that
changes to where time is changed. And time is sort of like an index eh? Or
is time