Steve,

Those of us w/ experience in the field have heard the objections you
and Tintner are making hundreds or thousands of times before.  We have
already processed the arguments you're making and found them wanting.
And we have already gotten tired of arguing those same points, back in
our undergrad or grad school days (or analogous time periods for those
who didn't get PhDs...).

The points you guys are making are not as original as you seem to
think.  And the reason we don't take time to argue against them in
detail is that it's boring and we're busy.  These points have already
been extensively argued by others in the published literature over the
past few decades; but I also don't want to take the time to dig up
citations for you....

I'm not saying that I have an argument in favor of my approach that
would convince a skeptic.  I know I don't.  The only argument that
will convince a skeptic is to complete a functional human-level AGI.
And even that won't be enough for some skeptics.  (Maybe a fully
rigorous formal theory of how to create an AGI with a certain
intelligence level given specific resource constraints would convince
some skeptics, but not many, I suppose -- discussions would devolve
into quibbles over the definition of intelligence, and other
particular mathematical assumptions of the sort that any formal
analysis must make.)

OK.  Back to work on the OpenCog Prime documentation, which IMO is a
better use of my time than endlessly repeating the arguments from
philosophy-of-mind and cog-sci class on an email list ;-)

Sorry if my tone seems obnoxious, but I didn't find your description
of those of us working on actual AI systems as having a "herd
mentality" very appealing.  The truth is, one of the big problems in
the field is that nearly everyone working on a concrete AI system has
**their own** particular idea of how to do it, and wants to proceed
independently rather than compromising with others on various design
points.  It's hardly a herd mentality -- the different systems out
there vary wildly in many respects.

-- Ben G

On Sun, Jun 8, 2008 at 3:28 PM, Steve Richfield
<[EMAIL PROTECTED]> wrote:
> Mike Tintner, et al,
>
> After failing to get ANY response to what I thought was an important point
> (Paradigm Shifting regarding Consciousness) I went back through my AGI inbox
> to see what other postings by others weren't getting any responses. Mike
> Tintner was way ahead of me in no-response postings.
>
> A quick scan showed that these also tended to address high-level issues that
> challenge the contemporary herd mentality. In short, most people on this
> list appear to be interested only in HOW to straight-line program an AGI
> (with the implicit assumption that we operate anything at all like we appear
> to operate), but not in WHAT to program, and most especially not in any
> apparently insurmountable barriers to successful open-ended capabilities,
> where attention would seem to be crucial to ultimate success.
>
> Anyone who has been in high-tech for a few years KNOWS that success can come
> only after you fully understand what you must overcome to succeed. Hence,
> based on my own past experience and present observations, the efforts
> here would seem to be doomed to fail - for personal if not
> for technological reasons.
>
> Normally I would simply dismiss this as a rookie error, but I know that at
> least some of the people on this list have been around as long as I have
> been, and hence they certainly should know better since they have doubtless
> seen many other exuberant rookies fall into similar swamps of programming
> complex systems without adequate analysis.
>
> Hey you guys with some gray hair and/or bald spots, WHAT THE HECK ARE YOU
> THINKING?
>
> Steve Richfield
>



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

"If men cease to believe that they will one day become gods then they
will surely become worms."
-- Henry Miller

