My judgment as list moderator:

1) Discussions of particular, speculative algorithms for solving SAT
are not really germane to this list

2)  Announcements of really groundbreaking new SAT algorithms would
certainly be germane to the list

3) Discussions of issues specifically regarding the integration of SAT solvers
into AGI architectures are highly relevant to this list

4) If you think some supernatural being placed an insight in your mind, you're
probably better off NOT mentioning this when discussing the insight in a
scientific forum, as it will just cause your idea to be taken far less seriously
by the vast majority of scientifically minded people...

-- Ben G, List Owner

On Sun, Mar 30, 2008 at 4:41 PM, Mark Waser <[EMAIL PROTECTED]> wrote:
>
> I agree with Richard and hereby formally request that Ben chime in.
>
> It is my contention that SAT is a relatively narrow form of Narrow AI and
> not general enough to be on an AGI list.
>
> This is not meant, in any way shape or form, to denigrate the work that you
> are doing.  It is very important work.
>
> It's just that you're performing the equivalent of presenting a biology
> paper at a physics convention. :-)
>
>
> ----- Original Message -----
> From: Jim Bromer
> To: [email protected]
> Sent: Sunday, March 30, 2008 11:52 AM
> Subject: Re: [agi] Logical Satisfiability...Get used to it.
>
>
> > On the contrary, Vladimir is completely correct in requesting that the
> > discussion go elsewhere:  this has no relevance to the AGI list, and
> > there are other places where it would be pertinent.
> >
> >
> > Richard Loosemore
> >
>
>  If Ben doesn't want me to continue, I will stop posting to this group.
> Otherwise please try to understand what I said about the relevance of SAT to
> AGI and try to address the specific issues that I mentioned.  On the other
> hand, if you don't want to waste your time in this kind of discussion then
> do just that: Stay out of it.
> Jim Bromer
>
>  ________________________________
>
>  agi | Archives | Modify Your Subscription



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

"If men cease to believe that they will one day become gods then they
will surely become worms."
-- Henry Miller

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
http://www.listbox.com/member/?member_id=8660244&id_secret=98558129-0bdb63
Powered by Listbox: http://www.listbox.com
