Perhaps the sub-groups/sub-lists could adopt the "custom" of marking it
in the subject: say, if a post were about analog AI, having the subject
start with "[Analog]". I'm not talking about anything as formal as a
guideline, and definitely not a requirement, just something so that
those not interested could set their mail filters to mark such posts as
already read.
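The filtering logic is simple enough to sketch. Tag names and example subjects below are hypothetical; any real mail client would express this as a filter rule rather than code:

```python
# Sketch of the proposed subject-tag convention (hypothetical tags).
SUBTOPIC_TAGS = ("[Analog]",)  # tags a reader wants auto-marked as read

def should_mark_read(subject: str) -> bool:
    """Return True if the subject starts with a muted sub-topic tag,
    ignoring a leading 'Re:' added by replies."""
    s = subject.strip()
    if s.lower().startswith("re:"):
        s = s[3:].strip()
    return s.startswith(SUBTOPIC_TAGS)

print(should_mark_read("[Analog] memristor question"))      # True
print(should_mark_read("Re: [Analog] memristor question"))  # True
print(should_mark_read("Posting guidelines"))               # False
```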
On 07/08/2012 08:15 AM, Ben Goertzel wrote:
In general, I think it would be good if subgroups of people sharing
certain AI intuitions could carry out a discussion on this list, with
others listening in and contributing occasionally, but with others NOT
repetitively chiming into the discussion with comments of the basic
meaning "By the way, I told you guys 100 times before that your
paradigm sucks, so why do you keep on pursuing it?!"
For example, I would be happy to listen in on others' discussions on
analog computing approaches to AGI, making technical comments or
asking technical questions occasionally; and I would not feel the need
to interrupt these discussions repeatedly with comments of the form
"Why don't you guys adopt my preferred AGI paradigm instead!!"
This is almost making me feel motivated to create a set of posting
guidelines for the list ;p .. but, not quite...
-- Ben G
On Sun, Jul 8, 2012 at 10:51 PM, Russell Wallace
<[email protected] <mailto:[email protected]>> wrote:
On Sat, Jul 7, 2012 at 12:11 AM, Steve Richfield
<[email protected] <mailto:[email protected]>> wrote:
OK, perhaps we should just stay here and distinguish "weak
AGI", where people attempt to somehow leverage data-point
computation into an intelligent process, as now seems to be the
norm on this forum, from "strong AGI", where we attempt to move
up to whatever metalevel is at least as high as the one our brains
operate on, and which could also conceivably be performed by
plausibly manufacturable hardware, albeit nothing like
present CPUs.
Any problem with those terms?
Yes, 'strong AI' already has an established meaning, denoting the
aim of producing a fully human-level mind (by whatever method), as
opposed to 'weak AI', which merely aims to make computers smarter
and more useful than they currently are.
Besides, you don't exactly need a PhD in psychology to figure out
that many people will object to the word 'weak' being applied to
their line of research! Personally I don't care about that so much
as about the fact that your proposed usage is highly uninformative.
Until you get enough like-minded people to start a separate
mailing list, I would recommend coming up with a more descriptive
term for your proposed line of research.
*AGI* | Archives <https://www.listbox.com/member/archive/303/=now>
<https://www.listbox.com/member/archive/rss/303/212726-11ac2389> |
Modify <https://www.listbox.com/member/?&> Your Subscription
[Powered by Listbox] <http://www.listbox.com>
--
Ben Goertzel, PhD
http://goertzel.org
"My humanity is a constant self-overcoming" -- Friedrich Nietzsche
--
Charles Hixson