Alan,
Several people whose opinions I respect have asked me to unsub you from
this e-mail list, because they perceive your recent e-mails as having a very
low signal-to-noise ratio.
I prefer to be accepting rather than to ban people from the list.
However, I'm going to have to ask you to coo
ATTN: Members of the Singularity Action Group and board of directors. If
a representative minority of the members of the Singularity Action Group
(at least 1) does not show up in either #accelerating or #extropy on
irc.extropy.org by midnight Sunday, I will declare the Singularity
Action Group
Olivier - (JConcept / Cetip) wrote:
> I think you are trying to resolve a very high-level problem. Would it
> not be simpler to start by resolving basic problems that an animal has,
> such as hunger and thirst, and try to build a system to model a simple
> animal? After this we can talk about more sophi
Alan,
I think you are trying to resolve a very high-level problem. Would it not be
simpler to start by resolving basic problems that an animal has, such as
hunger and thirst, and try to build a system to model a simple animal?
After this we can talk about more sophisticated systems?
You are saying that
On Tue, 14 Jan 2003, Pei Wang wrote:
> I'm working on a paper comparing predicate logic and term logic. One
> argument I want to make is that it is hard to do inference on uncountable
> nouns in predicate logic, such as to derive "Rain-drop is a kind of liquid"
> from "Water is a kind of liquid" and
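The kind of inference Pei is describing is natural in a term logic, where statements are inheritance relations ("S is a kind of P") and deduction is just syllogistic chaining. A minimal sketch, with hypothetical predicate names not taken from his paper:

```python
# Hypothetical sketch of term-logic deduction: statements are
# (subject, predicate) inheritance pairs, and the deduction rule
# chains them: (S -> M) and (M -> P) yields (S -> P).

def deduce(facts):
    """Close a set of inheritance pairs under transitivity."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for (s, m1) in list(derived):
            for (m2, p) in list(derived):
                if m1 == m2 and (s, p) not in derived:
                    derived.add((s, p))  # new syllogistic conclusion
                    changed = True
    return derived

facts = {("rain-drop", "water"), ("water", "liquid")}
print(("rain-drop", "liquid") in deduce(facts))  # True
```

In predicate logic the same step needs an explicit axiom quantifying over portions of a substance, which is where uncountable nouns become awkward.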
Olivier - (JConcept / Cetip) wrote:
> But why is it necessary to reproduce the way our brain works internally
> to build an intelligent system?
On one level, it would be very advantageous to replace biological
architectures with much more powerful/scalable/reliable/efficient
approaches.
On the
Tony Lofthouse wrote:
> It seems that your 'layered hierarchy' approach is very similar to Rod
> Brooks' subsumption architecture. This has been used to good effect in
> generating natural behaviours in robotics but has not been very useful
> in developing higher-level cognition.
> Or maybe you are
Alan,
I think that everything you said is right from the point of view of biology.
But why is it necessary to reproduce the way our brain works internally to
build an intelligent system?
This is a big point on which I disagree. We should try to model the
resulting behaviour but not the internal biologic be
> SYMMETRY: All output channels are associated with at least one
> input/feedback mechanism.
>
> SEMANTIC RELATIVITY: The primary semantic foundation of the system is
> the input and output systems. (Almost everything is expressed in terms
> of input and output at some level.)
>
> TEMPORALITY: Bot
It seems that your 'layered hierarchy' approach is very similar to Rod
Brooks' subsumption architecture. This has been used to good effect in
generating natural behaviours in robotics but has not been very useful
in developing higher-level cognition.
Or maybe you are suggesting something else?
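For readers unfamiliar with the comparison Tony is drawing: in a subsumption architecture, behaviour-producing layers run in parallel and higher layers suppress the outputs of lower ones. A minimal sketch, with layer names that are purely illustrative (not from either proposal):

```python
# Hypothetical sketch of a Brooks-style subsumption architecture:
# each layer maps sensor readings to an action or None, and a fixed
# priority ordering stands in for the suppression/inhibition wires.

def avoid(sensors):
    # Safety layer: turn away when an obstacle is detected.
    return "turn-away" if sensors.get("obstacle") else None

def seek_food(sensors):
    # Goal layer: approach food when it is detected.
    return "approach-food" if sensors.get("food") else None

def wander(sensors):
    # Default layer: always produces an action.
    return "move-forward"

# Earlier layers in the list suppress the ones after them.
LAYERS = [avoid, seek_food, wander]

def act(sensors):
    for layer in LAYERS:
        action = layer(sensors)
        if action is not None:
            return action

print(act({"obstacle": True}))  # turn-away
print(act({"food": True}))      # approach-food
print(act({}))                  # move-forward
```

The appeal is that each layer is a complete, testable behaviour on its own; the criticism Tony raises is that stacking such reactive layers has not, by itself, yielded higher-level cognition.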