1. Brembs and his colleagues reasoned that if fruit flies (Drosophila
melanogaster) *were simply reactive robots entirely determined by their
environment*, in completely featureless rooms they should move completely
randomly.
On 24/01/2008, Robert Wensman [EMAIL PROTECTED] wrote:
Yes, but no one has ever argued that the fly is a stateless machine. It
seems like their argument ignores the concept of internal state. If they
went through all this trouble just to prove that the brains of flies have
an internal state,
On Jan 24, 2008 4:14 PM, Bob Mottram [EMAIL PROTECTED] wrote:
I don't think anyone with knowledge of insect nervous systems would
argue that they're stateless machines. Even simple invertebrates such
as slugs can exhibit classical conditioning effects, which means that at
least some minimal state is retained.
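Mottram's point - that classical conditioning implies retained state - can be sketched in a few lines. This is my illustration, not code from the thread; the function name and constants are hypothetical. A single association weight, updated Rescorla-Wagner-style, is exactly the "minimal state" at issue:

```python
# Minimal sketch of classical conditioning as retained internal state.
# A single association strength w between a conditioned stimulus (CS)
# and an unconditioned stimulus (US) is updated on each pairing.
# Names and constants are illustrative, not from any model in the thread.

def condition(trials, alpha=0.3, lam=1.0):
    """Rescorla-Wagner-style update: w moves toward lam on each CS+US pairing."""
    w = 0.0  # the 'minimal state' - without it, every trial would look identical
    history = []
    for cs, us in trials:
        if cs:
            w += alpha * ((lam if us else 0.0) - w)
        history.append(w)
    return history

# Ten CS+US pairings: the response to the same stimulus grows across trials,
# which a stateless (purely reactive) machine could not exhibit.
h = condition([(True, True)] * 10)
print(h[0] < h[-1])  # True
```

The point is not the particular learning rule; any behaviour that depends on trial history requires at least one retained variable.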
I think a more precise way to phrase what they showed,
philosophically, would be like this:

Very likely, to the extent that flies are conscious, they have a
SUBJECTIVE FEELING of possessing free will.

In other words, flies seem to possess the same kind of internal
spontaneity-generation that we possess, and that we associate with our
subjectively experienced feeling of free will.

-- Ben G

To me the idea of free will suggests that a
To clarify further:
Suppose you are told to sit still for a while, and then move your hand
suddenly
You and others are right in that Brembs was perhaps confused about the
difference between spontaneity and free will.
But perhaps the experiment, in demonstrating spontaneity, does weigh against
the idea of the fly being programmed?
On Jan 24, 2008 5:35 PM, Mike Tintner [EMAIL PROTECTED] wrote:
But perhaps the experiment, in demonstrating spontaneity, does weigh against
the idea of the fly being programmed?
What does this idea state? What do you mean when you say that
something is programmed? Can you provide examples of
----- Original Message -----
From: Bob Mottram [EMAIL PROTECTED]
To: agi@v2.listbox.com
Sent: Thursday, January 24, 2008 8:14:09 AM
Subject: Re: [agi] Study hints that fruit flies have free will
I don't think anyone with knowledge of insect nervous systems would
argue that they're stateless machines.
Matt Mahoney wrote:
--- Richard Loosemore [EMAIL PROTECTED] wrote:
The problem with the scenarios that people imagine (many of which are
Nightmare Scenarios) is that the vast majority of them involve
completely untenable assumptions. One example is the idea that there
will be a situation in
That there is some series of instructions, contained presumably in neurons
(or in a computer) which produces a consistent series of
movements/thoughts/actions in a family of situations. So when you/I write
when it is almost certainly a programmed action, which can be and is
automatically
The question vis-a-vis the fly - or any animal - is whether the *whole*
course of action of the fly in that experiment can be accounted for by one -
or a set of - programmed routines or programs, period. My impression -
without having studied the experiment in detail - is that it weighs against
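Tintner's notion of "programmed" - a fixed series of instructions producing a consistent course of action across a family of situations - can be made concrete with a toy sketch. The situation and routine names here are my own invention, purely illustrative:

```python
# Toy sketch of a 'programmed' agent in Tintner's sense: a fixed mapping
# from situation to a canned routine. Situation and action names are
# illustrative only.

ROUTINES = {
    "light_on": ["orient", "approach"],
    "light_off": ["pause", "turn"],
    "shadow_overhead": ["freeze", "escape"],
}

def programmed_agent(situation):
    # Stateless and deterministic: the same situation always triggers
    # the same series of actions.
    return ROUTINES.get(situation, ["idle"])

# The hallmark of 'programmed' behaviour is perfect repeatability:
print(programmed_agent("shadow_overhead"))  # ['freeze', 'escape'] every time
```

The question in the thread is whether the fly's whole course of action in the experiment reduces to something of this shape, not whether any single reflex does.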
I take your general point re how complex systems can produce apparently
spontaneous behaviour.
But to what actual courses of action of actual animals (such as the fly
here) or humans has this theory been successfully applied?
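The claim that complex systems can produce apparently spontaneous behaviour has a standard minimal illustration (my example, not one discussed in the thread): a deterministic map in its chaotic regime, where a microscopic difference in internal state grows until two runs look unrelated.

```python
# Logistic map x -> r*x*(1-x) at r = 4.0: every step follows a fixed rule,
# yet trajectories from nearly identical starting states soon diverge,
# so the behaviour looks spontaneous to an outside observer.

def trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.200000)
b = trajectory(0.200001)  # initial states differ by one part in a million
# Early steps agree closely; later steps typically bear no resemblance.
print(abs(a[3] - b[3]), max(abs(x - y) for x, y in zip(a, b)))
```

This shows only that determinism is compatible with apparent spontaneity; it says nothing about whether any actual animal works this way, which is exactly the open question being asked.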
Ben: The question vis-a-vis the fly - or any animal - is whether
If you're asking whether there are accurate complex-systems simulations
of whole animals, there aren't yet ...
At present, we lack instrumentation capable of gathering detailed data about
how animals work; and we lack computers powerful enough to run such
simulations (though some supercomputers
Theory suggests that such simulations will be possible, but it hasn't been
proved conclusively ... so I guess you can still maintain some kind of
vitalism for a couple of decades or so if you really want to ;-)
Possible major misunderstanding: I am not in any shape or form a vitalist.
My argument is solely about whether a thinking machine (brain or computer)
has to be instructed to think rigidly or freely, with or without prior
rules - and whether, with the class of problems that come under AGI,
Randall Randall wrote:
On Jan 24, 2008, at 10:25 AM, Richard Loosemore wrote:
On Jan 24, 2008 6:30 PM, Mike Tintner [EMAIL PROTECTED] wrote:
On Jan 24, 2008 4:29 AM, Matt Mahoney [EMAIL PROTECTED] wrote:
Just about all humans claim to have an awareness of sensations, thoughts, and
feelings, and control over decisions they make, what we commonly call
consciousness. A P-zombie would make such claims too (because by definition a
On Jan 24, 2008 11:28 PM, Matt Mahoney [EMAIL PROTECTED] wrote:
Episodic memory is an aspect of belief in consciousness. Consciousness does
not exist.
OK, thank you, now I understand what you are talking about. You use
'belief in consciousness' to designate behavioral patterns that are
--- Richard Loosemore [EMAIL PROTECTED] wrote:
Matt Mahoney wrote:
Because recursive self improvement is a competitive evolutionary process
even if all agents have a common ancestor.
As explained in a parallel post: this is a non sequitur.
OK, consider a network of agents, such as my