On 18 March 2010 16:36, Brent Meeker meeke...@dslextreme.com wrote:
Is it coherent to say a black box accidentally reproduces the I/O? It is
over some relatively small number of I/Os, but over a large enough number
and range to sustain human behavior - that seems very doubtful. One would
On 18 Mar 2010, at 07:01, Stathis Papaioannou wrote:
On 18 March 2010 16:36, Brent Meeker meeke...@dslextreme.com wrote:
Is it coherent to say a black box accidentally reproduces the I/O? It is
over some relatively small number of I/Os, but over a large enough number
and range to
On 17 Mar 2010, at 18:34, Brent Meeker wrote:
On 3/17/2010 3:34 AM, Stathis Papaioannou wrote:
On 17 March 2010 05:29, Brent Meeker meeke...@dslextreme.com wrote:
I think this is a dubious argument based on our lack of understanding of
qualia. Presumably one has many thoughts that do
On 17 Mar 2010, at 18:50, Brent Meeker wrote:
On 3/17/2010 5:47 AM, HZ wrote:
I'm quite confused about the state of zombieness. If the requirement
for zombiehood is that it doesn't understand anything at all but it
behaves as if it does, what makes us not zombies? How do we know we are
not?
On 17 Mar 2010, at 19:12, Brent Meeker wrote:
On 3/17/2010 10:01 AM, Bruno Marchal wrote:
On 17 Mar 2010, at 13:47, HZ wrote:
I'm quite confused about the state of zombieness. If the requirement
for zombiehood is that it doesn't understand anything at all but it
behaves as if it does, what
Bruno,
Can you clarify the origins of the Lobian Machine? Does it arise
out of the theorem of Martin Hugo Löb? Is it shorthand for the lobes of the
human brain? What is the difference between a lobian machine and a universal
lobian machine? And how do they relate to the question
On 3/17/2010 11:01 PM, Stathis Papaioannou wrote:
On 18 March 2010 16:36, Brent Meeker meeke...@dslextreme.com wrote:
Is it coherent to say a black box accidentally reproduces the I/O? It is
over some relatively small number of I/Os, but over a large enough number
and range to sustain
Bruno and others,
Perhaps more progress can be made by avoiding self-referential
problems and viewing this issue mechanistically. Where I start: Haim
Sompolinsky, Statistical Mechanics of Neural Networks, Physics Today
(December 1988). He discussed emergent computational properties of
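The kind of emergent computation Sompolinsky's article surveys can be illustrated with a toy attractor (Hopfield-style) network: patterns stored by a Hebbian rule become fixed points, and a corrupted input relaxes back to the stored memory. This is only an illustrative sketch, not code from the article; the pattern and network size are made up.

```python
import numpy as np

def train(patterns):
    """Hebbian outer-product rule; diagonal zeroed (no self-connections)."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W / n

def recall(W, state, steps=10):
    """Iterate synchronous sign updates until the state reaches a fixed point."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Store one +/-1 pattern, then recall it from a one-bit-corrupted copy.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train(pattern[None, :])
noisy = pattern.copy()
noisy[0] *= -1  # flip one bit
print(np.array_equal(recall(W, noisy), pattern))  # True
```

The "emergence" is that memory retrieval is nowhere written into any single unit; it appears only as the collective dynamics of the coupled network, which is what makes statistical-mechanics tools applicable.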
On 3/18/2010 10:06 AM, L.W. Sterritt wrote:
Bruno and others,
Perhaps more progress can be made by avoiding self-referential
problems and viewing this issue mechanistically. Where I start: Haim
Sompolinsky, Statistical Mechanics of Neural Networks, Physics Today
(December 1988). He
On 18 March 2010 17:06, L.W. Sterritt lannysterr...@comcast.net wrote:
Perhaps more progress can be made by avoiding self-referential problems and
viewing this issue mechanistically.
Undoubtedly.
I guess I'm in the QM camp that believes that what you can measure is
what you can know. But
David,
I think that I have to agree with your comments. I do think that we
will learn something from the quest for conscious machines, perhaps
not what we had in mind.
Lanny
On Mar 18, 2010, at 10:45 AM, David Nyman wrote:
On 18 March 2010 17:06, L.W. Sterritt lannysterr...@comcast.net
Brent,
There are some quite interesting observations in the paper by Koch and
Tononi, e.g.
Remarkably, consciousness does not seem to require many of the things
we associate most deeply with being human: emotions, memory,
self-reflection, language, sensing the world and acting in it...
On 3/18/2010 12:03 PM, L.W. Sterritt wrote:
Brent,
There are some quite interesting observations in the paper by Koch and
Tononi, e.g.
Remarkably, consciousness does not seem to require many of the things
we associate most deeply with being human: emotions, memory,
self-reflection,
Brent,
This link should work. IEEE sometimes makes their articles available
to non-members and non-subscribers:
http://spectrum.ieee.org/biomedical/imaging/can-machines-be-conscious/3
If this does not work, please let me know and I'll find another path
to the article. I could also go
Brent,
I notice that the link that I forwarded opens on the 3rd page; just
select view all, toward the upper right of the page.
This brief article on consciousness as integrated information may also
be interesting:
Thanks. I got it.
Some assertions seem dubious:
Primal emotions like anger, fear, surprise, and joy are useful and
perhaps even essential for the survival of a conscious organism.
Likewise, a conscious machine might rely on emotions to make choices and
deal with the complexities of the
On 19 March 2010 04:01, Brent Meeker meeke...@dslextreme.com wrote:
On 3/17/2010 11:01 PM, Stathis Papaioannou wrote:
On 18 March 2010 16:36, Brent Meeker meeke...@dslextreme.com wrote:
Is it coherent to say a black box accidentally reproduces the I/O? It is
over some relatively small
17 matches