Brain-computer interface and quantum robots

2009-09-10 Thread ronaldheld

 arXiv.org/abs/0909.1508
I saw the title and thought of what Bruno would make of it. Any
thoughts?



Re: Brain-computer interface and quantum robots

2009-09-10 Thread John Mikes
Ronald,
I pursue (vaguely) such developments and - though I have no intention to
outguess Bruno's opinion - find it a VERY PRACTICAL (may I call it: e-bio)
line (lineS - plural). Quite amazing results have been achieved so far in
this, IMO, totally initial phase. I can't wait to see how the
ultra-theoreticians on this list will incorporate such results into
'machine-consciousness' etc. ideas.
John M

On Thu, Sep 10, 2009 at 8:06 AM, ronaldheld ronaldh...@gmail.com wrote:


  arXiv.org/abs/0909.1508
 I saw the title and thought of what Bruno would make of it. Any
 thoughts?
 





Re: Dreaming On

2009-09-10 Thread David Nyman

2009/9/9 Flammarion peterdjo...@yahoo.com:

 What you say above seems pretty much in sympathy with the reductio
 arguments based on arbitrariness of implementation.

 It is strictly an argument against the claim that
 computation causes consciousness, as opposed
 to the claim that mental states are identical to computational
 states.

I'm not sure I see what distinction you're making.  If, as you say, the
realisation of computation in a physical system doesn't cause
consciousness, that would entail that no physically-realised
computation could be identical to any mental state. This is what
follows if one accepts the argument from MGA or Olympia that
consciousness does not attach to physical states qua computatio.

 But CTM is not engaged on such a project; in fact it entails
 the opposite conclusion: i.e. by stipulating its type-token identities
 purely functionally it requires that a homogeneous phenomenal state
 must somehow be associated with a teeming plurality of heterogeneous
 physical states.

 It doesn't suggest that any mental state can be associated with any
 physical
 state.

It doesn't need to say that to be obscure as a physical theory.  The
point is that it can ex hypothesi say nothing remotely physically
illuminating about what causes a mental state.  To say that it results
whenever a physical system implements a specific computation is to say
nothing physical about that system other than to insist that it is
'physical'.


 It has been accused of overdoing  Multiple Realisability, but MR
 can be underdone as well.

I agree.  Nonetheless, when two states are functionally equivalent one
can still say what it is about them that is physically relevant.  For
example, in driving from A to B it is functionally irrelevant to my
experience whether my car is fuelled by petrol or diesel.  But there
is no ambiguity about the physical details of my car trip or precisely
how either fuel contributes to this effect.
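
To put the multiple-realisability point in a programmer's terms - a
minimal sketch in Python, with hypothetical class names of my own
invention - two physically different engines can satisfy one functional
interface, and a description pitched only at that interface is silent
about their physics:

from abc import ABC, abstractmethod

class Engine(ABC):
    """A purely functional specification: what an engine does, not what it is."""
    @abstractmethod
    def drive(self, distance_km: float) -> str: ...

class PetrolEngine(Engine):
    def drive(self, distance_km: float) -> str:
        # Physically: spark-ignited combustion of petrol.
        return f"drove {distance_km} km on petrol"

class DieselEngine(Engine):
    def drive(self, distance_km: float) -> str:
        # Physically: compression-ignited combustion of diesel.
        return f"drove {distance_km} km on diesel"

def trip(engine: Engine) -> str:
    # Written against the functional type only; both realisations satisfy it.
    return engine.drive(42.0)

print(trip(PetrolEngine()))   # functionally equivalent ...
print(trip(DieselEngine()))   # ... though physically distinct

The interface says nothing about combustion; likewise, a purely
functional posit says nothing physical about whatever realises it.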

 Various arguments - Olympia, MGA, the Chinese Room etc. - seek to
 expose the myriad physical implausibilities consequential on such
 implementation independence.  But the root of all this is that CTM
 makes impossible at the outset any possibility of linking a phenomenal
 state to any unique, fully-explicated physical reduction.

 That's probably a good thing. We want to be able to say that
 two people with fine-grained differences in their brain structure
 can both be (for instance) apprehensive.

Yes, I agree.  But if we're after a physical theory, we also want to
be able to give in either case a clear physical account of their
apprehensiveness, which would include a physical justification of why
the fine-grained differences make no difference at the level of
experience.

 If nothing
 physical can in principle be ruled out as an explanation for
 experience,

 That isn't an implication of CTM. CTM can regard computers as
 a small subset of physical systems, and conscious computers as
 a small subset of computers.

Yes, but we needn't push "nothing physical" to the extent of random
association to make the point at issue.  The relevant point is that,
in picking out the subset of physical systems solely qua computatio,
no kind of physical realisation is capable of being ruled out in
principle.  That is unproblematic in the usual case because our
interest is restricted to the computational output of such systems,
and we are unconcerned by the physical details that occasion this.
But if we are seeking a physical explanation of consciousness, then it
is precisely the coupling of the physical process and the mental
process which requires explication in a physical theory, and this is
now obscured from any general resolution by the computational posit.

 no uniquely-justified physical explanation need - or in
 practice could - be explicated.  The detailed implausibilities
 variously invoked all fall out of this.

 I don't think unique justification is a requirement.


 So if a physical theory of mind is what is needed, CTM would seem to
 fail even as a candidate because its arbitrariness with respect to
 physical realisation renders it incapable of grounding consciousness
 in any specific fundamental physical reduction.

 MR is not complete arbitrariness.

I can only suppose that complete arbitrariness would be a random
association between physical states and mental states.  This is not
what is meant by arbitrary realisation.  What is meant is that the
requirement that a physical system be deemed conscious purely in
virtue of its implementing a computation rules out no particular kind
of physical realisation.  Consequently a theory of this type is
incapable of explicating general principles of physical-mental
association independent of its functional posit.

 If CTM had the implication that one material
 system could realise more than one computation, then there
 would be a conflict with the physical supervenience principle.

I agree.


 But CTM only has the implication that one computation
 system 

Re: Brain-computer interface and quantum robots

2009-09-10 Thread ronaldheld

I have to agree that I am curious what responses I will get from the
frequent posters.
I see this as someday being able to say, "yes, Doctor."
Ronald

On Sep 10, 9:17 am, John Mikes jami...@gmail.com wrote:
 Ronald,
 I pursue (vaguely) such developments and - though I have no intention to
 outguess Bruno's opinion - find it a VERY PRACTICAL (may I call it: e-bio)
 line (lineS - plural). Quite amazing results have been achieved so far in
 this, IMO, totally initial phase. I can't wait to see how the
 ultra-theoreticians on this list will incorporate such results into
 'machine-consciousness' etc. ideas.
 John M



 On Thu, Sep 10, 2009 at 8:06 AM, ronaldheld ronaldh...@gmail.com wrote:

   arXiv.org/abs/0909.1508
  I saw the title and thought of what Bruno would make of it. Any
  thoughts?



Re: Brain-computer interface and quantum robots

2009-09-10 Thread Brent Meeker

ronaldheld wrote:
  arXiv.org/abs/0909.1508
 I saw the title and thought of what Bruno would make of it. Any
 thoughts?
   
The authors write, "However, recent studies lead to the conclusion that 
the human mind is not a classical computer, and, in general, not 
completely reducible to any kind of computer (not even classical) 
because of the non-algorithmic nature of some mental processes."  But 
they give no reference to these recent studies.  The paper seems to be 
about well-known problems in training artificial neural networks and 
other artificial learning algorithms.  Sure, EEG is inadequate to 
define intention; there's just not much information there.  I don't see 
that as having any foundational implications.
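
For concreteness, here is a minimal sketch of the sort of pipeline at
issue - synthetic data and a hypothetical threshold, not the paper's
method.  Motor-imagery BCIs typically reduce the EEG to a band-power
feature and classify with something close to a threshold, which shows
how coarse the extracted "intention" is:

import numpy as np

fs = 256                        # sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)   # two seconds of signal

# Synthetic EEG: broadband noise plus a 10 Hz mu rhythm whose amplitude
# drops during imagined movement (event-related desynchronisation).
rng = np.random.default_rng(0)
rest    = rng.normal(0, 1, t.size) + 2.0 * np.sin(2 * np.pi * 10 * t)
imagery = rng.normal(0, 1, t.size) + 0.5 * np.sin(2 * np.pi * 10 * t)

def mu_power(x):
    """Mean spectral power in the 8-13 Hz (mu) band."""
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    return psd[(freqs >= 8) & (freqs <= 13)].mean()

THRESHOLD = 1e4   # hypothetical; a real BCI calibrates this per subject
for label, x in [("rest", rest), ("imagery", imagery)]:
    p = mu_power(x)
    verdict = "movement intended" if p < THRESHOLD else "no movement intended"
    print(f"{label}: mu-band power {p:.0f} -> {verdict}")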

Brent




Re: Dreaming On

2009-09-10 Thread Brent Meeker

David Nyman wrote:
 2009/9/9 Flammarion peterdjo...@yahoo.com:

   
 What you say above seems pretty much in sympathy with the reductio
 arguments based on arbitrariness of implementation.
   
 It is strictly an argument against the claim that
  computation causes consciousness, as opposed
 to the claim that mental states are identical to computational
 states.
 

 I'm not sure I see what distinction you're making.  If, as you say, the
 realisation of computation in a physical system doesn't cause
 consciousness, that would entail that no physically-realised
 computation could be identical to any mental state. This is what
 follows if one accepts the argument from MGA or Olympia that
 consciousness does not attach to physical states qua computatio.

   
 But CTM is not engaged on such a project; in fact it entails
 the opposite conclusion: i.e. by stipulating its type-token identities
 purely functionally it requires that a homogeneous phenomenal state
 must somehow be associated with a teeming plurality of heterogeneous
 physical states.
   
 It doesn't suggest that any mental state can be associated with any
 physical
 state.
 

 It doesn't need to say that to be obscure as a physical theory.  The
 point is that it can ex hypothesi say nothing remotely physically
 illuminating about what causes a mental state.  To say that it results
 whenever a physical system implements a specific computation is to say
 nothing physical about that system other than to insist that it is
 'physical'.

   
 It has been accused of overdoing  Multiple Realisability, but MR
 can be underdone as well.
 

 I agree.  Nonetheless, when two states are functionally equivalent one
 can still say what it is about them that is physically relevant.  For
 example, in driving from A to B it is functionally irrelevant to my
 experience whether my car is fuelled by petrol or diesel.  But there
 is no ambiguity about the physical details of my car trip or precisely
 how either fuel contributes to this effect.

   
 Various arguments - Olympia, MGA, the Chinese Room etc. - seek to
 expose the myriad physical implausibilities consequential on such
 implementation independence.  But the root of all this is that CTM
 makes impossible at the outset any possibility of linking a phenomenal
 state to any unique, fully-explicated physical reduction.
   
 That's probably a good thing. We want to be able to say that
 two people with fine-grained differences in their brain structure
 can both be (for instance) apprehensive.
 

 Yes, I agree.  But if we're after a physical theory, we also want to
 be able to give in either case a clear physical account of their
 apprehensiveness, which would include a physical justification of why
 the fine-grained differences make no difference at the level of
 experience.
   

Consider what a clear physical account of apprehensiveness might be:  
There's an increased level of brain activity similar to that caused by 
a strange sound when alone in the dark, a slight rise in adrenaline, a 
tensing of muscles that would be used to flee; brain patterns formed 
as memories while watching slasher movies become more excited.  
Fine-grained differences below these levels, as might differ in 
others, are irrelevant to the experience.  For comparison, consider a 
Mars rover experiencing apprehension: sensor signals indicate lack of 
traction, which implies likely inability to reach its next sampling 
point.  Extra battery power is put on line, and various changes in 
paths and backtracking are calculated.  Mission control is apprised.  
The soil appearance related to poor traction is entered into a 
database with a warning note.

Notice how the meaning, the content, of 'apprehension' comes from the 
context of action and purpose and interaction with an external world.  
We summarize these things with the single word 'apprehension', which 
we then take to describe a strictly internal state.  But that is 
because we have abstracted away the circumstances that give it 
meaning.  There are different circumstances that would give the same 
heightened states.
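
To see how little the functional reading requires, here is a toy sketch 
- hypothetical rover code in Python, obviously not any actual flight 
software - in which the 'apprehension' just is a cluster of 
context-directed responses:

from dataclasses import dataclass, field

@dataclass
class SensorReport:
    wheel_slip: float      # fraction of commanded motion lost to slippage
    soil_image_id: str

@dataclass
class Rover:
    apprehensive: bool = False
    hazard_db: dict = field(default_factory=dict)

    def assess(self, report: SensorReport) -> list:
        """'Apprehension' here is nothing over and above this cluster of
        context-directed responses."""
        actions = []
        # High slip implies likely inability to reach the next sampling point.
        self.apprehensive = report.wheel_slip > 0.4
        if self.apprehensive:
            actions += ["put extra battery power on line",
                        "recalculate paths and backtracking options",
                        "apprise mission control"]
            # Record the soil appearance associated with poor traction.
            self.hazard_db[report.soil_image_id] = "poor traction - warning"
        return actions

rover = Rover()
print(rover.assess(SensorReport(wheel_slip=0.6, soil_image_id="sol102_img7")))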

Brent




Re: Brain-computer interface and quantum robots

2009-09-10 Thread Bruno Marchal

On 10 Sep 2009, at 19:38, Brent Meeker wrote:


 ronaldheld wrote:
 arXiv.org/abs/0909.1508
 I saw the title and thought of what Bruno would make of it. Any
 thoughts?

 The authors write, "However, recent studies lead to the conclusion
 that the human mind is not a classical computer, and, in general, not
 completely reducible to any kind of computer (not even classical)
 because of the non-algorithmic nature of some mental processes."  But
 they give no reference to these recent studies.  The paper seems to be
 about well-known problems in training artificial neural networks and
 other artificial learning algorithms.  Sure, EEG is inadequate to
 define intention; there's just not much information there.  I don't
 see that as having any foundational implications.

I think so too. Yet the authors postulate a wave collapse, and conclude:


"The previous arguments showed that the quantum approach predicts the
possibility of a direct action of mind on matter."

Just an old idea, it seems to me.

See Deutsch and Albert for quantum introspection in Everett and Bohm  
respectively.

Bruno

http://iridia.ulb.ac.be/~marchal/







Re: Dreaming On

2009-09-10 Thread Brent Meeker

David Nyman wrote:
 2009/9/10 Brent Meeker meeke...@dslextreme.com:

   
 Yes, I agree.  But if we're after a physical theory, we also want to
 be able to give in either case a clear physical account of their
 apprehensiveness, which would include a physical justification of why
 the fine-grained differences make no difference at the level of
 experience.
   
 Consider what a clear physical account of apprehensiveness might be:
 There's an increased level of brain activity similar to that caused by
 a strange sound when alone in the dark, a slight rise in adrenaline, a
 tensing of muscles that would be used to flee; brain patterns formed
 as memories while watching slasher movies become more excited.
 Fine-grained differences below these levels, as might differ in
 others, are irrelevant to the experience.  For comparison, consider a
 Mars rover experiencing apprehension: sensor signals indicate lack of
 traction, which implies likely inability to reach its next sampling
 point.  Extra battery power is put on line, and various changes in
 paths and backtracking are calculated.  Mission control is apprised.
 The soil appearance related to poor traction is entered into a
 database with a warning note.
 Notice how the meaning, the content, of 'apprehension' comes from the
 context of action and purpose and interaction with an external world.
 We summarize these things with the single word 'apprehension', which we
 then take to describe a strictly internal state.  But that is because
 we have abstracted away the circumstances that give it meaning.  There
 are different circumstances that would give the same heightened states.
 

 Whilst I am of course in sympathy with the larger import of what you're
 saying, Brent, I'm not sure how it's relevant to the intentionally
 more restricted focus of the current discussion.  It is by definition
 true that fine-grained differences below these levels, as might
 differ in others, are irrelevant to the experience.  My point still
 is that a complete physical theory of consciousness would be capable
 of explicating - both in general physical principles and in detail -
 the relation between coarse and fine-grained physical accounts of an
 experiential state, whatever the wider context in which it might be
 embedded.  Or IOW, of explaining what physical principles and
 processes are responsible for the fineness of fine graining and the
 coarseness of coarse graining.  CTM doesn't appear to offer any
 physically explicit route to this goal.

 David
But isn't that because the "computational" in CTM is abstracted away 
from a context in which there is action and purpose?  It's the same 
problem that leads to the question, "Does a rock compute every 
function?"  When looking at a physical process as a computation one has 
to ask, "Computing what?", and the answer is in terms of some 
interaction with the rest of the world in which the computation is 
embedded; e.g. the answer will mean something to the programmer who 
started it, and it means something to him because he's a human animal 
that evolved to have goals and values and can take actions.  The level 
of experience, the fineness or coarseness of the physical process, is 
determined by the level at which there are actions.

Brent





books on logic/computing

2009-09-10 Thread ronaldheld

I thought that I would start a thread to consolidate some of the books
useful in following current and old threads. If people also want to
post key papers here, I do not see a problem with that.



Re: Dreaming On

2009-09-10 Thread David Nyman

2009/9/10 Brent Meeker meeke...@dslextreme.com:

 But isn't that because the "computational" in CTM is abstracted away
 from a context in which there is action and purpose?  It's the same
 problem that leads to the question, "Does a rock compute every
 function?"  When looking at a physical process as a computation one has
 to ask, "Computing what?", and the answer is in terms of some
 interaction with the rest of the world in which the computation is
 embedded; e.g. the answer will mean something to the programmer who
 started it, and it means something to him because he's a human animal
 that evolved to have goals and values and can take actions.  The level
 of experience, the fineness or coarseness of the physical process, is
 determined by the level at which there are actions.

Yes, I agree with your analysis completely when evaluating any
externally observed situation.  The trouble is that I think if this
approach is followed with mentality then the experiential aspect just
gets lost in the processual account.  For example, your saying "the
level of experience, the fineness or coarseness of the physical
process, is determined by the level at which there are actions"
immediately focuses attention at the interface with the environment,
where inputs and outputs can be equivalent for many internally
heterogeneous processes.  This makes perfect sense in the evaluation of a
person's, a computer's, or a rock's computational status, if any,
because this becomes relevant only at the point where something
emerges from the interior to engage with the environment.

It's a big leap from that to showing how heterogeneous physical
processes are internally experientially equivalent *for clearly
explicable physical reasons*.  The reason for my emphasis of
*physical* is that my problem with CTM, at least in this discussion,
is not that it is computational, but that it isn't a physical theory
in any standard sense, since it can't justify the attachment of
experience to any particular events for other than *functional*
reasons.

Re-reading the foregoing reminds me of my basic problem with any
purely third person approach to mentality, whether physical or
functional. Considered from the third person perspective, 'mental'
processes have no need to be experientially homogeneous because
everything functionally relevant is assumed to be exhausted in the
processual account, and hence experience could be nothing but
epiphenomenal to this.  So what difference could it make?  But that is
another discussion.

David


 David Nyman wrote:
 2009/9/10 Brent Meeker meeke...@dslextreme.com:


 Yes, I agree.  But if we're after a physical theory, we also want to
 be able to give in either case a clear physical account of their
 apprehensiveness, which would include a physical justification of why
 the fine-grained differences make no difference at the level of
 experience.

  Consider what a clear physical account of apprehensiveness might be:
  There's an increased level of brain activity similar to that caused by
  a strange sound when alone in the dark, a slight rise in adrenaline, a
  tensing of muscles that would be used to flee; brain patterns formed
  as memories while watching slasher movies become more excited.
  Fine-grained differences below these levels, as might differ in
  others, are irrelevant to the experience.  For comparison, consider a
  Mars rover experiencing apprehension: sensor signals indicate lack of
  traction, which implies likely inability to reach its next sampling
  point.  Extra battery power is put on line, and various changes in
  paths and backtracking are calculated.  Mission control is apprised.
  The soil appearance related to poor traction is entered into a
  database with a warning note.
  Notice how the meaning, the content, of 'apprehension' comes from the
  context of action and purpose and interaction with an external world.
  We summarize these things with the single word 'apprehension', which we
  then take to describe a strictly internal state.  But that is because
  we have abstracted away the circumstances that give it meaning.  There
  are different circumstances that would give the same heightened states.


  Whilst I am of course in sympathy with the larger import of what you're
 saying, Brent, I'm not sure how it's relevant to the intentionally
 more restricted focus of the current discussion.  It is by definition
 true that fine-grained differences below these levels, as might
 differ in others, are irrelevant to the experience.  My point still
 is that a complete physical theory of consciousness would be capable
 of explicating - both in general physical principles and in detail -
 the relation between coarse and fine-grained physical accounts of an
 experiential state, whatever the wider context in which it might be
 embedded.  Or IOW, of explaining what physical principles and
 processes are responsible for the fineness of fine graining and the
 coarseness of coarse graining.  CTM doesn't appear to offer any