Re: Arguments against uploading

2013-05-02 Thread Craig Weinberg
Nice. It could be heavier on support on the points, but not bad for a 
superficial pop-sci treatment.

My comments:

It’s a mistake to think of this debate in terms of having insufficient 
understanding or technology to simulate consciousness. The point is that we 
already have sufficient understanding of the problem to suspect that in 
fact, the entire assumption that private experience can be assembled by 
public bodies is false. I see this not as a point of religious sentiment, 
but of physical ontology. To presume that we could ever make a program, for 
instance, which projects an image that we can see without any physical 
projection technology would be an error. No amount of logic can turn a 
simulation of water into actual water that we can drink. To quote 
Korzybski, “The map is not the territory,” or Magritte, “Ceci n’est pas 
une pipe” (“This is not a pipe”).

It seems that we have become so enamored with computation that we have lost 
this sense of discernment between figures which we use to represent and the 
genuine presentations which are experienced first hand. Figures and symbols 
are only valid within a particular mode of interpretation. What is stored 
in a computer has no aesthetic content. If you tell the computer the data 
is a picture, it will barf out onto the screen whatever noise corresponds 
to that picture. If you tell the computer to use the sound card instead, 
then it will dump the noise as acoustic vibration. The computer doesn’t 
care, either way, data is just data. It is a-signifying and generic - the 
exact opposite of conscious experience which derives its significance from 
proprietary experience through time rather than mechanical function or 
forms. Consciousness is neither form nor function, it is the participatory 
aesthetic appreciation of form and function, and I am willing to bet that 
it is actually the fundamental principle of the cosmos, upon which all 
forms and functions, all matter and energy depend.
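
As a concrete illustration of the "data is just data" point, here is a 
minimal sketch (my own toy example, not anything from the article): the 
same bytes written out once as a grayscale image and once as audio, with 
nothing in the data itself deciding which interpretation is "right". The 
filenames and sizes are arbitrary.

    import os
    import wave

    data = os.urandom(64 * 64)  # 4096 arbitrary bytes

    # Interpretation 1: a 64x64, 8-bit grayscale image (binary PGM header + raw bytes).
    with open("noise.pgm", "wb") as f:
        f.write(b"P5\n64 64\n255\n" + data)

    # Interpretation 2: one second of 8-bit mono audio at 4096 Hz (WAV container).
    with wave.open("noise.wav", "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(1)
        w.setframerate(4096)
        w.writeframes(data)

    # Same bytes either way; the "picture" or "sound" lives in the
    # interpretation, not in the data.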

As far as embodiment goes, the issue should be refocused so that human 
consciousness in particular is understood as a special case within the 
universal phenomenon of sensory-motor participation, which goes all the way 
down to the bottom. It’s not that mind needs a body, it’s that private 
awareness correlates to specific public presentations. While these public 
presentations can be imitated and substituted to the extent that the 
insensitivity of the perceiver permits, there is no way, from an absolute 
perspective, to completely replace any experience with anything other than 
that particular experience. Unlike figures and symbols, 
experiences are rooted in the firmament of eternity. They make a certain 
kind of sense from every angle which is transparent - experiences allow us 
to triangulate meaning through them, and to elide or bridge gaps with leaps 
of understanding. (“A-ha!”).

Experiences can misrepresent each other on different levels; conflicting 
expectations can produce ‘illusions’, but these all ultimately have the 
potential to be revealed through the fullness of time. Simulated reality 
offers no such universal grounding, and promises true prisons which are 
isolated from any possibility of escape. That could happen in theory as a 
consequence of Strong AI, but it won’t in reality, because Strong AI will, 
I think, evaporate in a cloud of hype eventually, and I think that this 
very conversation is a clue that it is happening already. This is not a bad 
thing, not a cause for mourning and disappointment, but an exciting time 
when we can set aside our toy model of physics which disqualifies its model 
maker for long enough to form a new, fully integrated model of the universe 
which sees perception not as a metaphysical ‘emergent property’ but as the 
private view of physics itself. Physics is perception and participation, 
i.e. consciousness.


On Wednesday, May 1, 2013 9:41:36 PM UTC-4, Stephen Paul King wrote:


 http://io9.com/you-ll-probably-never-upload-your-mind-into-a-computer-474941498
  


 -- 
 Onward! 

 Stephen 

 I apologize in advance for the gross errors that this post 
 and all of my posts will contain. ;-) 





Re: Arguments against uploading

2013-05-02 Thread Jason Resch
The arguments are not so much arguments, but a collection of dubious
assumptions.

His first argument is that the brain is not computable, which requires
assuming the brain does not operate according to known physics, as all
known physics is computable.

The second and third objections are that we need to understand
consciousness and solve the hard problem before we can replicate a brain.
I don't see how this follows.  Ted Berger offers a convincing argument
against needing a theory of mind to do his work (which is creating neural
prostheses): "I don't need a grand theory of the mind to fix what is
essentially a signal-processing problem. A repairman doesn't need to
understand music to fix your broken CD player."
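
To make the signal-processing analogy concrete, here is a toy sketch (my
own illustration, nothing to do with Berger's actual prostheses): a
dropout in a sampled signal repaired by interpolation, without the code
knowing or caring whether the samples encode music, speech, or anything
else.

    def repair_dropout(samples, bad_start, bad_end):
        """Linearly interpolate across lost samples in [bad_start, bad_end)."""
        fixed = list(samples)
        left, right = fixed[bad_start - 1], fixed[bad_end]
        span = bad_end - bad_start
        for k in range(span):
            fixed[bad_start + k] = left + (right - left) * (k + 1) / (span + 1)
        return fixed

    signal = [0.0, 0.5, 1.0, None, None, None, 2.5, 3.0]  # three lost samples
    print(repair_dropout(signal, 3, 6))  # -> [..., 1.375, 1.75, 2.125, ...]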

The fourth argument is that special materials are needed for consciousness.
 Where is the evidence?

The fifth argument is that a non-physical soul is required.  If a
gelatinous blob of cells can have a soul, why can't any other machine?

The sixth, that it would be unethical, is surprising.  Is it unethical to
give people artificial hearts, or limbs?  Why would it be unethical to give
them prosthetic brain regions or entire brains?

The seventh again requires belief in some kind of non-physical soul that
can't be duplicated and is necessary for identity.

The eighth: well, who wouldn't take the risk of hacking over the certainty
of biological death?

Despite the large number of arguments, I find none of them convincing.

Jason



Re: Arguments against uploading

2013-05-02 Thread Craig Weinberg


On Thursday, May 2, 2013 3:08:17 PM UTC-4, Jason wrote:

 The arguments are not so much arguments, but a collection of dubious 
 assumptions.

 His first argument is that the brain is not computable, which requires 
 assuming the brain does not operate according to known physics, as all 
 known physics is computable.


All known physics is computable because it is based on the public 
interaction of material bodies. Consciousness is not isomorphic to those 
kinds of interactions. Our thoughts and emotions are known physics as much 
as the measurements of objects are, but we are not used to thinking of them 
that way. Whether or not the brain is computable doesn't matter, I think, 
because brain activity is only a representation of one aspect of 
experience, which is not going to be very useful taken out of the context 
of the total history of experience. The brain is a flatland footprint of 
experience. We might compute the contours of the sole of a shoe, but that 
doesn't tell us about the person wearing it.
 


 The second and third objections are that we need to understand 
 consciousness and solve the hard problem before we can replicate a brain. 
  I don't see how this follows.  Ted Berger offers a convincing argument 
 against the necessity of needing a theory of mind to do his work (which is 
 creating neural prosthesis): I don't need a grand theory of the mind to 
 fix what is essentially a signal-processing problem.  A repairman doesn't 
 need to understand music to fix your broken CD player.


We don't need to understand the hard problem if we can replicate a brain, 
but understanding the hard problem tells us why replicating a brain doesn't 
mean that there is any subjective experience associated with its function.
 


 The fourth argument is that special materials are needed for 
 consciousness.  Where is the evidence?


Well, there is the complete lack of any inorganic consciousness in the 
universe as far as we know. That isn't evidence, but it might be a clue. 
Materials matter to our body quite a bit.
 


 The fifth argument is that a non-physical soul is required.  If a 
 gelatinous blob of cells can have a soul, why can't any other machine?


Because the blob of cells was once a single cell which divided itself 
because it had the power to do so. Perhaps a synthetic biology would work 
similarly, but the approach right now to machines is to assemble them out 
of dumb parts. There may be an important difference between an organism and 
an organization.
 


 The sixth, that it would be unethical is surprising.  Is it unethical to 
 give people artificial hearts, or limbs?  Why will it be unethical to give 
 them prosthetic brain regions or entire brains?


If you had to develop artificial hearts by making legions of mutant 
children who had to live their lives in misery, then there would be an 
ethical issue. That would be the case if computation alone could indeed 
become conscious. Any program loop left running might be conjuring 
inconceivable agony for some machine-person in the Platonic aethers.


 The seventh, again requires belief in some kind of non-physical soul that 
 can't be duplicated and is necessary for identity.


Your position requires denial of any significant difference between 
conscious intent and unconscious reflex.
 


 The eight, well who wouldn't take the risk of hacking over the certainty 
 of biological death?


Yeah, the risk of hacking is a red herring. We are already being hacked by 
commercial interests.

 


 Despite the large number of arguments, I find none of them convincing.


I wouldn't either from that article alone, but they are OK as a short list 
from which to begin investigating the deeper issues.

Craig
 



Re: Arguments against uploading

2013-05-02 Thread John Clark
On Wed, May 1, 2013 at 9:41 PM, Stephen P. King stephe...@charter.net wrote:

http://io9.com/you-ll-probably-never-upload-your-mind-into-a-computer-474941498


 1) Brain functions are not computable because most of its important
 features are the result of unpredictable, nonlinear interactions among
 billions of cells.


Well, 10^11 neurons in the brain is a big number and 10^15 synapses in that
brain is an even bigger number, but it is nowhere near infinite, and every
one of those neurons and every one of the roughly 10^4 synapses on each
neuron operates according to the laws of physics, therefore it is
computable. It's true that random behavior is not computable, but hardware
electronic random number generators cost about $2 if you think having one
is important.
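
Here is a toy sketch of the kind of computation involved (my own
illustration; the parameters are arbitrary and not a claim about real
neurons): a single leaky integrate-and-fire neuron stepped forward
numerically, with a random input current standing in for the "random
behavior" point.

    import random

    v, v_rest, v_thresh, v_reset = -65.0, -65.0, -50.0, -70.0  # millivolts
    tau, dt = 10.0, 0.1                                        # milliseconds
    spike_times = []

    for step in range(10000):                       # 1 second of simulated time
        i_inject = 20.0 + random.gauss(0.0, 5.0)    # noisy input current
        v += dt * ((v_rest - v) + i_inject) / tau   # leaky integration step
        if v >= v_thresh:                           # threshold crossed: spike and reset
            spike_times.append(step * dt)
            v = v_reset

    print(len(spike_times), "spikes in 1 s of simulated time")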

 2) we may never be able to explain how and why we have qualia


Even if that is true it would be irrelevant if you're reverse-engineering a
brain; if an upload works you don't need to understand why it works.

 3) we still need to figure out how our brains segregate elements in
 complex patterns, a process that allows us to distinguish them as discrete
 objects.


Computers can perform object recognition. I admit that today's computers
are slow at it, but they are rapidly getting better, and it is certainly no
showstopper.
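
For what it's worth, off-the-shelf object recognition today is a few lines
of code. A sketch, assuming PyTorch and torchvision are installed (the
weights API shown is the torchvision 0.13+ style, and the filename is made
up):

    import torch
    from torchvision.models import resnet18, ResNet18_Weights
    from PIL import Image

    weights = ResNet18_Weights.DEFAULT
    model = resnet18(weights=weights).eval()   # pretrained ImageNet classifier
    preprocess = weights.transforms()          # matching preprocessing pipeline

    img = Image.open("photo.jpg")              # any photo you have on hand
    batch = preprocess(img).unsqueeze(0)       # preprocess and add batch dimension

    with torch.no_grad():
        probs = model(batch).softmax(dim=1)[0]

    top = probs.argmax().item()
    print(weights.meta["categories"][top], float(probs[top]))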


 4) Mind-body dualism is true, consciousness lies somewhere outside the
 brain, perhaps as some ethereal soul or spirit.


We know for a fact that if we change the brain, consciousness changes, and
we know for a fact that if our consciousness changes, so does our brain,
and that certainly doesn't sound like dualism to me. And so the ethereal
soul joins the luminiferous aether and phlogiston as obsolete scientific
terms, although this point is sure to be a hit with Jesus freaks and snake
handlers.

 5) It would be unethical to develop


I see nothing unethical about it, but it would be irrelevant even if it
were. This was supposed to be a list of reasons why uploading couldn't
happen, not why it shouldn't.

 6) We can never be sure it works [...] the continuity of consciousness
 problem


We can't be sure about anything. I think. And there is no continuity
problem: the external world might jump ahead, but to itself consciousness
is always continuous.

7) Uploaded minds would be vulnerable to hacking and abuse


And non-uploaded biological brains are vulnerable to bacteria, viruses, and
physical abuse; and at least with uploads you can always keep an up-to-date
backup stashed away in a safe place far away.

In short, these pathetic reasons would not convince one single member of
the species Homo sapiens that uploading was impossible unless they already
very much wanted to be convinced.

  John K Clark





Arguments against uploading

2013-05-01 Thread Stephen P. King
http://io9.com/you-ll-probably-never-upload-your-mind-into-a-computer-474941498


-- 
Onward!

Stephen

I apologize in advance for the gross errors that this post
and all of my posts will contain. ;-)

