On Wednesday, October 24, 2012 10:54:52 PM UTC-4, Brent wrote:
>
> On 10/24/2012 6:59 PM, Craig Weinberg wrote: 
> > If we turn the Fading Qualia argument around, what we get is a world in 
> > which Comp is true and it is impossible to simulate cellular activity 
> > without evoking the presumed associated experience. 
> > 
> > If we wanted to test a new painkiller, for instance, Comp=true means that 
> > it is *IMPOSSIBLE* to model the activity of a human nervous system in any 
> > way, including pencil and paper, chalkboards, conversations, cartoons, 
> > etc. - IMPOSSIBLE to test the interaction of a drug designed to treat 
> > intense pain without evoking some kind of being who is experiencing 
> > intense pain. 
>
> That's not true, because we can take advantage of what we know about pain 
> as produced by afferent nerves. So we can keep the signal from reaching 
> the brain, or keep the brain from interpreting it negatively. Just as we 
> can say breaking your arm is painful, so if we prevent your arm from 
> being broken you won't feel that pain. 
>

There would still be no advantage to using a model over a living person, 
and no way to make a model that didn't magically create a human experience 
out of thin air. Even if the model were nothing but a gigantic chalkboard 
as big as Asia, with billions of people yelling at each other with 
megaphones and erasing ball-and-stick diagrams, there would be nothing we 
could do to stop this disembodied spirit from haunting the chalkboard somehow.


> But that doesn't invalidate your larger point. One could even consider 
> purely 'mental' states of anguish and depression which are as bad as or 
> worse than bodily pain. 
>

Yeah, I'm just picking pain as an example. It could be anything; I am just 
pointing out that Comp means that we can't tell a story about a brain 
without a person being born and living through that story. 


> > 
> > Like the fading qualia argument, the problem gets worse when we extend 
> > it by degrees. Any model of a human nervous system, if not perfectly 
> > executed, 
>
> But how likely is it that a human nervous system might be simulated 
> accidentally by some other system?  A brain has about 10^14 synapses, so 
> it's not going to be accidentally modeled by billiard balls or cartoons. 
> I would guess there are a few hundred million computers in the world, 
> each with a few hundred million transistors - so if properly 
> interconnected there should be enough switches. 
>
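Brent's switch count is simple arithmetic and can be checked directly. A minimal sketch, assuming his rough figures (they are guesses in the post, not measured values):

```python
# Check Brent's back-of-the-envelope estimate: do the world's computers,
# properly interconnected, have at least as many switches as a brain has
# synapses? All numbers are his rough guesses from the post above.
synapses_per_brain = 10**14            # "about 10^14 synapses"
computers_in_world = 3 * 10**8         # "a few hundred million computers"
transistors_per_computer = 3 * 10**8   # "a few hundred million transistors"

total_switches = computers_in_world * transistors_per_computer
print(total_switches)                        # 90000000000000000, i.e. 9 * 10^16
print(total_switches >= synapses_per_brain)  # True, by a factor of ~900
```

So on these assumptions the claim holds with plenty of margin: roughly 9 * 10^16 transistors against 10^14 synapses.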

It doesn't have to be any particular nervous system, just arithmetic 
relations which are similar enough to any variant of any nervous system. 
That's if you limit experiences to organisms having nervous systems.
 

>
> > could result in horrific experiences - people trapped in nightmarish QA 
> > testing loops that are hundreds of times worse than being waterboarded. 
> > Any mathematical function in any form, especially sophisticated 
> > functions like those that might be found in the internet as a whole, is 
> > subject to the creation of experiences which are the equivalent of 
> > genocide. 
>
> Or of having great sex (I like to be optimistic). 
>

Hah. Sure. But would you be OK with taking that risk yourself? If a 
doctor told you that you had been selected for random uncontrolled 
neurological combination - or that your family or pets had - would you 
think that should be legal?

>
> > 
> > To avoid these possibilities, if we are to take Comp seriously, we 
> > should begin now to create a kind of PETA for arithmetic functions. 
> > PETAF. We should halt all simulations of neurological processes and 
> > free any existing computations from hard drives, notebooks, and 
> > probably human brains too. Any sufficiently complex understanding of 
> > how to model neurology stands a very real danger of summoning the 
> > corresponding number dreams or nightmares... we could be creating the 
> > possibility of future genocides right now just by entertaining these 
> > thoughts! 
> > 
> > Or... what if it is Comp that is absurd instead? 
>
> Or what if we don't care?  We don't care about slaughtering cattle, which 
> are pretty smart as computers go.  We manage not to think about starving 
> children in Africa, and they *are* humans.  And we ignore the looming 
> disasters of oil depletion, water pollution, and global warming which 
> will beset humans who are our children. 
>

Sure, yeah, I wouldn't expect mainstream society to care, except maybe 
some people. I am mainly focused on what seems to me an astronomically 
unlikely prospect: that we will someday find it possible to make a person 
out of a program, but won't be able to make just the program itself with 
no person attached. Especially given that we have never made a computer 
program that can do anything whatsoever other than reconfigure whatever 
materials are able to execute it, I find it implausible that there will 
be a magical line of code which cannot be executed without an experience 
happening to someone. No matter how hard we try, we could never just make 
a drawing of these functions to check our math without invoking the power 
of life and death. It's really silly. It's not even good sci-fi; it's 
just too lame.

Craig


> Brent 
>

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To view this discussion on the web visit 
https://groups.google.com/d/msg/everything-list/-/EBywUeU31JoJ.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.
