On Mon, Oct 22, 2012  Craig Weinberg <whatsons...@gmail.com> wrote:

>> it's also true that the letter "e" is not Shakespeare's play "Hamlet",
>> but it's part of it.

> By that analogy, you are crediting the letter "e" for authoring Hamlet.

The letter "e" did not write Hamlet, and neither did one neuron inside
Shakespeare's bone box; it took 10^11 of them.

>>  You can have GABA and acetylcholine without paychecks and days off but
>> unless you move to electronics you can't have paychecks and days off
>> without GABA and acetylcholine.
> Even more reason that the dumb neurotransmitters need the high level
> teleology to inform them.

If you had one scrap of experimental evidence that wishing or wanting can
alter chemical reactions, in other words if you could show that magical
thinking worked, then I would concede that you are right and you will have
won the argument, but there is no such evidence because that's not the way
things work.

>You can have bricks without having the Taj Mahal, but you can't have the
> Taj Mahal without bricks.

Exactly, you can have neurotransmitters without paychecks and days off but
you can't have paychecks and days off without neurotransmitters.

> That doesn't mean that bricks or even bricklayers are responsible for the
> Taj Mahal.

You also need an architect to supply the information on where to put the
bricks, but it remains true that you can't have the Taj Mahal without
bricks and bricklayers.

> My neurons can influence my consciousness but they cannot decide for me
> to get a better job.

If you consciously decide to get a better job, and if neurons can influence
your consciousness as you say, then you're wrong: neurons CAN decide for you
to get a better job, because what they do IS you.

> Why would the high level description level be different from the low
> level description?

Now Craig, if you calm down and look at what you just said with a
dispassionate eye I think you will admit that it was not the brightest
question in the world. The short answer is that one is high and the other
is low. Saying "the temperature is 79 degrees" and saying "oxygen molecule
number 93475626636574074514574 hit your nose 1.0624221 seconds ago from a
south-south-east direction at a speed of 88.621 feet per second" are both
accurate descriptions of reality, but they are different.
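The point about levels of description can even be sketched numerically. The following is a toy model only (the molecule count and speed distribution are invented for illustration); it shows how one high-level fact summarizes an enormous number of low-level facts without conflicting with any of them:

```python
import random

# Toy model: each "molecule" has an individual speed (the low-level
# description). These numbers are made up for illustration only.
random.seed(42)
speeds = [random.gauss(500.0, 50.0) for _ in range(100000)]  # m/s

# Low-level description: 100,000 separate facts, one per molecule,
# e.g. "molecule number 93475 is moving at speeds[93475] m/s".

# High-level description: a single number standing in for "temperature",
# here just the mean speed of all the molecules.
mean_speed = sum(speeds) / len(speeds)

# Both descriptions are accurate; the high-level one simply discards
# detail that is irrelevant at that level.
print(f"{len(speeds)} low-level facts -> one high-level fact: "
      f"mean speed {mean_speed:.1f} m/s")
```

Nothing in the single summary number contradicts the per-molecule facts; it is the same reality described at a different level.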

> There is no unexpected gap between the behavior of those two levels of
> description. With subjectivity, the gap is infinite.

The gap is certainly astronomically huge but I'm surprised to hear you say
it was literally infinite because you're the one pushing the idea that
everything can sense its environment and everything is at least a little
bit conscious, so let's view the submicroscopic world with your eyes. I
personally find this sort of description awkward and needlessly
anthropomorphic; however, it is functionally equivalent to the most
impersonal, hard-nosed description of any physicist:

Two helium atoms are moving along until they sense they have made contact
and then they find that they don't like each other one bit so they both
decide to change the path they were moving in so they can get away from
each other.  And a chlorine and a sodium atom are moving along until they
sense they have made contact and then they find that they passionately love
each other so they decide to remain very close to each other and turn into
a salt molecule; however as the temperature gets higher their love gets
cooler until it gets so hot they decide to get a divorce and go their
separate ways. That's a very odd way to describe what's going on but it
doesn't conflict with any experimental test.

Generally speaking as the accumulation of matter increases, as more atoms
are involved, the range of possible behaviors gets larger and more complex,
and for some very specific types of structures, like brains or computers,
the growth is exponential and the range gets astronomically (but not
infinitely) larger.

> What you are saying suggests a subjectivity where every time I almost put
> my hand on a hot stove a specific memory is called up and decoded (decoded
> into what?). I don't think it works that way,

I don't think it works that way either, jerking your hand off a hot stove
is just a simple reflex, but you were talking about social experiences and
you have none of them except the ones where the information has been
encoded by neurons, in other words the ones you remember.

>>> Let's compare. Does your computer worry about its job?

>> I don't know for sure, I don't even know if you worry about your job
>> because all I can observe is behavior. I do have a theory that extrapolates
>> consciousness and emotion from behavior and I think it's a pretty good
>> theory but it's not proven and never will be, so I just do the best I can.

> If you had to bet though. If it really mattered and there was a right
> answer and you had to pick yes or no that your computer worries about its
> job, could you honestly say that it is likely?

If I had to guess, and it's only a guess, I'd say that the range of
behavior that computers display is still not as rich as the behavior
displayed by humans so you are probably more conscious than a computer, but
computers are improving so rapidly (and humans are not) that if you asked
me again tomorrow my guess might be entirely different.

John K Clark

You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.