>If it is not present physically, then what causes a person to say "I
>am imagining a blue chair"?

A sensorimotive circuit: a sensory feeling which is a desire to
fulfill itself through the motive impulse to communicate that
statement.

>Could you please define this term?  I looked it up but the
>definitions  I found did not seem to fit.

Nerves are also referred to as afferent and efferent. My idea is that
all nerve functionality is sense (input) and motive (output). I would
say 'motor', but that's confusing, because something like changing
your mind or making a choice is motive without being physically
expressed as motor activity, though I think they are the same thing. I
am generalizing what nerves do down to the level of physics, so that
our nerves are doing the same thing that all matter is doing, just
hypertrophied to host more meta-elaborated sensorimotive phenomena.

>There is such a thing as too low a level.  What leads you to believe
>the neuron is the appropriate level to find qualia, rather than the
>states of neuron groups or the whole brain?

I didn't say it was. I was just saying that the more closely you can
imitate a human neuron, the closer a brain built on that imitation
will come to having the potential for human consciousness.

>You would have to show that the presence of DNA in part determines the
>evolution of the brain's neural network.  If not, it is as relevant to
>you and your mind as the neutrinos passing through you.

Chromosome mutations cause mutations in the brain's neural network, do
they not? By the way, I don't interpret neutrinos, photons, or other
massless, chargeless phenomena as literal particles. QM is a
misinterpretation. Accurate, but misinterpreted.

>> A digital simulation is just a pattern in an abacus.

>The state of an abacus is just a number, not a process.  I think you
>may not have a full understanding of the differences between a Turing
>machine and a string of bits.  A Turing machine can mimic any process
>that is definable and does not take an infinite number of steps.
>Turing machines are dynamic, self-directed entities.  This
>distinguishes them from cartoons, YouTube videos and the state of an
>abacus.

A pattern is not necessarily static, and especially not an abacus,
whose whole purpose is that its positions can be changed to represent
any number. Just like a cartoon. If you are defining Turing machines
as self-directed entities, then you have already defined them as
conscious, so it's a fallacy to present that as an open question.
Since I think that a machine cannot have a self, but is instead the
self's perception of the self's opposite, I'm not compelled by any
argument which imagines that purely quantitative phenomena (if there
were such things) can be made to feel.
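
To make the comparison concrete, here is a minimal sketch (hypothetical
Python, not anything from this thread) of what a Turing machine amounts
to when you actually build one: a lookup table of rules plus a loop
that slides symbols around on a tape. Whether that counts as
'self-directed' or is just a more elaborate recipe for moving abacus
beads is exactly the point in dispute.

    # Minimal, illustrative Turing-machine step loop (hypothetical sketch).
    # The tape alone is a static pattern, like the beads of an abacus; the
    # rule table plus this loop is the 'process' being argued about.
    def run_turing_machine(rules, tape, state="start", blank="0", max_steps=1000):
        """rules maps (state, symbol) -> (new_state, write_symbol, move),
        where move is -1 (left) or +1 (right). tape is a dict position -> symbol."""
        head = 0
        for _ in range(max_steps):
            symbol = tape.get(head, blank)
            if (state, symbol) not in rules:   # no applicable rule: halt
                break
            state, write, move = rules[(state, symbol)]
            tape[head] = write
            head += move
        return tape, state

    # Example machine: increment a binary number, least significant bit first.
    rules = {
        ("start", "1"): ("start", "0", +1),    # carry the 1 rightward
        ("start", "0"): ("done",  "1", +1),    # write the carried 1 and stop
    }
    tape = {0: "1", 1: "1", 2: "0"}            # encodes 3 (LSB first)
    print(run_turing_machine(rules, tape))     # tape now encodes 4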

>Then, if you deny the logical possibility of zombies, or fading
>qualia, you must accept such an emulation of a human mind would be
>equally conscious.

These ideas are not applicable in my model of consciousness and its
relation to neurology.

>The idea behind a computer simulation of a mind is not to make
>something that looks like a brain but to make something that behaves
>and works like a brain.

I think that for it to work exactly like a brain it has to be a brain.
If you want something that behaves like an intelligent automaton, then
you can use a machine made of inorganic matter. If you want something
that feels and behaves like a living organism, then you have to create
a living organism out of matter that can self-replicate and die.

>Rejection requires the body knowing there is a difference, which is
>against the starting assumption.

If you are already defining something as biologically identical, then
you are effectively asking, 'If something non-biological were
biological, would it perform biological functions?'

>I pasted real life counter examples to this.  Artificial cochlea and
>retinas.

Those are not replacements for neurons, they are prostheses for a
nervous system. Big difference. I can replace a car engine with
horses, but I can't replace a horse's brain with a car engine.

>At what point does the replacement magically stop working?

At what point does cancer magically stop you from waking up?

>So it can use an artificial retina but not an artificial neuron?

A neuron can use an artificial neuron but a person can't use an
artificial neuron except through a living neuron.

Craig

On Jul 13, 9:16 pm, Jason Resch <jasonre...@gmail.com> wrote:
> On Jul 13, 2011, at 7:04 PM, Craig Weinberg <whatsons...@gmail.com>  
> wrote:
>
> >> Again, all that matters is that the *outputs* that influence other  
> >> neurons are just like those of a real neuron, any *internal*  
> >> processes in the substitute are just supposed to be >artificial  
> >> simulations of what goes on in a real neuron, so there might be  
> >> simulated genes (in a simulation running on something like a  
> >> silicon chip or other future computing >technology) but there'd be  
> >> no need for actual DNA molecules inside the substitute.
>
> > The assumption is that there is a meaningful difference between the
> > processes physically within the cell and those that are input and
> > output between the cells. That is not my view. Just as the glowing
> > blue chair you are imagining now (is it a recliner? A futuristic
> > cartoon?) is not physically present in any neuron or group of neurons
> > in your skull -
>
> If it is not present physically, then what causes a person to say "I  
> am imagining a blue chair"?
>
> > under any imaging system or magnification. My idea of
> > 'interior' is different from the physical inside of the cell body of a
> > neuron. It is the interior topology. It's not even a place, it's just
> > a sensorimotive
>
> Could you please define this term?  I looked it up but the  
> definitions  I found did not seem to fit.
>
> > awareness of itself and its surroundings - hanging on
> > to its neighbors, reaching out to connect, expanding and contracting
> > with the mood of the collective. This is what consciousness is. This
> > is who we are. The closer you get to the exact nature of the neuron,
> > the closer you get to human consciousness.
>
> There is such a thing as too low a level.  What leads you to believe  
> the neuron is the appropriate level to find qualia, rather than the  
> states of neuron groups or the whole brain?  Taking the opposite  
> direction, why not say it must be explained in terms of chemistry or
> quarks?  What led you to conclude it is the neurons?  After all, are
> rat neurons very different from human neurons?  Do rats have the same
> range of qualia as we?
>
> > If you insist upon using
> > inorganic materials, that really limits the degree to which the
> > feelings it can host will be similar.
>
> Assuming qualia supervene on the individual cells or their chemistry.
>
> > Why wouldn't you need DNA to
> > feel like something based on DNA in practically every one of its
> > cells?
>
> You would have to show that the presence of DNA in part determines the  
> evolution of the brain's neural network.  If not, it is as relevant to
> you and your mind as the neutrinos passing through you.
>
> >> The idea is just that *some* sufficiently detailed digital  
> >> simulation would behave just like real neurons and a real brain,  
> >> and "functionalism" as a philosophical view says that this  
> >> >simulation would have the same mental properties (such as qualia,  
> >> if the functionalist thinks of "qualia" as something more than just  
> >> a name for a certain type of physical process) >as the original brain
>
> > A digital simulation is just a pattern in an abacus.
>
> The state of an abacus is just a number, not a process.  I think you
> may not have a full understanding of the differences between a Turing
> machine and a string of bits.  A Turing machine can mimic any process
> that is definable and does not take an infinite number of steps.
> Turing machines are dynamic, self-directed entities.  This
> distinguishes them from cartoons, YouTube videos and the state of an
> abacus.
>
> Since they have such a universal capability to mimic processes, then
> the idea that the brain is a process leads naturally to the idea of  
> intelligent computers which could function identically to organic  
> brains.
>
> Then, if you deny the logical possibility of zombies, or fading
> qualia, you must accept such an emulation of a human mind would be  
> equally conscious.
>
> > If you've got a
> > gigantic abacus and a helicopter, you can make something that looks
> > like whatever you want it to look like from a distance, but it's still
> > just an abacus. It has no subjectivity beyond the physical materials
> > that make up the beads.
>
> The idea behind a computer simulation of a mind is not to make  
> something that looks like a brain but to make something that behaves  
> and works like a brain.
>
> >> Everything internal to the boundary of the neuron is simulated,  
> >> possibly using materials that have no resemblance to biological ones.
>
> > It's a dynamic system,
>
> So is a Turing machine.
>
> > there is no boundary like that. The
> > neurotransmitters are produced by and received within the neurons
> > themselves. If something produces and metabolizes biological
> > molecules, then it is functioning at a biochemical level and not at
> > the level of a digital electronic simulation. If you have a heat sink
> > for your device it's electromotive. If you have an insulin pump it's
> > biological, if you have a serotonin reuptake receptor, it's
> > neurological.
>
> >> So if you replace the inside of one volume with a very different  
> >> system that nevertheless emits the same pattern of particles at the  
> >> boundary of the volume, systems in other >adjacent volumes "don't  
> >> know the difference" and their behavior is unaffected.
>
> > No, I don't think that's how living things work. Remember that people's
> > bodies often reject living tissue transplanted from other human
> > beings.
>
> Rejection requires the body knowing there is a difference, which is  
> against the starting assumption.
>
> >> You didn't address my question about whether you agree or disagree  
> >> with physical reductionism in my last post, can you please do that  
> >> in your next response to me?
>
> > I agree with physical reductionism as far as the physical side of
> > things is concerned. Qualia is the opposite that would be subject to
> > experiential irreductionism. Which is why you can print Shakespeare on
> > a poster or a fortune cookie and it's still Shakespeare, but you can't
> > make enriched uranium out of corned beef or a human brain out of table
> > salt.
>
> >> Because I'm just talking about the behavioral aspects of  
> >> consciousness now, since it's not clear if you actually accept or  
> >> reject the premise that it would be possible to replace >neurons  
> >> with functional equivalents that would leave *behavior* unaffected
>
> > I'm rejecting the premise that there is a such thing as a functional
> > replacement for a neuron that is sufficiently different from a neuron
> > that it would matter.
>
> I pasted real life counter examples to this.  Artificial cochlea and  
> retinas.
>
> > You can make a prosthetic appliance which your
> > nervous system will make do with, but it can't replace the nervous
> > system altogether.
>
> At what point does the replacement magically stop working?
>
> > The nervous system predicts and guesses. It can
> > route around damage or utilize a device which it can understand how to
> > use.
>
> So it can use an artificial retina but not an artificial neuron?
>
> >> first I want to focus on this issue of whether you accept that in  
> >> principle it would be possible to replace neurons with "functional  
> >> equivalents" which emit the same signals to other >neurons but have  
> >> a totally different internal structure, and whether you accept that  
> >> this would leave behavior unchanged, both for nearby neurons and  
> >> the muscle movements of the >body as a whole.
>
> > This is tautological. You are making a nonsense distinction between
> > its 'internal' structure and what it does. If the internal structure
> > is equivalent enough, then it will be functionally equivalent to other
> > neurons and the organism at large. If it's not, then it won't be.
> > Interior mechanics that produce organic molecules and absorb them
> > through a semipermeable membrane are biological cells. If you can make
> > something that does that out of something other than nucleic acids,
> > then cool, but why bother? Just build the cell you want
> > nanotechnologically.
>
> >> Again, not talking about consciousness at the moment, just  
> >> behaviors that we associate with consciousness. That's why, in  
> >> answer to your question about synthetic water, I >imagined a robot  
> >> whose limb movements depend on the motions of water in an internal  
> >> tank, and pointed out that if you replaced the tank with a  
> >> sufficiently good simulation, the >external limb movements of the  
> >> robot shouldn't be any different.
>
> > If you are interested in the behaviors of consciousness only, all you
> > have to do is watch a YouTube video and you will see a simulated
> > consciousness behaving. Can you produce something that acts like it's
> > conscious? Of course.
>
> >> My point was that if you agree that the basic notion of "Darwinian  
> >> evolution" is purely a matter of organization and not the details  
> >> of what a system is made of (Do you in fact agree >with that?  
> >> Regardless of whether it might be *easier* to implement Darwinian  
> >> evolution in an organic system, hopefully you wouldn't say it's in-
> >> principle impossible to implement >self-replication with heredity  
> >> and mutation in a non-organic system?), then it's clear that in  
> >> general it cannot be true that "Feature X which we see in organic  
> >> systems is purely a >matter of organization" implies "We should  
> >> expect to see natural examples of Feature X in non-organic systems  
> >> as well".
>
> > It's a false equivalence. Darwinian evolution is a relational
> > abstraction and consciousness or life is a concrete experience. The
> > fact that we can call anything which follows a statistical pattern of
> > iterative selection 'Darwinian evolution' just means that it is a
> > basic relation of self-replicating elements in a dynamic mechanical
> > system. That living matter and consciousness only appear out of a
> > particular recipe of organic molecules doesn't mean that there can't
> > be another recipe; however, it does tend to support the observation
> > that life and consciousness are made out of some things and not others,
> > and
>
> ...
