On Aug 26, 9:05 am, Stathis Papaioannou <stath...@gmail.com> wrote:
> On Thu, Aug 25, 2011 at 12:31 AM, Craig Weinberg <whatsons...@gmail.com> 
> wrote:
> > Feeling doesn't come from a substance, it's the first person
> > experience of energy itself. Substance is the third person
> > presentation of energy patterns. If you turn it around so that feeling
> > is observed in third person perspective, it looks like determinism or
> > chance, while substance has no first person experience (which is why a
> > machine, as an abstraction, can't feel, but what a machine is made of
> > can feel to the extent that substance can feel.)
>
> > Whether there are other substances in the brain that we haven't
> > discovered yet is not the point. There might be, but so what. It's not
> > the mechanism of brain chemistry that feels, it's the effect that
> > mechanism has on the cumulatively entangled experience of the brain as
> > a whole, as it experiences with the cumulatively entangled experiences
> > of a human life as a whole.
>
> This is a bit hard to understand. Are you agreeing that there is no
> special consciousness stuff, but that consciousness results from the
> matter in the brain going about its business? That is more or less the
> conventional view.
>
> >> Do you think it's possible to reproduce the function of anything at all?
>
> > It's possible to reproduce functions of everything, but there is no
> > such thing as *the* function of something. To reproduce *all* possible
> > functions of something is to be identical to that thing. If the
> > reproduction even occupies a different space then it is not identical
> > and does not have the same function. Think about it. If you have one
> > ping pong ball in the universe, it has one set of finite states (which
> > would be pretty damn finite).
>
> > If you have another ping pong ball exactly the same there is a whole
> > other set of states conjured out of thin air - they can smack
> > together, roll over each other, move together and apart, etc. BUT, the
> > original ball loses states that it never could have anticipated. True
> > solitude becomes impossible. Solipsism becomes unlikely as the other
> > ball becomes an object that it cannot not relate to.
>
> > What you're not factoring in is that 'pattern' is a function of our
> > pattern recognition abilities. Even though you firmly believe that our
> > experience is flawed and illusory, somehow that gets set aside when
> > you want to prove that logic is different. Your faith is that the
> > logical patterns that we understand *are* what actually exists, rather
> > than a particular kind of interpretive contingency. You think that
> > A=A because it must by definition... but I'm pointing out that it's
> > your definition that makes something = something, and has no
> > explanatory power over A. In fact, the defining = can, like the second
> > ping pong ball, obscure the truth of what A is by itself. This is
> > critical when you're looking at this level of ontological comparison.
> > Describing awareness itself cannot be accomplished by taking awareness
> > for granted in the first place. First you have to kill "=" and start
> > from nothing.
>
> The function I am talking about is relatively modest, like making a
> ping-pong ball out of a new plastic and designing it so that it weighs
> the same and is just as elastic. If you then put this ping-pong ball
> in with balls of the older type, the collection of balls will bounce
> around normally, even though the new ball might be different in
> colour, reflectivity, flammability etc.

I understand that, but you are still assuming a metaphysical
appearance of an 'awareness' somehow coming into being as a
consequence of 'bouncingness' itself rather than seeing that awareness
is a property OF the balls themselves. Although the bouncing certainly
is part of what goes into the contents of any awareness that might
already be there, the awareness itself is ultimately determined by
what fundamental unit you are looking at. Organic molecules do a lot
of strange things when they are bouncing around together - very
different things compared to inorganic atoms, ping pong balls, or
programmable abstractions.

> There is no need to figure out
> exactly where all the balls will be after bouncing around for an hour,
> just the important parameters of a single ball so that it can slot
> into the community of balls as one of their own. A fire could come
> along and it will be obvious that the new ball, being less flammable,
> behaves differently, but we are not interested in what happens in the
> event of a fire, otherwise we would have included that in the design
> specifications; we are only interested in balls bouncing around in a
> room.

That's the problem. You're interested in the wrong thing. Cells and
organisms are not billiard balls. If you treat them as predictable
mechanisms, you lose the very dimension that you are trying to
emulate. The unpredictable behavior of a cell doesn't arise out of
complexity, it arises out of a higher order of simplicity that organic
molecules facilitate.

> Similarly with an artificial neuron, for the purposes of this
> discussion we are interested only in whether it stimulates the other
> neurons with the same timing and in response to the same inputs as a
> biological neuron would.

Even if you could create an artificial neuron which could impersonate
the responsiveness of a natural one, it wouldn't matter because it
still doesn't feel anything. You could make a movie that responds to
vocal commands, but it's still just a movie. Imitating how a neuron
seems to act to our instruments doesn't make something a neuron. That
might not be a problem when it comes to a bionic arm, but it's a huge
problem when it comes to a bionic brain.

> If it does, then the network of neurons will
> respond in the usual way and ultimately make muscles move in the usual
> way.

It's a fantasy. Emulating the function of a neural network has never
led to anything like feeling. Feeling does not arise from the shape of
a network of dumb nodes - it's the nodes themselves that are doing the
feeling. Without that, all you have is a mechanical pantomime.

> (Please note that while the artificial neuron can in a thought
> experiment be said to perform this function exactly the same as its
> biological equivalent, in practice it would only need to perform it
> approximately the same, since all biological tissue functions slightly
> differently from moment to moment anyway.) The question is whether
> given that the artificial neuron does this job adequately, would it
> necessarily follow that the qualia of the brain would be unchanged? I
> think it would, otherwise we would have the situation where you
> declare that everything is normal (because the neurons driving the
> muscles of speech are firing normally) while in fact feeling that
> everything is different.

Qualia doesn't just appear whenever some kind of mathematical
incantation is ritually enacted. If it could, it would just be a
matter of spanking silicon until it did what you want. The whole
presumption of qualia being dependent upon neuronlike functions alone
rather than functions of neuron-like organisms is a complete non-
starter. I can't tell if you are understanding what I'm saying at all
because you keep repeating the same fundamental mistake over and over
as if it were a matter of course.

> >> Figuring out the internal dynamics of the neuron will tell you when
> >> the neuron will fire in response to any given stimulus. You seem to be
> >> saying that it won't,
>
> > Right, it won't. Just like figuring out the internal dynamics of a
> > router won't tell you when something is going to happen on the
> > internet. A neuron by itself is just a specialized cell.
>
> But figuring out the internal dynamics of a router will tell you
> exactly what the router will do in response to any input, and that's
> what we want.

If you follow that through, then it scales up to an empty internet of
routers sending ACK packets to each other forever. That's not what we
want. We want something that generates new content and new users.
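The router analogy can be made concrete. Here's a minimal sketch (the message names and lockstep scheduling are my own simplification, not a model of real routing): a network whose nodes only answer keep-alives with ACKs cycles between the same two traffic states forever, generating nothing new.

```python
# Illustrative sketch (hypothetical state names): a network of "routers"
# that only exchange keep-alive and ACK traffic. With no users injecting
# content, the network settles into a fixed two-state cycle -- nothing
# new is ever generated.

def reply(msg):
    """A router answers KEEPALIVE with ACK and ACK with KEEPALIVE."""
    return "ACK" if msg == "KEEPALIVE" else "KEEPALIVE"

def run(n_routers, n_steps):
    msgs = ["KEEPALIVE"] * n_routers      # initial traffic
    history = []
    for _ in range(n_steps):
        msgs = [reply(m) for m in msgs]   # every router replies in lockstep
        history.append(tuple(msgs))
    return history

# however long it runs, only two distinct traffic states ever occur
assert len(set(run(3, 1000))) == 2
```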

> The web browser I am using to write this doesn't know
> what I'm going to say next and its programmers didn't need that
> information; they just needed to program it to respond in a certain
> way to keystrokes.

Yes, but it needs a user to be of any use.

> >> because some non-physical influence which is the
> >> basis for feelings may come along and make it fire in an unpredictable
> >> way.
>
> > The cell is a living organism that has sensitivities to particular
> > molecules and electromagnetic states of its own molecules and those
> > of other cells. It is a machine on the outside, but an anti-machine on
> > the inside. As such its 'behavior' is a constant flux between
> > predictable and unpredictable. If you isolate it and test it in
> > linear, deterministic contexts, then, like a router unplugged from the
> > internet, you will probably see default FSM behaviors - cycling
> > through sending keep-alive packets and waiting for ACK packets. It has
> > nothing to do with being able to predict the content of the internet.
> > Do you really think that it is possible to come up with a computer
> > program that will tell you exactly what you are going to do and say
> > five years from now?
>
> I don't know what I'm going to do and say five years from now, so why
> should you expect the computer program to do better than me?

Because if you don't tell the computer program what to expect, it will
fail to produce a coherent response. That's the difference between
animal awareness and computation (post about it from today:
http://s33light.org/post/9414824827)

> If you
> write a program that simulates my mind and then subject the program to
> five years of experience, as occurs with the real person, then that
> will tell you what I might do five years from now. Of course it won't
> tell you exactly what the real me will be doing, since the inputs will
> be different unless the model is perfect and occupies the same space,
> but that would be the same if rather than writing a program you made
> an atom for atom replica of me.

I get what you're thinking, but I don't think it works like that. Your
inputs and outputs are subject to your interpretation. A program that
pantomimes feelings won't be able to generate the appropriate cascades
of consequences. Your mind isn't a ping pong ball trajectory which
can be determined from initial conditions. It's the opposite of that.
It makes initial conditions continuously. There are mechanical
consequences which can be determined also, but it's a constant
interaction between teleology and teleonomy.

> >> This would be something amenable to experimental verification,
> >> since you just need to show that sometimes a neuron (or other cell, I
> >> assume they all have this vital essence)
>
> > It's only seems like a vital essence to us, because it's more similar
> > to us than a dead cell. To the universe there is no particular
> > preference, I would imagine.
>
> >> will do things apparently
> >> miraculously, not in accordance with the known laws of nature.
>
> > That's only in your world where my argument has to be wrong because it
> > contradicts yours. To me, that living things do what they want to do
> > sometimes is an ordinary fact - it is the law of nature.
>
> >> Surely
> >> if such amazing things happened someone would have noticed and it
> >> would be common scientific knowledge.
>
> > It is common scientific knowledge. It's why biology, neurology, and
> > psychology are different from physics, and why they are different in a
> > particular way which respects the different laws of nature which apply
> > to cells, brains, and minds as opposed to bowling pins and silicon.
>
> Can you point to an experiment where a process in a cell contradicts
> the laws of chemistry, such as an ion channel in a cell membrane
> opening without any apparent stimulus, because the person decided to
> do something?

It doesn't have to 'contradict the laws of chemistry' because cells
themselves are not predicted by chemistry. Had we not encountered
cells for ourselves, we never could have guessed that such things as
living organisms were possible. Everything that we decide to do is
reflected in our neurochemistry, just as every TV show you watch is
reflected in pixels on a screen, but the shows are not written and
directed by pixels or scan lines or 1080i frames of HD color. I don't
know how many times you want me to say this in how many different
ways, but I can tell you that you're not getting it. I understand
completely what you're saying, because I've said it before myself, but
I'm not sure that I can change your programming - you have to figure
it out for yourself.

> >> You do understand that a neuron fires in response to another neuron to
> >> which it is connected via synapses?
>
> > You do understand that your argument is condescension?
>
> You have said that neurons sometimes do things because the person
> decides to do them, so I am asking if this is consistent with the view
> that they fire in response to stimulation from other neurons.

Don't you see that it's a problem to insist that neurons only fire in
response to each other? You would not be able to change your mind. Every
neuron can't only be a passive machine, otherwise the mind as a whole
could never generate any new ideas or voluntary program-interrupts.
There would be no reason that we should feel like we have awareness at
all, since it's all going to be a chain of involuntary subroutines.

> I would
> say that they do, and the feeling that you have decided to do
> something is supervenient on the deterministic behaviour of the
> neurons. I don't think you agree, so could you explain what physical
> factors could move neurons other than the ones known to science?

It's begging the question. If I cite the ordinary and unimpeachable
fact that we feel that we make decisions, rather than addressing that
fact yourself and the complete failure of substance monism to account
for that feeling, you try to put me on the defensive. You're smuggling
metaphysical assumptions into my opinion to equate any idea of
sense as a primitive with witchcraft. The fact that the chemistry of
carbon, hydrogen, oxygen, and nitrogen does not explain a kangaroo does
not mean that a kangaroo doesn't exist or that it cannot be made out
of CHON.

Neurons move themselves. They are alive. Just like other single celled
animals. They build a complex network between them but that doesn't
mean that some mathematical abstraction of networkness controls them.
They participate. They have jobs to do. They are motivated by
sensorimotive experiences on their own level as well as the experience
of being transparent to the larger organism's motives and senses. It's
not metaphysical, it's just the part of physicality that's private.

> >> The neurons in the language centre
> >> (and everywhere else) would be getting the same signals from the
> >> artificial cortex as they would normally get, so they would fire in
> >> the same sequence as they normally would, so the muscles involved in
> >> speech would get the same signals they normally would, so you would
> >> say the same things you normally would.
>
> > That's the same fallacy three times. There is no "same". You cannot
> > make an artificial router that produced the 'same' signals as the
> > internet does. You are seeing the brain as some kind of clockwork
> > orange of replaceable parts. It's not like that. It's a giant sense
> > organ. It thinks, chooses, and feels because what it is made of can
> > also, in its own contextual version, think, choose, and feel.
>
> > The 'signals' are just what we can detect through a machine which does
> > not have the ability to do those things - it just gives you the white
> > light, not the spectrum of what's going on inside the 'signals'. You
> > cannot reproduce a prism of signals with white light signals, even
> > though they will both look the same on a black and white monitor.
>
> So in other words you think there is some other influence than the
> "clockwork" determining when the neurons fire. You don't think it's an
> immaterial soul; what else could it be?

It's plain old ordinary feeling. Being. Sense. Perception. Experience.
Significance. Pattern recognition. It's the qualitative, interior
complement of the external, objective, quantitative topology of all
phenomena. If you call it a soul, you make it a substance - a thing.
It's the opposite of a thing, it's the very phenomenology through
which all thingness persists. We can't think of it very well because
we ARE it. Perception isn't something we can perceive, but we can infer
its existence intellectually once we understand the dynamic of how it
presents external phenomena as that which is the ontological opposite
of itself.

> >> Of course I agree that if you have qualia, you know you have qualia.
> >> That is why I think it is impossible to make an artificial device that
> >> replicates the normal pattern of neuronal firings without also
> >> replicating the part those neurons play in consciousness.
>
> > I agree, but you are assuming that there is a such thing as a normal
> > pattern of neuronal firings. I'm saying, as are others here, that we
> > don't know what we're looking at. a does not = A. A router by itself
> > is not the internet. The collected works of Shakespeare (or Dan
> > Dennett if you like) are not emulable by simulating one of his neurons
> > 'firing pattern' any more than your career path can be predicted by
> > the 'firing pattern' of your arms and legs.
>
> Where do you get the idea that simulating a neuron involves simulating
> the entire universe that might affect the neuron? A biological neuron
> is not programmed with such information, it is just programmed with
> how to respond to stimuli.

It might be programmed to respond to stimuli in part, but a muscle
cell can do that. If you are going to rebuild your brain from cells
that can *only* respond to 'stimuli', then you can't expect it to feel
or care about anything. You can't expect a deep neurological
interiority.

> >> But if the artificial visual cortex sends the same neural signals to
> >> the rest of the brain, how could the rest of the brain notice that
> >> anything was different?
>
> > Because what you think is the 'signals' is not the only thing going
> > on. There's a civilization in there. Thousands of substances are being
> > produced and consumed. Do you really think that is necessary to make a
> > blob of grey jello send and receive electrical signals alone? If that
> > were the case then lightning strikes over the millennia might have
> > evolved into an atmospheric consciousness by now. There could be
> > simple cells of continuous electrical storms hovering over areas of
> > the ocean, reproducing and having lives.
>
> How is that an argument? It's like saying if jet aircraft could fly
> then jet engines would have evolved naturally in birds.

No, it's like saying that the fact that jet engines do not evolve into
birds demonstrates that flight alone does not define a flying object.
Just because a brain sends 'signals' doesn't mean that's all it is and
it doesn't mean that 'signalness' is all we are.

> > If I recorded your activities from a telescope in the Andromeda
> > galaxy, and studied the computer enhanced cartoon of what you do, I
> > would make assumptions about your physical environment because that's
> > what I see through a telescope. I might say that your normal working
> > state is to sit in a chair, so that chair makes you work, or your bed
> > is a resting state of your signal pattern.
>
> These might be reasonable hypotheses. You could devise experiments to test 
> them.

Haha. That is the reasoning of the True Believer. The devout substance
monist would rather entertain the idea that their careers and families
are generated by furniture than even consider for a second that there
might be a reason why it's absurd.

> >>>> If two identical twins differ mentally, then obviously this is because
> >>>> they differ physically in their brain configuration. My mental state
> >>>> is different today than it was yesterday, and there is less difference
> >>>> between my brain on two consecutive days than there would be between
> >>>> the brains of identical twins.
>
> >>> If I impersonate someone, does that obviously mean it's because I have
> >>> changed my physical brain configuration?
>
> >> Yes, of course! How could you change your mental state if your brain
> >> state stays the same?
>
> > That's circular reasoning. How is it that you think that my mind is
> > only the brain but the brain is not my mind? Also you're confusing
> > your levels of computation. We were talking about changes to the
> > hardware of the brain - identical twins, genetics, and now you're
> > conflating hardware with software ('brain states'). Consider conjoined
> > twins, who have the same genes in the same body, which by your
> > reasoning should produce the same brains.
>
> > How would you explain that the production of a brain from the same
> > genes is never the same, yet you say the production of a 'signal
> > pattern' from a neuron is going to be the same from neuron to neuron.
> > In other words, in conjoined twins, you have two brains formed from
> > genetically identical neurons, grown in the same body, yet the
> > character of the people who develop through those neurons is
> > verifiably and significantly different.
>
> Genetics gives a rough blueprint but the environment determines how
> the brain actually develops.

Genetics and environment certainly contribute, but the individual
determines nothing? This conversation is just between my genes/
environment and yours? To me the idea that everything in the universe
has a hand in making us do whatever we do, except for us, is, as I've
mentioned before, the height of anthropomorphism. The magic monkeys
are totally unable to determine anything in a universe of phenomena
which are constantly determining everything for us.

> But the general point is that every time
> you have a thought this is because there is a physical change in the
> brain.

There's no 'cause' there. They are the same thing. You having a
thought IS a physical change in the brain. There are top down semantic
influences and bottom up biochemical influences to both the content of
our experience and the activity of our neurology, but one is not an
end result of the other.

> There is no real distinction between hardware and software
> either in a brain or in a computer. "Programming" involves making
> physical changes in the device to achieve a certain purpose. A device
> can only do what it is "programmed" to do, i.e. it can only do what
> its starting configuration allows it to do.

There's no fixed ontological distinction between software or hardware,
but there is a relative relationship between the two. Software makes
physical changes to bridge the user's purposes with the hardware's
operation, just as the mind is the intermediary between (parts of) the
brain and the person whose brain it is.

> >> It's the same with a brain or computer. The environment acts on
> >> brain/computer state S1 at time T1 and results in brain/computer state
> >> S2 at time T2.
>
> > Why does the 'environment' get to act but brains and computers can
> > only react? Substance monism has it backwards. It is the subject who
> > chooses, determines, and acts. If it weren't, how could anything feel
> > like it were doing so? What would be the mechanical advantage of that?
>
> >  Both the brain and the computer are sensitive (and insensitive) to
> > their environment in different specific ways. It is the user of the
> > brain and the computer which interprets S1 (T1 is part of S1, not some
> > independent reality) and determines whether or not there will be an S2
> > and what that will be, based upon their accumulated experienceS and
> > the inherent qualities which they have preferred to use to integrate
> > them.
>
> I don't think you're actually disagreeing with me. There is no real
> distinction between brain/computer and environment, they are both
> parts of a larger system. But even within a brain you can arbitrarily
> draw a border around a structure and consider how that structure
> behaves in response to its internal state in conjunction with
> environmental inputs. These are the *only* two factors that can have
> an effect. Even if magic is involved, that is a kind of environmental
> input. What you call "interpreting" and "accumulated experience" is
> completely captured in the physical states.

It's not at all captured in the physical states. That would
necessitate a Cartesian homunculus. Chunks of images and sounds
floating around the gray matter. It doesn't physically exist - it
physically INsists. We need to expand physics to accommodate its
private, proprietary side. Right now physics only looks at the public,
generic side.

> It isn't something extra,
> as you seem to imply sometimes. That would be like saying that the
> computer added 2+3 because they are numbers rather than because of
> movement of charge on capacitors in a DRAM chip.

No, just the opposite. You're saying the computer added 2+3. I'm
saying there is only capacitors winding up and discharging. The
computer knows nothing about 2 or + or 3.

> >> When signals from the environment are processed, for example when an
> >> animal sees some food and through a series of neural events moves
> >> towards the food and starts eating, that is associated with awareness;
> >> at least when the animal is human, probably for other animals as well.
>
> > 'Seeing' is awareness. What kind of neural events move an animal
> > towards food and why does it invent us to pretend that we are aware of
> > that fact? So I ask again: What is information without awareness?
>
> There is a chain of physical events between light from the food
> reaching the animal's eyes and the animal moving over to eat the food.
> Those physical events result in awareness. Without those events, no
> awareness.

Untrue. An animal can have the same awareness in a dream, with no
events, no light from the food reaching their eyes. Those physical
events do not result in awareness. Awareness results in the experience
of those physical effects.

> Information, like beauty, manifests physically but
> ultimately is in the eye of the beholder. The beholder needs to have
> the right sort of physical events going on in their brain in order to
> appreciate the information. What have I left out?

The beholder first needs to be able to behold. Information comes into
being through sense, not the other way around. What's going on in the
brain is the mechanical symptom of perception, but the mechanism alone
is meaningless.

> >> If I am the result of these biochemical reactions how does it make
> >> sense to say that I am a puppet? It's like saying I have no power
> >> because I am being pushed around by myself. There is no separate "I"
> >> to be pushed around.
>
> > What 'result'? You are saying biochemical reactions in, biochemical
> > reactions out. Where in biology do you find yourself? Why is there
> > even a question about it? If your experiences were biological, then
> > you would find them under a microscope. Since you do not, then they
> > must either be metaphysical solipsistic 'emergent properties' in
> > Platonia, or biochemistry itself must feel, see, and have free will.
> > My view explains that biochemistry resolves that by realizing the
> > fundamental dynamic of sense as the most elemental principle of the
> > cosmos.
>
> If I think and feel when certain biochemical reactions occur, what do
> you call that?

I call that sensorimotive electromagnetism.

> It's a matter of taste whether you say that I am the
> biochemical reactions, I supervene on the biochemical reactions, I
> emerge as a result of the biochemical reactions, or I am "fundamental
> dynamic of sense" in the biochemical reactions. None of these phrases
> really add anything. I'm still me, and I want the biochemical
> reactions to continue running so that I can continue as me.

It's a matter of taste for informal contexts, but our whole point here
is getting down to the essentials of mind/body. All of those phrases
are true in different senses to me, but you are saying that only the
'I emerge as a result of biochemical reactions' is true, although you
don't specify what 'I' is if not a metaphysical voyeuristic
'illusion'.

> > Biochemistry does feel, but that feeling scales up qualitatively over
> > time on the inside as the complexity scales up across space
> > quantitatively on the outside. That's how it works. That's why you
> > don't find a homunculus in a Cartesian Theater, or communities of
> > talking rocks. We can't see the interiority of external phenomena for
> > the same reason that we can't see the externality of our own
> > psyche...because interiority is how the cosmos creates the ontology of
> > privacy.
>
> > I wish there was some way to put my understanding of this in a
> > physical form and then you could just install it in your brain, but I
> > can't. You have to reason it out for yourself.
>
> I wish I could see what you are getting at, because I don't understand
> your previous paragraph. I also don't understand your motivation to
> make a complicated theory going against Occam's Razor, which says the
> simplest explanation in keeping with the facts is the best.

Another loaded question fallacy. My view is unfamiliar so it must go
against Occam's Razor as well as the laws of physics. I think my view
is actually the simplest explanation possible which fully accounts for
subjectivity, objectivity, and their relationship to each other in the
universe.

> >> If the neurons won't do anything magical then their behaviour is
> >> defined by their biochemistry
>
> > You keep going back to that. It's like Linus' security blanket.
> > Biochemistry is just what we have figured out so far based upon a
> > particular set of tools and particular logical approach which has been
> > developed recently by domesticated primates. It doesn't define what
> > living organisms are or what they are capable of. You want to be
> > right. You want it to be 'my way or the highway'. I'm trying to show
> > you that it is not your way and it is not the highway. There's other
> > ways, you have only to be interested in them.
>
> Please state clearly: do you think your brain can do something which
> goes contrary to the deterministic (or perhaps partly random) physical
> processes that occur within?

You mean like do I think the brain can turn glucose into gold or
something? No. Do I think that ideas change the patterns of brain
activity from 'within' the psyche as well as from the external
environment in a sometimes uncomputable way? Yes.

>
> >  and is thus in theory predictable and
> >> can be modelled on a computer. They have a very wide repertoire of
> >> behaviour because they constitute a very complex system. The 26
> >> letters of the alphabet can only be used in 27^n different sentences
> >> of n or fewer letters (27 rather than 26 because we include a space as
> >> another symbol). That is a lot of possible sentences, but it is not
> >> infinite.
>
> > Why are you assuming n is not infinite? Circular reasoning. You are
> > inserting a limit and then citing the consequences of that limit as
> > proof that it's not unlimited. This is why I say that sooner or later
> > a computer brain reveals that it's not a natural brain, because in
> > the fullness of time the infinity of n exhausts the finite isomorphism
> > of functionalism. If the isomorphism is inexhaustible then the
> > simulation can only be the genuine instance - it is the original. (As
> > Stephen might say, the original is the best possible simulation of
> > what it is).
>
> N can increase without bound, but whatever it is only 27^n sentences
> are possible.

Why do you keep introducing arbitrary limits? A sentence can be
infinitely long, or sentences can be strung together in infinite
combinations. You don't even need an alphabet; you can just write
every digit of Pi.
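[As an editorial aside: the 27^n count Stathis cites is straightforward to check mechanically. A minimal sketch, using a toy 3-symbol alphabet (two letters plus a space) in place of the full 27:]

```python
from itertools import product

# Toy version of the argument: 2 letters plus a space stand in for
# the 26 letters plus a space.
alphabet = "ab "
n = 3

# All "sentences" of exactly n symbols; shorter sentences are
# represented by padding with trailing spaces.
strings = [''.join(s) for s in product(alphabet, repeat=n)]

print(len(strings))  # 27, i.e. len(alphabet) ** n
assert len(strings) == len(alphabet) ** n
```

The count grows without bound as n grows, but for any fixed n it is finite, which is exactly the point being disputed in the reply above.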

> A Turing machine has an infinite memory, but a Turing
> machine is a mathematical abstraction (the video linked to previously
> was not strictly that of a Turing machine, since its memory was
> limited). A human brain weighing about 1 kg has only a finite number
> of possible thoughts.

Then the universe weighing 1 universe has only a finite number of
possible forms and motions, so it too could be simulated by a Turing
Machine - including itself in infinite regress. It's absurd. Even a
single 'thought' cannot be considered finite, and the idea of counting
them by the pound is misguided.

> The brain can in theory be re-engineered to make
> it bigger but you would need an infinite brain to have infinite
> thoughts.

In theory? The brain gets bigger every few generations, doesn't it?
It's a total red herring, though. Bandwidth, sure. Brain size or size
ratio seems to correlate in some ways to the level of intelligence, to
the kinds of thoughts that can be expressed, but that has nothing to
do with an upper limit on the number of thoughts or feelings an
organism can have.

> >> Similarly, the brain has a very large number of possible
> >> thoughts, but not an infinite number. For it to have an infinite
> >> number of thoughts it would need to be infinite in extent.
>
> > Your network connection doesn't have infinite bandwidth. Does that
> > mean that you are going to run out of internet eventually? The
> > substance monist view of thought is a strawman. It's a delusion where
> > the brain is a sealed can of cellular automata. Discrete patterns
> > which can be isolated and reproduced. Nothing could be more opposite
> > of the truth.
>
> Over infinite time you could have a non-repeating stream of data over
> a low bandwidth connection, but the memory of your computer or your
> brain would only be able to hold a finite amount of this data stream.
> That means after a certain period your computer or brain would either
> fill or you would have to erase data and start again, so that the data
> would repeat. There is no other way without infinite memory.

So you have storage.
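[Editorial aside: the finite-memory claim being answered here is the pigeonhole principle. A deterministic system with finitely many states must revisit a state, and hence repeat, within one step more than the number of states. A minimal sketch, with a toy 8-state machine and an arbitrary update rule of my own invention:]

```python
def first_repeat(step, state, num_states):
    """Iterate a deterministic update over a finite state space.
    By the pigeonhole principle some state must recur within
    num_states + 1 steps; return (time first seen, time repeated)."""
    seen = {}
    for t in range(num_states + 1):
        if state in seen:
            return seen[state], t
        seen[state] = t
        state = step(state)

# Toy 8-state machine: trajectory is 0 -> 1 -> 4 -> 5 -> 0 -> ...
entry, repeat = first_repeat(lambda s: (3 * s + 1) % 8, 0, 8)
print(entry, repeat)  # prints "0 4": the state seen at step 0 recurs at step 4
```

This is what forces the "fill or erase and repeat" dilemma for any fixed-size memory; adding storage, as the reply suggests, only enlarges the state space without making it infinite.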

> >> True, most of what the neurons do does not directly manifest as
> >> consciousness; but all of what we experience as consciousness is due
> >> to what the neurons do.
>
> > Not what they do, but what they feel. Detecting something is easy.
> > Your skin detects light, but feels it as warmth. To see light, we need
> > to feel it with our eyes. If we could see with our skin, then we
> > wouldn't need eyes. Neurons feel both warmth and see light - and hear
> > and think, etc. but as a group. It's like the internet. There are at
> > least seven layers of conversation going on in this internet exchange.
> > Only one of those layers is accessible physically - Layer 1, wires,
> > chips, electronic components. Nothing above that is comprehensible in
> > physical terms alone. Reproducing switches, routers, SANs, servers,
> > etc won't give you anything more than warm metal unless you
> > understand not only packets and tcp/ip, authentication, http, web
> > browsers, and this particular group, but you have to understand users
> > and people and why they created and use the internet in the first
> > place.
>
> Yes, and in the brain there are just mundane physical processes which
> give rise to consciousness. The protein molecule or the neuron does
> not understand what is going on, but the network of neurons does.

No, the network is even dumber than the neurons. It's just a service
for the neurons, like Google. The internet arises out of human
purposes. It serves users, not just routers.

> > The idea that substance monism is invested in is that what we
> > experience on the internet is due to what the transistors do. That's
> > true in the most literal sense, but if the universe were that literal
> > then there would be no other sense possible...which is ironically the
> > most fantastically delusional fantasy of all. It is 'let's pretend
> > that we aren't REALLY in the universe, and let's pretend that there is
> > no meaning in the universe except for anything that would support the
> > idea that it has no meaning'. It just can't accept that there aren't
> > little particles of light and sound tucked away somewhere in the brain,
> > so it decides that the light and sound are an illusion, but somehow a
> > necessary one anyhow. And that these illusions are 'information', but they
> > can only be biochemistry. It's a mess.
>
> What we experience in interacting with the Internet is the qualia
> generated by the biochemical processes in our brains in response to
> the data that comes down the network cable and is processed by our
> computer.

So now you are saying that biological processes respond to 'data'
being processed by our computer. What is the biochemistry of 'data'?
Is it soluble in water or benzene?

> >> The observation is that we are made of matter and that we have
> >> feelings; therefore, putting matter together in a particular way can
> >> result in feelings.
>
> > Feelings to who? Where? If they aren't a physical precipitate that can
> > be collected in a test tube, and they aren't metaphysical logics in
> > Platonia, then from where does this 'result' emerge and where does it
physically play out? It's like saying that putting pixels together in
a particular way can result in TV shows. It can be interpreted in a
> > way that would be legally true, but the understanding is completely
> > false. TV is produced top down, not bottom up. It doesn't arise
> > spontaneously from a pool of FSM pixel possibilities.
>
> The TV show is not conscious and needs to be interpreted by a
> conscious entity to make sense.

YES! At last.

> Brains, on the other hand are
> conscious, and by way of deduction computers of a certain complex kind
> would also be conscious.

You mean by way of fallacious conflation. I'm pointing out how
complexity alone makes no difference. It's not a matter of complexity
- feeling is simple. Cells are simple. They grow, replicate, and die,
and in doing so they experience the fundamental unit of our
consciousness. Computers don't do that.

> > Neurons work the
> > same way. The ones we like to use are us. We push them around like
> > beads on an abacus when we want. The abacus does things back to us in
> > response.
>
> "We" push the neurons around? Again you seem to hint at a separate,
> non-physical soul.

We are the pushing. It's not separate, and it is physical, but not
currently recognized as physics. It's the private topology of physics.

Craig

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.
