On Sep 13, 9:25 pm, Jason Resch <jasonre...@gmail.com> wrote:

>
> Everything that can exist does, for there is no meta-rule prohibiting that
> object's existence.


I would call that a hasty generalization. Let's say it's the year
1066. Do cell phones exist in England? Is there a meta-rule
prohibiting their existence?

Have you ever considered that rules are created through existence as
well as existence being created through rules?

>
> > It's easy to assume that it helps, just as it's easy for me to assume
> > that we have free will. If we don't need our conscious mind to make
> > decisions, then we certainly don't need the fantasyland associated
> > with our conscious minds to help with that process. Think of building
> > a robot that walks around and looks for food and avoids danger. Why
> > would it help to construct some kind of Cartesian theater inside of
> > it?
>
> That "Cartesian theater" is a necessary consequence of the sensory data
> being perceived.  You can't have a perception without a perceiver.

It's not a matter of perception or no perception; it's a matter of
the depth and quality of the perception and the power and volition of
the sentience. A Cartesian theater-like experience is not a necessary
consequence of any mechanical process. If it were, then why would it
also be necessary to hide that experience in an invisible interiority?
Why not have an actual Cartesian theater where actual miniature images
and full sensory movies can be seen under a microscope? Why does this
impenetrable cloaking happen?

>
> > Functionally, there is no reasonable explanation for perception or
> > experience, especially if you believe in determinism.
>
> I disagree with this.  I don't see what adding randomness or capriciousness
> to the mix does for perception.

I expect that you disagree with this, but how do you justify that
disagreement rationally? What is a reasonable explanation for a
deterministic universe to develop a capacity to perceive itself? What
would be the point?

>
> You think that because consciousness is a mystery,

No, you think consciousness is a mystery. I think it's ordinary.
Amazing, but ordinary. It's just the opposite of objective matter in
space (subjective 'energy' through time).

> it must not involve
> anything we can explain or understand.

Not at all. I understand it, and my understanding explains it.

> Many mysteries have existed in the
> past and been answered.  This is a unique time where our knowledge can
> explain so much of ordinary occurances that the few things we lack
> sufficient explanation of appear to be insoluble, but don't let the present
> lack of an answer cause you to lose faith that such answers exist.

I have found the answer already. It's not a mystery to me.

>
>
>
> > > > > If you are non-deterministic,
> > > > > then nothing determines what you do and you are a slave to the roll
> > of a
> > > > > die.
>
> > > > Only if you a priori eliminate the possibility of free will and sense.
> > > > Then you are left with the options that make no sense and have no free
> > > > will.
>
> > > You have a will, which determines what you do, but your will in turn is
> > > determined by other things.
>
> > If it was completely determined by other things, then it's existence
> > would be redundant.
>
> The movement of cars is determined by the movement of atoms, but that
> doesn't mean cars are redundant.

Cars are redundant without human beings to drive them. They don't
exist as self-sustaining phenomena. Without us, they are just piles
of junk: atoms, molecules, maybe shelter for some feral cats or
insects.

>
> > The fact that it is influenced by other things
> > doesn't mean that it is completely determined by other things and
>
> So what could these other things be that also help in determining it?  How
> does it manifest physical effects with third-person visible consequences?

If you flip a coin, you flip both sides of the coin. They are
essentially the same thing, just with two opposite views.

>
> > Then what would there be a point of capturing them? If high level
> > complexity is entirely described by low level behaviors added
> > together, then what is the point of adding them?
>
> I think there are two reasons to do this.  One is for simplifying our world
> view.

Circular reasoning. It's like the Woody Allen story about the
psychiatrist who won't cure his patient who thinks he's a chicken
because he 'needs the eggs'. If we don't exist, then there is nothing
to appreciate a simplified worldview.

> If we had to explain everything, such as political issues, economies,
> biology in terms of the motions of atoms it would take forever and be
> impossible to understand.

No, it would take longer to take those atomic realities and then add a
continuously produced abstraction layer on top of them. We wouldn't
need to understand it, because we wouldn't need to understand
anything: we would be determined to be and know whatever the groups of
atoms end up pretending to be and know.

>It is the same reason we invent higher-level
> programming languages like LISP and Java rather than working in Machine
> code.

No, you're making a counterexample for me. We invent higher-level
languages for *our* convenience. It's actually much less efficient to
have to compile that object-oriented code into machine language. If
higher-level languages were functionally desirable in any way from the
machine's perspective, then we would focus our efforts on fabricating
chips that process natively in English or Chinese.

>In the end all these languages result in execution of machine code on
> physical hardware, but programmers are spared writing millions of lines of
> incomprehensible and error prone machine code.

Which would make sense if we had high-level programmers of our
neurons, but you say that we don't. You say it's all predictable
through atomic code. You're mixing up the needs and purposes of human
programmers, who appreciate the simplicity, with the needs and
purposes of any machine, which could not care less about simplicity.
You can make a machine that makes coffee by printing out the
Gettysburg Address every three seconds if you want; it's never going
to get tired of doing it that way. That's why it's 'like a machine'.
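To make the levels point concrete, here is a rough sketch in Java (the
variable names are invented purely for illustration, and the
pseudo-instructions in the comments are only an approximation; the real
instructions depend on the compiler and the CPU). The high-level line
exists for the human reader; what the hardware executes is only
something like the register-level steps:

    public class CompilationSketch {
        public static void main(String[] args) {
            int price = 3, quantity = 4, tax = 2;

            // What the programmer writes, for human readability:
            int total = price * quantity + tax;

            // What the machine actually runs is (roughly) a sequence of
            // register-level steps, something like:
            //   load price into a register
            //   multiply the register by quantity
            //   add tax to the register
            //   store the register into total
            System.out.println(total); // prints 14
        }
    }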

>
> The other purpose to speak of these higher levels is to realize that
> different patterns are, when seen from their level, identical, or nearly
> identical.  The you from a last month might have changed 90% of its matter,
> but your patterns are nonetheless mostly the same.  A computer made of wood
> and gears could likewise, be performing the same computation as another very
> differently organized computer.

I think you're on the right track, but for the wrong reason. If you
make up a fictional character, it may very well allow you to tell
stories from a different perspective than you would have otherwise,
but that presupposes that there are high level stories to tell in the
first place. Still, every high-level function could be accomplished
mechanically without any kind of perception being generated (if we
believed the world was purely mechanistic).

>
> > Isn't it obvious that
> > different levels of perception yield different novel possibilities?
> > That a ripe peach does something that a piece of charcoal doesn't?
> > That yellow is different from just a bluer kind of red?
>
> I believe that the sensations you describe are equivalent to certain
> computations.

What is equivalent? Is an apple equivalent to an orange? It's a matter
of pattern recognition. If you recognize a common pattern, you can
project equivalence, but objectively, there is no equivalent to
yellow. You either see it or it does not exist for you. No computation
can substitute for that experience. It has no equivalent. It can be
created in people who can see yellow by exposure to certain optical
conditions, but also maybe by pushing on the eyeball or falling
asleep. Yellow is associated with various computations, but it is not
itself a computation. It is a sensorimotive subjective presence.

>Thus consciousness, and computation are higher-level
> phenomenon, and accordingly can be equivalently realized by different
> physical media, or even as functions which exist platonically in number
> theory.

Human consciousness is a higher level phenomenon of neurological
awareness, which is a higher level phenomenon of biology, genetics,
chemistry, and physics. It is also a lower level phenomenon of
anthropology, zoology, ecology, geology, and astrophysics-cosmology.
Some psychological functions can be realized by different physical
media, some physical functions, like producing epinephrine, can be
realized by different psychological means (a movie or a book, memory,
conversation, etc).


>
>
>
> > > > How do you get 'pieces' to 'interact' and obey
> > > > 'rules'? The rules have to make sense in the particular context, and
> > > > there has to be a motive for that interaction, ie sensorimotive
> > > > experience.
>
> > > If there were no hard rules, life could not evolve.
>
> > 'Hard rules' can only arise if the phenomena they govern have a way of
> > being concretely influenced by them. Otherwise they are metaphysical
> > abstractions. The idea of 'rules' or 'information' is a human
> > intellectual analysis. The actual thing that it is would be
> > sensorimotive experience.
>
> Are you advocating subjective idealism or phenomenalism now?

I'm advocating a sense monism encapsulation of existential-essential
pseudo-dualism.

>
> > No. Just the fact of not occupying the same space as my body makes it
> > different.
>
> Not different in any way you could notice.

I would notice if someone that looked exactly like me was standing
somewhere else besides where I'm standing.

> If everything in the universe
> were shifted to the left by 10 meters, would this universe be different?

That's not possible, since space is inside of the universe, not
outside of it. Space is an abstraction we use to understand the
relation between objects.

> Would it affect your consciousness in any noticeable way?
>
> > The idea of two separate things being 'identical' is a
> > function of pattern recognition. Identical to who?
>
> > > > There is of course a strong correlation between physical and
> > > > psychological phenomena of a human mind/body, but that correlation is
> > > > not causation. Psychological properties can be multiply realized in
> > > > physical properties,
>
> > > This means you think other different physical forms can have identical
> > > psychological forms.  E.g., a computer can have the experience of red.
>
> > If the computer was made out of something that can experience red,
> > then sure.
>
> The human experience of perceiving red is equivalent to a certain
> computation.

What computation would that be? If I arrange milk bottles so that they
fall over in a pattern which is equivalent to that computation,
will the milk bottles see red? Will I see red if I look at the milk
bottles? How can you seriously entertain that as a reality?

> This computation could be performed by any kind of matter that
> can be arranged into a functional Turing machine.  This computation also
> exists in mathematics already.

I'm confident that no computation generated by a Turing machine is
equivalent to seeing red.

>
>
>
> > > > but physical properties can be multiply realized
> > > > in psychological properties as well. Listening to the same song will
> > > > show up differently in the brain of different people, and different
> > > > even in the same person over time, but the song itself has an
> > > > essential coherence and invariance that makes it a recognizable
> > > > pattern to all who can hear it. The song has some concrete properties
> > > > which do not supervene meaningfully upon physical media.
>
> > > Different physical properties can be experienced differently, but that's
> > not
> > > what supervenience is about.  Rather it says that two physically
> > identical
> > > brains experiencing the same song will have identical experiences.
>
> > Identical is not possible, but the more similar one thing is
> > physically to another, the more likely that their experiences will
> > also be more similar. That's not the only relevant issue though. It
> > depends what the thing is. A cube of sugar compared to another cube of
> > sugar is different than comparing twins or triplets of human beings.
> > The human beings are elaborated to a much more unpredictable degree.
> > It's not purely a matter of complexity and probability, there is more
> > sensorimotive development which figures into the difference. We have
> > more of a choice. Maybe not as much as we think, and maybe it's more
> > of a feeling that we have more choice, but nevertheless, the feeling
> > that smashing a person's head is different from smashing a coconut
>
> I hope you don't speak from experience. ;-)

If the universe was only arithmetic, what would be the difference?

>
> > deserves serious consideration in any kind of ontological formulation
> > of the universe and our place in it.
>
> > > > We understand the what and the how of electromagnetism, but to predict
> > > > what a brain would do you need to understand the who and the why of
> > > > sensorimotive perception.
>
> > > For this to be so, then the brain must do something that violates quantum
> > > electrodynamics.
>
> > No it doesn't. No more than the Taj Mahal must do something that
> > violated classical brick dynamics.
>
> > > > > I don't think Stathis, Bruno or myself are denying any of these
> > things.
>
> > > > Bruno isn't, he's just attributing them to arithmetic - which I
> > > > respect, but disagree with because of recent changes in how I think
> > > > about it.
>
> > > What was this recent change?
>
> > The realization that photons may not be real.
>
> The idea of real photons made you more accepting of Bruno's theory?

Questioning the conventional wisdom assumption of real photons gave me
the basis for a sense-based cosmos rather than the pattern-based cosmos
I had previously subscribed to, which is essentially comp.

>
>
>
> > > > You and Stathis though, are pure substance monists. Anything
> > > > that doesn't show up in a billiard ball model must be magic.
>
> > > > > I don't think it disqualifies the gliders or guns of the Game of Life
> > to
> > > > admit
> > > > > that they are complex phenomena which nonetheless follow from simple
> > > > rules.
> > > > > A lone electron is a simple thing,
>
> > > > It's not a simple thing. It's a simple name for a very complex
> > > > phenomenon. The gliders or guns of the Game of Life aren't things at
> > > > all in themselves. They're experiences of a people using a GUI.
>
> > > > > but when you have ~10^25 of them confined
> > > > > to the size of a human skull moving in immensely complicated shapes
> > and
> > > > > patterns no one can use the fact that the electron is simple to
> > > > disqualify
> > > > > the awesome complexity and wonder of the mind.
>
> > > > If the electron truly were simple, and it had only simple physical
> > > > functions, then it would disqualify the mind from existing.
>
> > > You think an electron has to have the capability to feel in order for the
> > > human brain to be able to feel?
>
> > It doesn't have to have the capability to feel like a human being
> > feels, no, but it has to have some kind of capability to detect and
> > change according to that detection.
>
> Logic gates in a computer can detect and change according to their
> detection.  If this ability forms the "atom" of experience, then by
> extension, computers possess the appropriate building blocks to build any
> form of experience.

Maybe, but I think that the computer might have to assemble those
building blocks into the experiences of actual living cells and
organisms first. Our consciousness is a community of a specific kind
of organic subjective agent. These agents perform logical functions,
but they are not limited to them. It's like saying we could build the
Great Barrier Reef out of Play-Doh... maybe in theory, but not really.

>
>
>
> > > > Nothing
> > > > about the physical functions of the brain, neurons, or electrons we
> > > > observe suggest the existence of a mind.
>
> > > The particles in the brain model their external reality,
>
> > In what way? Where is this model located?
>
> In the patterns of the neuron
> firings: http://www.youtube.com/watch?v=MElU0UW0V3Q

That's cool technology, but the model being used is developed by
researchers. The patterns of the neurons are the experiences of the
person, not a model of them. They are the physical presentation that
corresponds to the psychological presentation. We have to reverse
engineer a Rosetta Stone of code equivalence to match up our first
person experience with the third person measurements. Without the
first person reports, there would be no suggestion of a mind.

>
> > The forest could be modeling
> > intergalactic p0rn for all we know, but without any experience of the
> > result of that 'model', we can't really say that is what the brain is
> > doing at all. The brain is just living cells doing the things that
> > living cells do.
>
> > > analyze patterns,
> > > process sensory information, digest it, share it with other regions, and
> > > enable the body to better adapt and respond to its environment.
>
> > The immune system does that too. The digestive system. Bacteria does
> > that.
>
> For all you know, those systems could be conscious.  The Craig Weinberg I am
> communicating with on this list is not Craig Weinberg's immune system, so I
> have no way to ask your immune system if it is conscious.

Oh, I agree. I think that awareness of different sorts is in
everything, but it wouldn't automatically be that way just to fulfill
functional purposes. Even if there were a functional advantage, there
isn't any functional material process which would or could discover
awareness if it wasn't already a built-in potential.

>
>
>
> > >These
> > > behaviors and functions suggest the existence of a mind to me.
>
> > Only because you have a mind and you are reverse engineering it. If a
> > child compares live brain tissue under a microscope to pancreas tissue
> > or bacteria under a microscope, they would not necessarily be able to
> > guess which one was 'modeling' a TV show and which was just producing
> > biochemistry.
>
> If you zoom in on anything too much you crop out all the context.  If you
> zoomed in to the point where all you could see is a silicon atom, you have
> no idea if it is part of an integrated circuit or a grain of sand on a
> beach.

So what context would you have to zoom out from or in to before the
existence of a mind presents itself in the absence of any pre-existing
notion of 'mind'? Like what pattern besides red would make you see red
if you had never seen it?

>
> > The suggestion of a mind is purely imaginary, based upon
> > a particular interpretation of scientific observations.
>
> When we build minds out of computers it will be hard to argue that that
> interpretation was correct.

Ah yes. Promissory Materialism. Science will provide. I'm confident
that the horizon on AGI will continue to recede indefinitely like a
mirage, as it has thus far. I could be wrong, but there is no reason
to think so at this point.

>
>
>
> > > > > > Saying that a structure 'determines' the way
> > > > > > something 'responds' has no explanatory power at all. It's taking
> > for
> > > > > > granted the ability of structures to 'determine' and 'respond' as
> > if
> > > > > > those were logical expectations to have of shapes of matter. It
> > takes
> > > > > > for granted that the existence of a thing which has some reason to
> > > > > > determine or respond to anything without making that thing
> > explicit.
> > > > > > What good is input and output to something that is nothing but
> > input
> > > > > > and output? Nothing makes sense without sense itself, and nothing
> > > > > > which exists makes no sense.
>
> > > > > Take a thermostat and give it the ability to look over and compare
> > its
> > > > > current temperature and past temperatures, to attempt to predict the
> > > > future
> > > > > based on the trends and patterns of temperatures, to talk about these
> > > > > considerations and its present state, and you will have trouble
> > denying
> > > > that
> > > > > this thermostat really does have sense in the same meaning of the
> > word as
> > > > we
> > > > > do.
>
> > > > I have no trouble denying that. The thermostat element does have sense
> > > > of local temperature, and the computer required to record and analyze
> > > > statistical trends has sense of being an electronic device opening and
> > > > closing circuits according to the natural inclinations of it's
> > > > material components to store and discharge current,
>
> > > I could say the same things about the processes in your brain.
>
> > but you can't say the same thing about the processes in your own mind.
> > It's true that nobody can say for sure what a thermostat experiences,
> > but I have never claimed to know that, I only suggest what seems
> > rational to me based on the fact that thermostats don't seem to do
> > anything other than what we expect them to do.
>
> > > > but the two things
> > > > have no sense of each other.
>
> > > This is your guess.
>
> > Yes. I think it's a pretty good one though. It explains why
> > thermostats don't tell you move to a warmer climate or conspire with
> > the other appliances in your house to take better care of them.
>
> > > > The computer doesn't know what
> > > > temperature is at all, and the thermostat doesn't know what a computer
> > > > is at all.
>
> > > A 2 year old doesn't know what temperature is, or understand what a brain
> > > is, but one can still tell the difference between hot and cold.
>
> > Being able to tell the difference between hot and cold is the two year
> > old's version of knowing what temperature is. A computer doesn't know
> > the difference between hot and cold, but if it's connected to
> > something that changes measurably under different thermal conditions,
> > then we can use the computer to report the status of it's connection
> > to that thing. The thermostat's version of the difference between hot
> > and cold is not likely, in my view, to be very similar to a person's
> > view, because a strip of metal is so different from a trillion cell
> > living being
>
> I think your analogy is in error.  You cannot compare the strip of metal to
> the trillion cell organism.  The strip of metal is like a red-sensing cone
> in your retina.  It is merely a sensor which can relay some information.
> How that information is interpreted then determines the experience.

Aren't you just reiterating what I wrote? "because a strip of metal is
so different from a trillion cell living being"

>
> > (I doubt we share the same sense of humor with
> > thermostats either).
>
> > > > In contrast, we understand what temperature means to us and why we
> > > > care about it.
>
> > > An appropriately designed machine could care about it too.
>
> > Why do you think that a machine can care about something?
>
> We do.  And we are molecular machines.

We are also sentient human beings. It's only the subjective view of
the thing as a whole that cares, not the vibrating specks that make up
the tubes and filaments of the monkey body.

>
> > Just because
> > we have mechanistic properties doesn't mean that caring can be
> > accomplished through mechanism.
>
> It is not obvious how a machine comes to care, because we are immensely
> complicated machines.

Complexity alone doesn't care either.

>
>
>
> > > > We actually feel something, we have an opinion about
> > > > the temperature and what it means in terms of our comfort or planning
> > > > our activities etc. The thermostat, even with advanced predictive
> > > > computation capacities, has no comparable sense or experience.
>
> > > It could have experience, even if it was not necessarily comparable to
> > > yours.
>
> > Exactly. I think that is the case.
>
> > > Also, I thought you said everything, even water molecules, have sense and
> > > experience.
>
> > Yes, I think they do. It might be totally orthogonal to our own
> > experience... maybe molecules all have one aggregate experience
> > somehow, or maybe they have a very shallow pool of qualia, etc.
>
> > > > It has
> > > > no general intelligence or ability to question it's programming -
>
> > > It could.
>
> > Theoretically maybe, but it does not seem to be the case.
>
> > > > it
> > > > just blindly takes measurements and plugs them into logic circuitry
> > > > without ever knowing what logic is.
>
> > > > > I think your theory is the result of assuming awareness is as simple,
> > > > plain,
> > > > > and fundamental as it seems.
>
> > > > My theory is not assuming awareness is simple, it is deducing that
> > > > fact after long and careful consideration of the alternatives.
>
> > > This is giving up hope of understanding it.
>
> > No, I think that it's embracing the reality that we understand it
> > already.
>
> > > > > An fMRI could not, but a model of your brain and surrounding
> > environment
> > > > at
> > > > > the level of QED could, unless you think QED is wrong.
>
> > > > Another false dichotomy. If someone in my surrounding environment
> > > > looks at me a certain way, my perception of that presents me with
> > > > feelings and possibilities for interpretations of the look and the
> > > > feelings. QED has no capacity to address phenomena like that and
> > > > therefore fails spectacularly at predicting what I'm going to be
> > > > thinking of, yet does not make QED 'wrong'.
>
> > > Then you require that electrons do things not predicted by QED.
>
> > Does QED predict that electrons tell jokes and produce TV shows?
>
> If it could not, then it would be refuted.

Why? QED is a narrow specialty of microcosmic phenomenology. It has no
predictive or explanatory power over human scale phenomena.

>
>
>
> > > > > > This view of the psyche as being the inevitable result of sheer
> > > > > > biochemical momentum is not even remotely plausible to me.
>
> > > > > Why?
>
> > > > Because it doesn't take into account that there is an experience
> > > > associated with the psyche which is dynamically changing the
> > > > biological momentum from the top down from moment to moment.
>
> > > > > > It denies
> > > > > > any input/output between the mind and the outside world
>
> > > > > There definitely is interaction between the mind and its environment.
>
> > > > Then there can't be any model of prediction based on just biochemical
> > > > default behaviors.
>
> > > You don't seem to get what is meant by supervenience.
>
> > Nothing 'is meant' by supervenience. I may not get what you mean by
> > supervenience. It's a tricky term because if A supervenes on B it
> > means that B defines or controls A but A does not define/control B.
> > It's easy to get it mixed up.
>
> It is much easier to think of it in terms of examples.  I can run microsoft
> word on a computer using an AMD CPU on Windows, or I could run it on an
> Intel CPU on a Mac.  Microsoft Word supervenes on both the Intel and AMD
> CPUs.  The instances of the two programs are equivalent, despite the
> different underlying hardware.  That's all it is.

I think you have it backwards (I used to think it was used that way
too, but someone corrected me; Stephen, actually).

http://plato.stanford.edu/entries/supervenience/

A set of properties A supervenes upon another set B just in case no
two things can differ with respect to A-properties without also
differing with respect to their B-properties. In slogan form, “there
cannot be an A-difference without a B-difference”.
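Spelled out symbolically, a minimal rendering of that definition (with
x and y ranging over the things being compared, P over A-properties,
and Q over B-properties):

$$ A \text{ supervenes on } B \iff \forall x\, \forall y\, \Big[ \big(\forall Q \in B:\; Q(x) \leftrightarrow Q(y)\big) \rightarrow \big(\forall P \in A:\; P(x) \leftrightarrow P(y)\big) \Big] $$

In words: if two things agree on all of their B-properties, they must
agree on all of their A-properties.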

 "There cannot be a Microsoft Windows difference without an Intel chip
difference". To say that Windows determines what the chip does you
would say that Intel and AMD chips both supervene upon Windows. It
seems backwards at first but it sort of makes sense, sort of a synonym
for 'rely upon'. It's still kind of an odious and pretentious way to
say something pretty straightforward, so I try to just say what I mean
in simpler terms.

>
>
>
> > > > > > and reduces
> > > > > > our cognition to an unconscious chemical reaction.
>
> > > > > If I say all of reality is just a thing, have I really reduced it?
>
> > > > It depends what you mean by a 'thing'.
>
> > > Does it?
>
> > Of course. If I say that an apple is a fruit, I have not reduced it as
> > much as if I say that it's matter.
>
> How you choose to describe it doesn't change the fact that it is an apple.

I think the exact opposite. There is no such fact. It's only an apple
to us. It's many things to many other kinds of perceivers on different
scales. An apple is a fictional description of an intangible,
unknowable concordance of facts.

> Likewise, saying the brain is a certain type of chemical reaction does not
> devalue it.  Not all chemical reactions are equivalent, nor are all
> arrangements of matter equivalent.  With this fact, I can say the brain is a
> chemical reaction, or a collection of atoms.  Neither of those statements is
> incorrect.

I don't have a problem with that. You could also say the brain is a
certain type of hallucination.

>
>
>
> > > > > Explaining something in no way reduces anything unless what you
> > really
> > > > value
> > > > > is the mystery.
>
> > > > I'm doing the explaining. You're the one saying that an explanation is
> > > > not necessary.
>
> > > Your explanation is that there is no explanation.
>
> > Not really.
>
> An explanation, if it doesn't make new predictions, should at least make the
> picture more clear, providing a more intuitive understanding of the facts.

I think that mine absolutely does that.

>
>
>
> > > > > Also, I don't think it is incorrect to call it an "unconscious
> > chemical
> > > > > reaction".  It definitely is a "conscious chemical reaction".  This
> > is
> > > > like
> > > > > calling a person a "lifeless chemical reaction".
>
> > > > Then you are agreeing with me. If you admit that chemical reactions
> > > > themselves are conscious,
>
> > > Some reactions can be.
>
> > > > then you are admitting that awareness is a
> > > > molecular sensorimotive property and not a metaphysical illusion
> > > > produced by the brain.
>
> > > Human awareness has nothing to do with whatever molecules may be feeling,
> > if
> > > they feel anything at all.
>
> > Then you are positing a metaphysical agent which supervenes upon
> > molecules to accomplish feeling. (which is maybe why you keep accusing
> > me of doing that).
>
> Yes, the mind is a computation which does the feeling and it supervenes on
> the brain.

Why does the computation need to do any feeling?

>
>
>
> > > > > If when you think of a chemical reaction, you think of a test tube
> > filled
> > > > > with a liquid and then you equate it with cold, lifeless simplicity,
> > and
> > > > you
> > > > > extend that association to other chemical reactions such as life
> > itself
> > > > and
> > > > > think that life is thereby diminished, it is not the theory that
> > needs
> > > > > adjustment but rather the assocations you are making in your mind.
>
> > > > No, I'm seeing it the other way around. I'm saying everything has some
> > > > degree of awareness, but that doesn't mean that everything has the
> > > > same awareness. Molecules have different properties which make
> > > > different kinds of sense in conjunction with other molecules. That is
> > > > the same thing as life, just not as elaborate. It's only cold and
> > > > lifeless in comparison to ourselves because we've taken our particular
> > > > zoological development to a ridiculous extreme.
>
> > > > > Machine's aren't cold, calculating, dim-witted, logical, unfeeling,
> > or
> > > > > polite.  This is just a personal bias you have developed over a
> > lifetime
> > > > > with working with rather simplistic man-made machines.  To think all
> > > > > machines are like this is the same mistaking as thinking all chemical
> > > > > reactions are dead or unconscious.
>
> > > > You think that I don't know what you're saying, but I do. I thought
> > > > that myself for many years. I understand why it's appealing. I get
> > > > completely the beauty of how self-similarity and complexity scale up
> > > > seamlessly from something which might seem mechanical to us to
> > > > something which seems natural, and how it is just the unfamiliarity
> > > > that makes it seem different to us. That's all true, but you don't get
> > > > that I'm not talking about that. I'm talking about the fact that
> > > > without awareness, no pattern is possible at all.
>
> > > An unheard tree falling in the woods.
>
> > Yes, but the organisms that make up the woods would have awareness
> > beyond hearing it. On an ontological level, pattern cannot be a
> > pattern without some form of pattern recognition. They are two parts
> > of the same thing.
>
> > > > Just because natural
> > > > processes can be modeled mechanistically doesn't mean that creating
> > > > machines based on those models won't be cold and unfeeling. I think
> > > > that any model which tries to work with awareness but does not take
> > > > awareness into account is fatally flawed.
>
> > > Awareness exists but I don't think its not a physical property.
>
> > That's fine, but you would need to explain what kind of property it
> > is.
>
> It is a property of information and/or information processing systems.

Why have we not seen a single information processing system indicate
any awareness beyond that which it was designed to simulate? What kind
of awareness does a book have without a reader? Information is
something I used to assume could exist on its own, but now it's like
a glaring red Emperor's New Clothes to me. A brick is nothing but
'information' and information is really the brick. Um, yeah. I
understand the appeal, but it's a figment of a 21st century Occidental
imagination.

>
> > How does it come to affect physical things?
>
> Because the aware systems we are familiar with are supervening on physical
> objects.

So because awareness needs physical objects, that means objects are
affected by awareness? But then somehow that doesn't mean that human
awareness affects our neurological behaviors?

>
>
>
> > > > > > If that were the
> > > > > > case then you could never have a computer emulate it without
> > exactly
> > > > > > duplicating that biochemistry. My view makes it possible to at
> > least
> > > > > > transmit and receive psychological texts through materials as
> > > > > > communication and sensation but your view allows the psyche no
> > > > > > existence whatsoever. It's a complete rejection of awareness into
> > > > > > metaphysical realms of 'illusion'.
>
> > > > > I think you may be mistaken that computationalism says awareness is
> > an
> > > > > illusion.  There are some eliminative materialists who say this, but
> > I
> > > > think
> > > > > they are in the minority of current philosophers of mind.
>
> > > > How would you characterize the computationalist view of awareness?
>
> > > A process to which certain information is meaningful.  Information is
> > > meaningful to a process when the information alters the states or
> > behaviors
> > > of said process.
>
> > What makes something a process?
>
> Rules, change, self-reference.

What makes something a rule, or a change, or a self, or a reference?

>
> > Are all processes equally meaningful?
>
> No.
>
>
>
> > > > What makes the difference between something that is aware and
> > > > something that is not?
>
> > > Minimally, if that thing possesses or receives information and is changed
> > by
> > > it.  Although there may be more required.
>
> > We are changed by inputs and outputs all the time that we are not
> > aware of.
>
> There may be other conscious parts within us which are disconnected from the
> conscious part of us which does the talking and typing.  For example, your
> cerebellum performs many unconscious calculations affecting motor control,
> but is it really unconscious?  Perhaps its information patterns and
> processing are merely not connected to the part of the brain which performs
> speech.  Similarly, a bisected brain becomes two minds by virtue of their
> disconnection from each other.

I agree, but it doesn't explain why the inputs and outputs we are
aware of are different from those we are not aware of.

>
>
>
> > > > It seems to me to be obscured behind a veil of
> > > > general 'complexity'.
>
> > > I think simple awareness can be simple, but complex awareness, like that
> > of
> > > humans, is highly complex.
>
> > I agree.
>
> > > > > If the atoms always follow these laws, and we can come to know these
> > > > laws,
> > > > > then in principle a computer programmed to follow these laws can tell
> > us
> > > > how
> > > > > a particular arrangement of atoms will evolve over time.  Do you
> > agree
> > > > with
> > > > > this?
>
> > > > No. If bricks always follow certain laws, and we can come to know
> > > > these laws, then in principle a computer programmed to follow these
> > > > laws can tell us how a particular pile of bricks will be assembled
> > > > over time.
>
> > > If you are assuming humans or other things involved in assembling those
> > > bricks, then your model is incomplete.
>
> > That's why QED is incomplete for understanding human consciousness.
>
> > >  In my example I used a simulation of
> > > atoms, because what besides atoms and their interactions between each
> > other
> > > do you think is important in the functioning of a brain?
>
> > The experiences that the brain has of the world, and the impact of
> > material conditions of that world on the brain (on a chemical,
> > biological, zoological, and anthropological level as well as atomic).
> > If you dwell on certain thoughts, you will change the structure of
> > your brain.
>
> > > > Do you agree with that? Can you detect the blueprint of a
> > > > future Taj Mahal from the mechanics of how random stones fit together?
>
> > > Not stones, but perhaps atoms.
>
> > What's so special about atoms that they are hiding Taj Mahals but not
> > stones?
>
> This world is made of more things than stones, but not more things than
> atoms (and the force carrying particles which govern their interactions).

Ok, but the Taj Mahal is mainly made of stone. Either way, the
dynamics of stones or of atoms won't ever get you closer to predicting
the shape of the Taj Mahal than anything else will.

>
>
>
> > > > Human consciousness is a specific Taj Mahal of sensorimotive-
> > > > electromagnetic construction. The principles of it's construction are
> > > > simple, but that simplicity includes both pattern and pattern
> > > > recognition.
>
> > > Pattern and pattern recognition, information and information processing.
> > > Are they so different?
>
> > Very similar yes, but to me information implies a-signifying
>
> Could you define "a-signifying" for me?

Meaning that the information has no meaning to the system processing
it. A pattern of pits on a CD is a-signifying to the listener, and the
music being played is a-signifying to the stereo. In each case,
fidelity of the text is retained, but the content of the text is
irrelevant outside of the context of its appropriate system. A TV set
isn't watching TV, it's just scanning lines. That's information:
handling data generically without any relevant experience.
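As a toy illustration of that generic handling (a sketch only; the
class and method names are invented, and this is not a claim about how
any real CD player or TV works), the relay below copies bytes with
perfect fidelity while knowing nothing about whether they encode
music, scan lines, or a word:

    import java.util.Arrays;

    public class ASignifyingRelay {
        // Copies the bytes faithfully without interpreting them in any way.
        static byte[] relay(byte[] input) {
            return Arrays.copyOf(input, input.length);
        }

        public static void main(String[] args) {
            byte[] cdFrame = {12, -7, 88, 3};   // could be audio samples
            byte[] word = "yellow".getBytes();  // could be text
            System.out.println(Arrays.equals(cdFrame, relay(cdFrame))); // true
            System.out.println(new String(relay(word)));                // yellow
        }
    }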

>
> > computation. You don't need to know what information means to process
> > it.
>
> Well your neurons don't know what their pulsing means either, yet their
> processing results in recognition of patterns.

No, it doesn't result in anything. The pulses are just the back side
of the pattern recognition. Tails doesn't result in heads; they just
live on opposite sides of the same coin.

>
> > To recognize a pattern, you have to actually make some sense of it
> > yourself. Information is 3-p generic. Pattern is more general, 1-p
> > sensations as well as 3-p measurements.
>
> > > > > The words we know how to utter exist as patterns encoded by neurons
> > in
> > > > the
> > > > > brain.
>
> > > > That's not the same thing. Without actually uttering them, they are
> > > > not words.
>
> > > They're not spoken words, but they're words written in some neural
> > language
> > > just like words written on paper.
>
> > They're biochemical texts in a neurological context, so they may be a
> > kind of word equivalent to neurons, but they are not words in any
> > human sense. You think that they are encoded neurolgically because
> > your mind decodes when it reads, but there is no place for a
> > translator to intervene. That neural language is not translated into
> > any other language, it is just experienced differently from the inside
> > than it can be observed from the outside. Not just differently, but in
> > the ontologically complementary way.
>
> > > > A pattern on a DVD is not a movie unless it's played on a
> > > > DVD player and watched by a human being. If we found an alien corpse,
> > > > there is no way we could extract any words from any patterns in the
> > > > brain, nor would we find any mechanism which encodes and decodes any
> > > > such 'patterns' into words.
>
> > > The mechanism would exist in the alien brain, and it could be found.
> > > Assuming the brain was sufficiently preserved.
>
> > No, that would mean a Homunculus hiding in the tissue of the brain
> > scribbling out translations.
>
> What I meant is we could find the software in the brain which translates for
> us.  The brain contains both the DVD and the DVD player, in the sense that
> it can decode, understand, and output (by vocal cords or other means) what
> is stored within.

I understand what you mean, but you aren't seeing that the DVD player
is useless without an audience. If a person were nothing but a brain,
it would need no translation. If it needs a translation, then the
person can't be the same thing as the brain; plus it needs some
plausible before and after region of the brain, but we find only the
before. There is no evidence of translation into anything. It's all
just neurological cellular activity with no signs of human experience
or translation to any form at all other than neurological activity.

>
> > > The Game of Life can contain Turing machines, and by extension it could
> > > contain any possible program, including a simulation of an evolution of
> > an
> > > artificial life form, or perhaps even you and me having this
> > conversation.
> > > One could translate Bruno's dovetailer into the game of life and then it
> > > would contain everyone.
>
> > I understand why you believe that, but you don't understand why I'm
> > pretty sure that's never going to happen.
>
> What is to stop someone from doing it?

Failure. It is what awaits anyone who attempts to turn lead into gold
(well, without blasting it down to protons and assembling it from
scratch; the equivalent of which would be replicating brains using
stem cells, which is not at all out of the question).

>
>
>
> > > > > > There is a
> > > > > > difference which arises partly from complexity, but also as a
> > > > > > consequence of qualitative differences between different inertial
> > > > > > frames. More degrees of freedom, with more non-computational
> > > > > > considerations come into play.
>
> > > > > What are some examples of these non-computational considerations?
>
> > > > Feeling. Memory. Choice. Imagination. Humor. Emotion. Strategy.
> > > > Insight. Vision. Genius.
>
> > > Feeling: http://www.youtube.com/watch?v=cHJJQ0zNNOM&t=35s AND http://www.youtube...
>
> > Impressive, but still just HADD/prognosia.
>
> > > Memory: http://hothardware.com/News/IBM-Building-120PB-Drive-Worlds-Largest/
>
> > That's not human memory, it's just data storage.
>
> > > Choice: http://en.wikipedia.org/wiki/Automated_trading_system (Computers
> > > make over 60% of the trading decisions on the US stock market)
>
> > That's just computation. There is no personal preference there.
>
> Maybe not personal, but the computer is making a choice: should I buy or
> sell X?

A choice is being made from the 3-p view, but that isn't the one that
matters. The computer has no knowledge of its choices. It's just
executing an instruction set.
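For what it's worth, the kind of 'choice' such a system makes can be
written as a trivial branch (a made-up toy rule, not how any actual
trading system works): from the outside it 'decides' to buy or sell,
but internally it is only comparing two numbers.

    public class TradingRuleSketch {
        // A toy rule: buy below the moving average, sell above it.
        static String decide(double price, double movingAverage) {
            if (price < movingAverage) return "BUY";
            if (price > movingAverage) return "SELL";
            return "HOLD";
        }

        public static void main(String[] args) {
            System.out.println(decide(98.0, 100.0));  // BUY
            System.out.println(decide(103.0, 100.0)); // SELL
        }
    }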

>
>
>
> > > Imagination: http://www.miraclesmagicinc.com/science/man-builds-inventing-machine....
> > > Humor: http://news.bbc.co.uk/2/hi/technology/5275544.stm
> > > Emotion: http://www.youtube.com/watch?v=qSmlKAly1UE (I would argue the
> > > herbivores might feel something like fear when being chased by a
> > > predator)
> > > Strategy: http://en.wikipedia.org/wiki/Deep_Blue_%28chess_computer%29
> > > Insight: http://developers.slashdot.org/story/11/09/11/2051222/Has-Cleverbot-P...
> > > Vision: http://www.bookofjoe.com/2007/08/can-you-spot-th.html
> > > Genius: http://en.wikipedia.org/wiki/Watson_%28computer%29
>
> > I'll come back to these later, but there's nothing new here for me.
>
> You say that without looking at them?  Perhaps your mind is no longer open.

I can see from the links that they are mostly things I'm familiar
with, but more importantly I see the underlying theme, which is a
category error. It's all pursuing this line of thought that says that
briefly fooling casual observers is evidence of sentience, which I
reject completely as HADD/prognosia. I admit, my mind is no longer
open to substance monism, barring some truly surprising development. I
appreciate the effort to link though; they are good ones. I can just
see that they all fit into the same category of mistaking or
insinuating form for content. You'll know when we make a machine that
can feel, because it will start killing people of its own volition.

>
> > It's conflating certain superficial kinds of intellectual processes
> > that can be electronically with human experience (which I maintain
> > cannot be reproduced that way).
>
> > > >  Qualia have no physical properties, they have sensorimotive-
> > > > perceptual-experiential properties. A mind can't perceive a new type
> > > > of color because color is the mind itself.
>
> > > > > I think the qualia, like the meaning in the conversation is dependent
> > on
> > > > the
> > > > > mind.  It is built up from other lower level phenomena.
>
> > > > Yes! Lower level sensorimotive phenomena. Not neurological structures
> > > > but the feelings that insist within and through those structures.
>
> > > If the feelings can't change the momentum or direction of a particle,
> > then
> > > they have no causal role and I don't see the point in including them in
> > > one's model of reality.
>
> > The feelings *are* the momentum and direction of particles, and they
> > and and do have a causal role in determining their own patterns on
> > their own level as well as levels below or above them. Our momentum
> > and direction influences civilization. Civilization's momentum and
> > direction influences us. Same with our neurons and their molecules.
>
> The actions of a civilization can always be explained in terms of the
> actions of the individuals within that civilization.

Events affect civilizations as a whole: natural disasters, wars,
plagues, etc., in ways that transcend explanations in terms of
individual actions. You can talk about cultural patterns without
discussing individual people's actions. You can say 'art became more
abstract during the 20th century'.

Craig
