On Sat, Nov 27, 2010 at 12:49 PM, Rex Allen <rexallen31...@gmail.com> wrote:

> On Thu, Nov 25, 2010 at 7:40 PM, Jason Resch <jasonre...@gmail.com> wrote:
> > On Thu, Nov 25, 2010 at 3:38 PM, Rex Allen <rexallen31...@gmail.com>
> > wrote:
> >>
> >> But I also deny that mechanism can account for consciousness (except
> >> by fiat declaration that it does).
> >>
> >
> > Rex,
> > I am interested in your reasoning against mechanism.  Assume there
> > were a mechanical brain composed of mechanical neurons, which contained
> > the same information as a human brain and processed it in the same way.
>
> I started out as a functionalist/computationalist/mechanist but
> abandoned it - mainly because I don't think that "representation" will
> do all that you're asking it to do.
>
> For example, take mechanical or biological brains: it seems entirely
> reasonable to me that the contents of my conscious experience can be
> represented by quarks and electrons arranged in particular ways, and
> that by changing the structure of this arrangement over time in the
> right way one could also represent how the contents of my experience
> change over time.
>
> However, there is nothing in my conception of quarks or electrons (in
> particle or wave form) nor in my conception of arrangements and
> representation that would lead me to predict beforehand that such
> arrangements would give rise to anything like experiences of pain or
> anger or what it's like to see red.
>
> The same goes for more abstract substrates, like bits of information.
> What matters is not the bits, nor even the arrangements of bits per
> se, but rather what is represented by the bits.
>
> "Information" is just a catch-all term for "what is being
> represented".  But, as you say, the same information can be
> represented in *many* different ways, and by many different
> bit-patterns.
>
> And, of course, any set of bits can be interpreted as representing any
> information.  You just need the right "one-time pad" to XOR with the
> bits, and voilà!  The magic is all in the interpretation.  None of it
> is in the bits.  And interpretation requires an interpreter.
>

I agree with this completely.  Information alone, such as bits on a hard
disk, is meaningless without a corresponding program that reads it.  Would
you admit, then, that a computer which interprets bits the same way a brain
does could be conscious?  Isn't this mechanism?  Or is your view more like
the Buddhist idea that there is no thinker, only thought?
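
Here is a rough Python sketch of that XOR point (the variable names and the
sample strings are invented purely for illustration): given any stored bits
and any meaning you would like them to carry, one can always derive a pad
that "decodes" the former into the latter.

# For ANY stored bits and ANY desired "meaning", there exists a one-time pad
# that XORs the former into the latter -- the meaning lives in the
# pad/interpreter, not in the bits themselves.
stored_bits = b"random-looking junk!"         # the bits "on disk"
desired_meaning = b"I see a pink elephant"    # what we want them to mean

# Pad both values to a common length for this toy example.
n = max(len(stored_bits), len(desired_meaning))
stored = stored_bits.ljust(n, b"\x00")
target = desired_meaning.ljust(n, b"\x00")

# The pad is simply the bytewise XOR of the two strings.
pad = bytes(s ^ t for s, t in zip(stored, target))

# "Interpreting" the stored bits with this pad yields the desired meaning.
decoded = bytes(s ^ p for s, p in zip(stored, pad))
assert decoded == target
print(decoded.rstrip(b"\x00").decode())       # -> I see a pink elephant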


>
> So, given that the bits are merely representations, it seems silly to
> me to say that just because you have the bits, you *also* have the
> thing they represent.
>
> Just because you have the bits that represent my conscious experience,
> doesn't mean that you have my conscious experience.  Just because you
> manipulate the bits in a way as to represent "me seeing a pink
> elephant" doesn't mean that you've actually caused me, or any version
> of me, to experience seeing a pink elephant.
>
> All you've really done is had the experience of tweaking some bits and
> then had the experience of thinking to yourself:  "hee hee hee, I just
> caused Rex to see a pink elephant..."
>
> Even if you have used some physical system (like a computer) that can
> be interpreted as executing an algorithm that manipulates bits that
> can be interpreted as representing me reacting to seeing a pink
> elephant ("Boy does he look surprised!"), this interpretation all
> happens within your conscious experience and has nothing to do with my
> conscious experience.
>

Isn't this just idealism?  To me, the main problem with idealism is that it
doesn't explain why the thoughts we are about to experience are predictable
under a framework of physical laws.  If you see a ball go up, you can be
rather confident in your future experience of seeing it come back down.  It
seems there is an underlying system, more fundamental than consciousness,
which drives where consciousness can go.  In one of your earlier e-mails
you described your view as "accidental idealism"; can you elaborate on the
accidental part?


>
> Thinking that the "bit representation" captures my conscious
> experience is like thinking that a photograph captures my soul.
>
> Though, obviously this is as true of biological brains as of
> computers.  But so be it.
>
> This is the line of thought that brought me to the idea that conscious
> experience is fundamental and uncaused.
>
>
>
> > The
> > behavior between these two brains is in all respects identical, since the
> > mechanical neurons react identically to their biological counterparts.
> > However, for some unknown reason, the computer has no inner life or
> > conscious experience.
>
> I agree that if you assume that representation "invokes" conscious
> experience, then the brain and the computer would both have to be
> equally conscious.
>
> But I don't make that assumption.
>
>
Okay.


> So the problem becomes that once you open the door to the "multiple
> realizability" of representations, we can never know anything about
> our substrate.
>

This sounds a lot like Bruno.  I believe there are a near-infinite number of
indiscernible substrates that explain what you are experiencing right now.
In some I am a biological brain; in others I may be playing a sim-human
game, perhaps as a futuristic human being or a technologically advanced
alien, or a super-mind of an omega-point civilization which is exploring
consciousness.  I do not, however, see the fact that I cannot know my
ultimate substrate with certainty as a problem for mechanism or multiple
realizability.  Whatever the substrates may be, I think they are in some
sense equivalent by their informational content.  (Note that when I say the
information is equivalent, I mean it is interpreted/processed equivalently;
the same message fed to two different programs may have an entirely
different meaning.)
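
A toy Python sketch of that last point (the two "programs" and the sample
bytes are invented just for this illustration):

import struct

# The same four bytes, handed to two different interpreters, carry entirely
# different meanings.
message = b"\x42\x48\x00\x00"

def program_a(msg: bytes) -> str:
    """Reads the message as a 32-bit big-endian IEEE-754 float."""
    value = struct.unpack(">f", msg)[0]
    return f"temperature reading: {value} degrees"

def program_b(msg: bytes) -> str:
    """Reads the message as ASCII characters followed by NUL padding."""
    code = msg.rstrip(b"\x00").decode("ascii")
    return f"station code: {code}"

print(program_a(message))   # -> temperature reading: 50.0 degrees
print(program_b(message))   # -> station code: BH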


>
> You *think* that your brain is the cause of your conscious
> experience...but as you say, a computer representation of you would
> think the same thing, but would be wrong.
>
> Given that there are an infinite number of ways that your information
> could be represented, how likely is it that your experience really is
> caused by a biological brain?  Or even by a representation of a
> biological brain?  Why not some alternate algorithm that results in
> the same *conscious* experiences, but with entirely different
> *unconscious* elements?  How could you notice the difference?
>

That is entirely possible and I believe it is probable.  I could not notice
a difference.


>
> > Information can take many physical forms.
>
> Information requires interpretation.  The magic isn't in the bits.
> The magic is in the interpreter.
>
>
I agree, but I would also add that an interpreter needs something to
interpret.  The information is devoid of meaning without an interpreter, so
it is the combination of information and interpretation that creates a
meaningful message.


Jason
