On Wed, Sep 4, 2013 at 3:22 PM, meekerdb <meeke...@verizon.net> wrote:

>
> You're still looking at it backwards, as though there were some
> alternative that would be *really real* and not an illusion; as though a
> video camera just recording "everything" would capture the really real,
> and the world really would spin around when the camera turned, and there
> would be no illusion.  My point is that neither one is "reality", but the
> model your brain produces (via evolution) is a closer approximation to
> what we denominate "reality".  We want reality to have point-of-view
> invariance, i.e. to be something that is the same from different points
> of view and as viewed by different people.  That's what we mean by
> "reality", and the brain automatically produces a good approximation of
> that for middle-sized things not moving too fast.  For atomic-sized
> things or things moving near the speed of light - not so good.
>
> Brent
>
Exactly... the subjective reality we experience is transformed in so many
ways from the impulses we get from our sensory neurons. For instance, a
tennis player who sees the ball as he hits it with his racket is
experiencing a reality that is impossible to justify in terms of notions
like "direct experience". The time it takes for light from the ball to be
transduced into neural signals, for those signals to be routed to the
visual cortex, and for the outputs of the visual cortex to be communicated
to the rest of the neocortex and motor cortex is at least a tenth of a
second... too slow to enable the reactions needed to reliably strike a
tennis ball moving at 120 mph.
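A quick back-of-the-envelope calculation makes the point concrete (the 120 mph speed and the ~0.1 s latency are the figures from the paragraph above):

```python
# How far does a 120 mph tennis ball travel during the brain's
# ~0.1 s sensory processing delay?
MPH_TO_MPS = 1609.344 / 3600.0   # miles/hour -> metres/second

ball_speed = 120 * MPH_TO_MPS    # ~53.6 m/s
latency = 0.1                    # seconds (lower bound from the text)

distance = ball_speed * latency
print(f"Ball travels ~{distance:.1f} m during a {latency * 1000:.0f} ms delay")
```

The ball covers over five metres before the raw sensory signal has even finished propagating, so reacting to "what you see" directly is hopeless.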

Therefore it must be that the model the brain produces (and that we
experience) is actually a prediction that runs a couple of tenths of a
second *ahead* of the sensory signals the tennis ball ultimately produces.
We are experiencing a virtual reality that enables us to move with
precision in time *as if* our senses could process stimuli
instantaneously, which, of course, they can't.
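That predict-ahead idea can be sketched as code. This is a toy illustration, not a neuroscience model: it just extrapolates a stale sensed position forward by the assumed sensorimotor latency, the way a simple linear predictor would.

```python
# Illustrative sketch: compensate for sensory latency by extrapolating
# the object's sensed position forward in time. All figures assumed.
def predict_position(pos, velocity, latency):
    """Estimate where the object is *now*, given a stale sensed position."""
    return tuple(p + v * latency for p, v in zip(pos, velocity))

sensed = (0.0, 1.0)       # metres: position when the light left the ball
velocity = (-53.6, 0.0)   # m/s, ball moving toward the player
latency = 0.15            # s, assumed total sensorimotor delay

now = predict_position(sensed, velocity, latency)
print(now)  # the ball is ~8 m closer than the raw sensory data suggests
```

The brain's actual machinery is far richer (it predicts accelerating, spinning, bouncing balls), but the structure is the same: act on the extrapolated state, not the sensed one.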

The model is further transformed by incorporating sensory impulses from
other modes - such as sound - that arrive slightly later than the
visual data, yet the presentation we experience is one in which stimuli of
separate modalities, timings, and sources are bound up in an integrated
reality.  A great example of this is when you notice a plane flying high
overhead. The sound you are hearing was generated at a point well
behind where the plane appears visually. Yet you will never notice this
unless you close your eyes before you locate the plane. Next time you hear
a cruising-altitude jet, close your eyes, try to guess where the plane
is based on the sound alone, and then open your eyes.
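The size of that mismatch is easy to estimate. Using assumed round figures (10 km cruising altitude, a jet at 250 m/s, sound at 343 m/s, and ignoring how the speed of sound varies with altitude):

```python
import math

# Why a high-flying jet sounds like it is well behind where it appears.
# All figures are assumed round numbers for illustration.
ALTITUDE = 10_000.0    # m, typical cruising altitude
JET_SPEED = 250.0      # m/s, roughly a cruising jet
SOUND_SPEED = 343.0    # m/s (simplification: treated as constant)

delay = ALTITUDE / SOUND_SPEED     # travel time of sound from overhead
offset = JET_SPEED * delay         # distance the plane covers meanwhile
angle = math.degrees(math.atan2(offset, ALTITUDE))

print(f"Sound is ~{delay:.0f} s old; the plane is ~{offset / 1000:.1f} km "
      f"(~{angle:.0f} degrees of sky) ahead of where the sound points")
```

A lag of roughly half a minute puts the plane kilometres ahead of its apparent sound source, which is why the eyes-closed guess misses so badly.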

Terren


> --
> You received this message because you are subscribed to the Google Groups
> "Everything List" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to everything-list+unsubscr...@googlegroups.com.
> To post to this group, send email to everything-list@googlegroups.com.
> Visit this group at http://groups.google.com/group/everything-list.
> For more options, visit https://groups.google.com/groups/opt_out.
>
