Are we any less conscious of it as it happens, or are our brains simply
not forming as many memories of routine, uneventful tasks?
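For what it's worth, the "cached computation" idea being discussed downthread can be sketched as ordinary memoization. This is purely illustrative Python, not a claim about how brains work: once a result is recorded, later calls with the same inputs replay the recording rather than redo the computation.

```python
from functools import wraps

def memoize(f):
    """Cache results so repeated inputs skip the computation entirely."""
    cache = {}

    @wraps(f)
    def wrapper(*args):
        if args not in cache:
            cache[args] = f(*args)  # compute once and record the result
        return cache[args]          # thereafter, replay the recording
    return wrapper

@memoize
def fib(n):
    # Naive recursion, made tractable only because sub-results are cached.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040
```

The point at issue is whether replaying `cache[args]` instead of re-running `f` makes any difference to whatever the computation is supposed to instantiate.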

Jason

On Wed, May 27, 2015 at 9:06 PM, Terren Suydam <[email protected]>
wrote:

> In the driving scenario it is clear that computation is involved, because
> all sorts of contingent things can be going on (e.g. dynamics of driving
> among other cars), yet this occurs without crossing the threshold of
> consciousness. Relying on some kind of caching mechanism under such
> circumstances would quickly fail one way or another.
>
> Terren
> On May 27, 2015 7:38 PM, "Pierz" <[email protected]> wrote:
>
>>
>>
>> On Thursday, May 28, 2015 at 6:06:22 AM UTC+10, Brent wrote:
>>>
>>>  On 5/26/2015 10:31 PM, Pierz wrote:
>>>
>>>>  Where I see lookup tables fail is that they seem to operate above the
>>>> probable necessary substitution level (despite having the same
>>>> inputs/outputs at the higher levels).
>>>>
>>>    But your memoization example still makes a good point - namely that
>>> some computations can be bypassed in favour of recordings, yet presumably
>>> this doesn't lead to fading qualia. We don't need anything as silly as a
>>> gigantic lookup table of all possible responses. We only need to
>>> acknowledge that we can store the results of computations we've already
>>> completed, and that this should not result in any strange degradation of
>>> consciousness.
>>>
>>>
>>> Isn't that what allows me to drive home from work without being
>>> conscious of it?
>>>
>>
>> People keep making this point, and it's one that I myself made in the
>> past - and I believe you argued with me at the time, saying that it's not
>> clear that the mechanism for automating brain functions is anything like
>> caching the results of a computation. I think that objection is actually
>> fair enough. With automated actions it's not clear that the computations
>> stop being carried out; it may just be that they no longer require
>> conscious attention, because the neuronal pathways for those computations
>> have become sufficiently reinforced that they no longer require
>> concentration. I think this model (automated computation rather than
>> cached computation) fits our experience of the phenomenon.
>>
>> Sometimes I suspect we're really talking out of our proverbial arses with
>> these speculations, as we still have so little idea of how the brain
>> works. It may be a computer in the sense that it is Turing emulable, but
>> then we talk as if it were a squishy laptop or something, and that
>> analogy can be misleading in many ways. For example, our memories are
>> nothing like RAM: they are distributed like a hologram, constructive and
>> fuzzy, whereas computer memory is localised, passive and accurate to the
>> bit.
>>
>> I'm probably guilty of the same over-zealous computationalism with my
>> lookup table analogy above, but I was thinking more of an AI and the
>> in-principle point that cached computation results may be employed at a
>> fine-grained level. I would continue to insist that it is meaningless to
>> say that a "brain" that employs cached results of computations is a
>> zombie to the extent that it does so, because it is meaningless to speak
>> of the "when" of qualia. (You never replied to my argument about poking a
>> recorded Einstein with a stick, which I think makes a compelling case for
>> this.) We have to rigorously divide the subjective and the objective.
>>
>>>
>>> Brent
>>>
>>  --
>> You received this message because you are subscribed to the Google Groups
>> "Everything List" group.
>> To unsubscribe from this group and stop receiving emails from it, send an
>> email to [email protected].
>> To post to this group, send email to [email protected].
>> Visit this group at http://groups.google.com/group/everything-list.
>> For more options, visit https://groups.google.com/d/optout.
>>
>
