Just to get back to the topic for a moment: do I read correctly that the
difference between Russell and Bruno lies in whether a "raw" computation
(in Bruno's usage, a computation in terms of UD*) is "actualized"? As
opposed to what, precisely? A kind of "emanating non-raw" computation?
This is a bit fuzzy from my point of view.

Also, Russell's point of view would be appreciated, as are his input and
efforts in general, in this thread, and in his paper that presents the MGA
clearly and accessibly, which hopefully goes without saying. Respect and
compliments, regardless of these quibbles. If you have no time, Russell,
no worries; it's just a minor wish to continue the exchange.

Cue the military march music of stoicism, with hyper-grave quality: we can
never get to the bottom of these things... nor would we wish to! Who says
it's "at the bottom" anyway? Maybe it's "upstairs". My fridge certainly,
truly, UD* or some alternate weirdness approved, just *is* upstairs. The
fridge is true. And so is a slice of cheese. And yes, I kid. :-) PGC


On Wed, Aug 20, 2014 at 2:27 AM, meekerdb <[email protected]> wrote:

>  On 8/19/2014 4:44 PM, LizR wrote:
>
>  On 20 August 2014 04:16, meekerdb <[email protected]> wrote:
>
>>
>>  If your altered state of consciousness has no self-awareness, is it
>> still "consciousness"?   And there's self-consciousness, i.e. being aware
>> you are thinking.  So it's not 'fading' qualia, it's different categories of
>> consciousness.  I'd say my dog has self-awareness, e.g. he knows his name.
>> But I'm not so sure he is self-conscious. The koi in my pond are aware, but
>> I doubt they are self-aware.
>>
>
>  So this is all that Brent means by degrees of consciousness,
>
>
> No, there might also be other differences, although Bruno would no doubt
> disagree.  Consider my example of the autonomous Mars Rover: suppose that
> instead of summarizing things into a narrative memory and then
> reconstructing remembered events, as people seem to, it simply recorded
> everything and played back segments when it remembered. ISTM this might
> make a difference in kind.  Or suppose there were many Mars Rovers at
> different locations, all controlled by the same program.  They would
> have a different kind of sense of location.
>
>
>   after all the waving it in everyone's faces as though it had huge
> explanatory power, it's now been reduced to whether or not an organism
> knows certain things about itself. This looks to me like a lot of
> backpedalling on what started out as a rather grandiose concept,
>
>
> Why would different kinds of consciousness be a grandiose concept?
>
>
>   but which has now ended up as something fairly trivial. So koi carp
> don't have a concept of self (who would have thought it? What fools you've
> been in the all-or-nothing camp not to realise that!)
>
>  But it now appears that Brent just stepped out of the "consciousness
> continuum" camp (from thermostats, according to Dan Dennet, to us) and into
> the "either-conscious-or-not" camp. Is someone going to welcome him?
>
>
> Bruno thinks there's a good chance the algae in my pond is conscious, so
> he probably agrees that my thermostat might be too.
>
> Brent
>
