On Friday, October 11, 2013 5:37:52 PM UTC-4, stathisp wrote:
> On Thursday, October 10, 2013 8:58:30 PM UTC-4, stathisp wrote:
>> On 9 October 2013 05:25, Craig Weinberg <whats...@gmail.com> wrote:
>> > A lot of what I am always talking about is in there...computers don't
>> > understand produce because they have no aesthetic sensibility. A
>> > description of a function is not the same thing as participating in an
>> > experience.
>> This is effectively a test for consciousness: if the entity can
>> perform the type of task you postulate requires aesthetic sensibility,
>> it must have aesthetic sensibility.
> Not at all. That's exactly the opposite of what I am saying. The failure
> of digital mechanism to interface with aesthetic presence is not testable
> unless you yourself become a digital mechanism. There can never be a test
> of aesthetic sensibility because testing is by definition anesthetic. To
> test is to measure into a system of universal representation. Measurement
> is the removal of presence for the purpose of distribution as symbol. I can
> draw a picture of a robot correctly identifying a vegetable, but that
> doesn't mean that the drawing of the robot is doing anything. I can make a
> movie of the robot cartoon, or a sculpture, or an animated sculpture that
> has a sensor for iodine or magnesium which can be correlated to a higher
> probability of a particular vegetable, but that doesn't change anything at
> all. There is still no robot except in our experience and our expectations
> of its experience. The robot is not even a zombie, it is a puppet playing
> back recordings of our thoughts in a clever way.
> OK, so it would prove nothing to you if the supermarket computers did a
> better job than the checkout chicks. Why then did you cite this article?
Because the article is consistent with my view that there is a fundamental
difference between quantitative tasks and aesthetic awareness. If there
were no difference, then I would expect the problems that supermarket
computers have to be related not to their unconsciousness but to
unreliability, or even to developing willfulness. Why isn't the story
"Automated cashiers have begun throwing temper tantrums at some locations
which are contagious to certain smart phones that now become upset in
sympathy...we had anticipated this, but not so soon, yadda yadda"? I think
it's pretty clear why. For the same reason that all machines will always
fall short of authentic personality and sensitivity.