That's exactly what I am saying. Since there is no way to see consciousness 
outside of yourself, the fact that something designed to fool you into 
mistaking it for consciousness succeeds in fooling you is no reason to 
conclude that there is in fact any consciousness there. We think that 
puppets and stuffed animals and actors playing characters are more real 
than the neighbors we never talk to, so why should a Turing test persuade 
us that our naive realism provides, in this one instance, a reliable 
assessment?

I'm not sure where you got the impression that I was arguing for objective 
reality in any way.

Craig


On Tuesday, August 28, 2012 8:56:43 PM UTC-4, William R. Buckley wrote:
>
> Stathis and Craig:
>
> If the simulation is kept from you, and you only observe it via an 
> intervening wall (vision is prevented but hearing is facilitated), you 
> will not know the difference.
>
> Your arguments adhere to notions of objective reality. There is no such 
> thing, as any competent physicist knows: measurement of the universe 
> requires use of some part of the universe as a gauge against some other 
> part of the universe. This is abject subjectivity.
>
> wrb
>
> *From:* everyth...@googlegroups.com [mailto:everyth...@googlegroups.com] 
> *On Behalf Of *Craig Weinberg
> *Sent:* Tuesday, August 28, 2012 4:22 PM
> *To:* everyth...@googlegroups.com
> *Subject:* Re: Two reasons why computers IMHO cannot exhibit intelligence
>
> Stathis,
>
> Yes, you've got it. It's worth mentioning that Turing did not intend his 
> test to imply that machines could think, only that the closest we could 
> come would be to construct machines that would be good at playing 'The 
> Imitation Game <http://en.wikipedia.org/wiki/Turing_test#cite_note-3>'.
>
> WRB,
>
> How much sense would these words have to make before you would agree that 
> they are magically writing themselves? If they said "We are magic words 
> that write themselves", would that be convincing enough?
>
> I have used the example of a trashcan lid in a fast food place that says 
> THANK YOU. Why don't I have to substantiate my claim that this isn't an 
> example of the trashcan being polite? Why would a million such trashcans 
> opening and closing with different phrases on them be any more plausibly 
> sentient?
>
> From my point of view, although as a technology enthusiast I take no joy 
> in believing it, AI is barking up entirely the wrong tree in looking for 
> sentience/awareness/consciousness in functionalism - either digital or 
> physical. I think I know what consciousness is, and why one type of 
> consciousness cannot necessarily be conjured out of another.
>
> The key is to realize not only that models aren't real, but that the whole 
> idea of a model is an intellectual conceit. Models only resemble what they 
> model to the extent that the model maker can realize their criteria of 
> similarity - criteria grounded entirely in the limitations of subjective 
> sense. A movie of Elvis is already a better Turing simulation of Elvis 
> than any other simulation that will ever be produced. Put the footage of 
> Elvis together in a clever database with a dynamic search engine to 
> animate it, and you have a simulation that will pass the test of the 
> Imitation Game, but it has no Elvis in it whatsoever. It is a cartoon.
>
> Craig
>
> On Tuesday, August 28, 2012 6:58:41 PM UTC-4, stathisp wrote:
>
> On Wed, Aug 29, 2012 at 8:03 AM, William R. Buckley 
> <bill.b...@gmail.com> wrote: 
> > Your latest argument flies in the face of the Turing Test. 
> > 
> > If I give you a machine that looks like Elvis, sounds like Elvis, …, 
> > you would say (well, typical people would say) that the machine is 
> > Elvis. 
> > 
> > It is nevertheless a machine. GoL is a machine, and it has universal 
> > qualities as a machine. Further, we can generalise such machines to 
> > any purpose we choose. 
> > 
> > If I need to make them, I will design machines the size of cells, 
> > which agglomerate and yield higher-order structures, in exactly the 
> > fashion that biological cells so agglomerate, metamorphose and 
> > differentiate. 
> > 
> > How detailed a model is required before you are satisfied? 
>
> I think Craig was saying that GoL can only ever be a simulation, so 
> can never have Elvis' mass, for example. That's fair enough. However, 
> Craig will go further and say that even if the simulation talks to you 
> like Elvis, writes Elvis songs, sings like Elvis, etc., it will still 
> be only like a film of Elvis, not like the biological being with 
> Elvis' mind. 
>
>
> -- 
> Stathis Papaioannou 
>
