2009/4/29 Quentin Anciaux <[email protected]>:
>>> In Bruno's Washington/Moscow thought experiment that information isn't
>>> in your consciousness, although it's available via third persons. My
>>> view of the experiment is that you would lose a bit of consciousness,
>>> that you can't slice consciousness arbitrarily finely in time.
>>
>> Could the question be settled by actual experiment, i.e. asking the
>> subject if they noticed anything unusual?
>>
>> --
>> Stathis Papaioannou
>
> For this you would need an actual AI, and also for everybody to agree
> that this AI is conscious and not a zombie.
>
> If you can settle that, then an interview should count as proof. But
> I'm not sure you can prove the AI is conscious; by the same argument,
> I'm not sure I could prove to you that I am.
Well, you could just ask the teleported human. If he says he feels
fine and didn't notice anything other than the scenery changing, would
that count for anything? I suppose you could argue that of course he
would say that, since a gap in consciousness is by definition not
noticeable; but then you end up with a variant of the zombie argument:
he says everything feels OK, but in actual fact he experiences
nothing.

--
Stathis Papaioannou

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google
Groups "Everything List" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/everything-list?hl=en
-~----------~----~----~----~------~----~------~--~---

