On Wed, May 20, 2015 at 4:20 AM, Stathis Papaioannou <[email protected]> wrote:
> On Tuesday, 19 May 2015, Jason Resch <[email protected]> wrote:
>> On Tue, May 19, 2015 at 12:06 AM, Stathis Papaioannou <[email protected]> wrote:
>>>>> Not necessarily, just as an actor may not be conscious in the same way
>>>>> as me. But I suspect the Blockhead would be conscious; the intuition
>>>>> that a lookup table can't be conscious is like the intuition that an
>>>>> electric circuit can't be conscious.
>>>>
>>>> I don't see an equivalency between those intuitions. A lookup table has
>>>> a bounded and very low degree of computational complexity: all answers
>>>> to all queries are answered in constant time.
>>>>
>>>> While the table itself may have an arbitrarily high information
>>>> content, what in the software of the lookup table program is there to
>>>> appreciate/understand/know that information?
>>>
>>> Understanding emerges from the fact that the lookup table is immensely
>>> large. It could be wrong, but I don't think it is obviously less
>>> plausible than understanding emerging from a Turing machine made of
>>> tin cans.
>>
>> The lookup table is intelligent, or at least offers the appearance of
>> intelligence, but it takes the maximum possible advantage of the
>> space-time trade-off: http://en.wikipedia.org/wiki/Space–time_tradeoff
>>
>> The tin-can Turing machine is unbounded in its potential computational
>> complexity; there's no reason to be a bio- or silico-chauvinist against
>> it. However, by definition, a lookup table has near-zero computational
>> complexity and no retained state. Does an ant trained to perform the
>> lookup table's operation become more aware when placed in a vast library
>> than when placed on a small bookshelf, to perform the identical function?
>
> The ant is more aware than a neuron, but it is not the ant's awareness
> that is at issue; it is the system of which the ant is a part.
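To make the trade-off concrete, here is a minimal sketch (illustrative Python, not from the thread; the popcount example and all names are my own choice): one function computes each answer when asked, while a precomputed table stores every answer for the entire input space in advance and merely retrieves it.

```python
# Space-time trade-off illustrated with 8-bit popcount.
# compute_popcount does work proportional to the number of set bits
# per query; the 256-entry table answers the same queries in O(1),
# at the cost of enumerating the whole input space up front.

def compute_popcount(x: int) -> int:
    """Count set bits by repeatedly clearing the lowest one."""
    n = 0
    while x:
        x &= x - 1  # clear the lowest set bit
        n += 1
    return n

# Maximal space, minimal per-query time: every answer stored in advance.
POPCOUNT_TABLE = [compute_popcount(i) for i in range(256)]

def lookup_popcount(x: int) -> int:
    """Constant-time answer; pure retrieval, no computation."""
    return POPCOUNT_TABLE[x]

# The two are extensionally identical over the whole input space,
# yet one computes and the other only looks up.
assert all(lookup_popcount(i) == compute_popcount(i) for i in range(256))
```

The point of the example is that the two functions are indistinguishable from their inputs and outputs alone; the difference lies entirely in the internal process, which is exactly what is at issue with the Blockhead.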
> Step back and consider why we speculate that computationalism may be
> true. It is not because computers are complex like brains, or because
> brains carry out computations like computers. It is because animals with
> brains display intelligent behaviour, and computers also display
> intelligent behaviour, or at least might in the future. If Blockheads
> roamed the Earth answering all our questions, then surely we would debate
> whether they were conscious like us, whether they have feelings, and
> whether they should be accorded human rights.

I would not torture a Blockhead, nor refuse to serve one in my restaurant,
but I might caution my daughter before marrying one that it might be a
zombie. I know I sound like Craig in saying this, but I see a difference in
kind between the programs, even if they are equivalent in their inputs and
outputs at some high layer. There is, for instance, no "society of mind" or
"modularity of mind" of the sort Minsky and Fodor spoke of. Here there is
only a top-level definition of high-level inputs and outputs.

Jason

--
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an
email to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.

