On Dec 17, 11:24 am, Quentin Anciaux <allco...@gmail.com> wrote:
> 2011/12/17 Craig Weinberg <whatsons...@gmail.com>
> > On Dec 17, 7:30 am, Quentin Anciaux <allco...@gmail.com> wrote:
> > > N- you can build a machine that implements and can only run 3, but that
> > > can't handle counterfactuals; yet as the computation is the same as 3, it
> > > must be as conscious as when it was running on a complete physical
> > > computer.
> > > N+1- you can restore the handling of counterfactuals by adding inactive
> > > pieces. But if N was not conscious, adding inactive pieces shouldn't
> > > render it conscious.
> > Conscious of what? It sounds like this assumes that consciousness is a
> > binary feature.
> You didn't read, that's not the argument.
> It begins by *assuming we have a conscious program*. The argument is not
> about what consciousness is; it's about assuming consciousness to be
> computational, assuming the physical supervenience thesis is true, and showing a
No, I read it; I just think the argument is broken from the start if
you don't care what consciousness is in the first place. What kind of
consciousness are you assuming a program can have? Feeling? A sense of
smell? How does it manage that without a physical machine?
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to email@example.com.