On May 17, 7:57 am, Quentin Anciaux <allco...@gmail.com> wrote:

> > That's true, but they don't care whether they output or not. It's not
> > driven by their own intention. They won't EVER discover a printer that
> > is sitting right next to them without having drivers loaded and
> > configured to even connect.
> Your unique argument against a program being able to be conscious (as
> conscious as a human can be) is to take a non-conscious program  and to say
> "see it's not conscious"... well yes it is not, that doesn't mean no
> program can be.

Yes, in a sense that's true, but since the only example of something
conscious we have is ourselves, the alternative is to take a non-
conscious program and say "there's no reason that some future version
of it can't be just like me eventually." The former makes more sense
to me. It's not only that logic that makes me suspect the former is
the case, though. There seems to be a specific, glaring lack of
sentience in all machines, one that does not diminish in the slightest
even as machines scale up exponentially in complexity. No byte has
ever done anything by itself, and I don't see why it ever would.

I only bring up the shortcomings of machines and programs because
those are the only common-sense examples I can really use, but that's
just the tip of the iceberg. I am trying to use that common sense as a
lever to open you up to the deeper understanding that I have about how
intention arises from within matter and cannot be transplanted from
the outside, as with a computer. It's not a matter of Luddite
neophobia at all; believe me, I am a transhumanist to the core. I just
think we are not going to get there without water, sugar, protein,
lipids, etc.


You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.