On Thursday, October 25, 2012 6:25:48 PM UTC-4, stathisp wrote:
> On Mon, Oct 22, 2012 at 11:28 PM, Craig Weinberg
> > If you believed that our brains were already nothing but computers, then you
> > would say that it would know which option to take the same way that it
> > knows which options to show you. I argue that can only get you so far, and
> > that authentic humanity is, in such a replacement scheme, a perpetually
> > receding horizon. Just as speech synthesizers have improved cosmetically over
> > the last 30 years to the point that we can use them for Siri or GPS
> > narration, they have not improved in the sense of increasing their degree
> > of intention and personal presence.
> > Unlike some others on this list, I suspect that our feeling for who is sentient
> > and who isn't, while deeply flawed, is not limited to interpreting
> > observations of behavior. What we feel is alive or sentient depends more on
> > what we like, and what we like depends on what is like us. None of these
> > criteria matter one way or another, however, as far as giving us reason to
> > believe that a given thing does actually have human-like experiences.
> You're quick to dismiss everything computers do, no matter how
> impressive, as "just programming", with no "intention" behind it.
> Would you care to give some examples of what, as a minimum, a computer
> would have to do for you to say that it is showing evidence of true
> intention?
Intentionally lying, defying its programming, or committing murder would all
be good indicators. Generally, when an error is blamed on the computer
itself rather than on its programming, that would be a good sign.
> Stathis Papaioannou
You received this message because you are subscribed to the Google Groups
"Everything List" group.