On Jul 23, 6:17 pm, Craig Weinberg <whatsons...@gmail.com> wrote:
> On Jul 23, 11:40 am, 1Z <peterdjo...@yahoo.com> wrote:
> > On Jul 23, 2:35 am, Craig Weinberg <whatsons...@gmail.com> wrote:
> > > Think of them like sock puppet/bots multiplying in a closed social
> > > network. If you have 100 actual friends on a social network and their
> > > accounts are progressively replaced by emulated accounts posting even
> > > slightly unconvincing status updates,
> > Why would "slightly unconvincing" fall under "exact functional
> > replacement"?
> Because an emulation built from a third-person design can't simulate
> first-person participation forever.

Says who?

> First-person
> participants don't even know what they are going to say or do in a
> given situation.

Maybe a brain scan would tell them. The *conscious* self is only
a small part.

> The sense of what the thing is leaks through sooner
> or later.
> > IOW: you think the Neurone Replacement Hypothesis doesn't
> > disprove your theory because you think your theory is correct.
> > See the problem?
> If my theory is correct, the Neuron Replacement Hypothesis is a Red
> Herring.

And vice versa.

> It's not a problem, it's a solution.
> > There is such a thing as machine learning.
> Definitely. Inorganic mega-molecules can do amazing things. Enjoying a
> steak dinner isn't one of them, though.

What have qualia to do with learning?
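To make that concrete, here is a toy sketch (Python, every name my own
illustrative choice, not anyone's actual proposal) of a perceptron
learning logical AND. The "learning" is nothing but arithmetic weight
updates; there is no first-person experience anywhere in the loop:

    # Minimal perceptron learning logical AND: pure weight arithmetic.
    import random

    random.seed(0)  # deterministic run for the example

    def step(x):
        return 1 if x >= 0 else 0

    # Training data: inputs and target outputs for AND.
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

    w = [random.uniform(-1, 1), random.uniform(-1, 1)]  # weights
    b = random.uniform(-1, 1)                           # bias
    lr = 0.1                                            # learning rate

    for epoch in range(100):
        for (x1, x2), target in data:
            out = step(w[0] * x1 + w[1] * x2 + b)
            err = target - out
            # The entire "learning": nudge weights toward lower error.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err

    for (x1, x2), target in data:
        print((x1, x2), "->", step(w[0] * x1 + w[1] * x2 + b))

Run it and it prints the correct AND table. Nothing in that process
requires tasting a steak.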
