On Jul 23, 11:40 am, 1Z <peterdjo...@yahoo.com> wrote:
> On Jul 23, 2:35 am, Craig Weinberg <whatsons...@gmail.com> wrote:
> > Think of them like sock puppet/bots multiplying in a closed social
> > network. If you have 100 actual friends on a social network and their
> > accounts are progressively replaced by emulated accounts posting even
> > slightly unconvincing status updates,
> Why would "slightly unconvincing" fall under "exact functional
Because it's not possible for the emulation to simulate first-person
participation forever from a third-person design. First-person
participants don't even know what they are going to say or do in a
given situation. The sense of what the thing is leaks through sooner
or later.
> IOW: you think the Neuron Replacement Hypothesis doesn't
> disprove your theory because you think your theory is correct.
> See the problem?
If my theory is correct, the Neuron Replacement Hypothesis is a Red
Herring. It's not a problem, it's a solution.
> There is such a thing as machine learning.
Definitely. Inorganic mega-molecules can do amazing things. Enjoying a
steak dinner isn't one of them though.
You received this message because you are subscribed to the Google Groups
"Everything List" group.