On 29 May 2012, at 10:07, Elliot Temple wrote:
This is needed to get a salary and funding, making it possible to pursue
the research. It is also needed to increase the probability that
someone refutes your work, so that you can learn something.
On May 28, 2012, at 10:20 PM, Colin Geoffrey Hales wrote:
> I am not saying artificial general intelligence is impossible or
> even hard. I am simply suggesting that maybe the route toward it is
> through (shock horror) using the physics of cognition (brain
> material). Somebody out there..... please? Can there please be
> someone out there who sees this half century of computer science
> weirdness in 100,000 years of sanity? Please? Anyone?
I'd suggest the wonderful book whose title asks the question:
"What Do You Care What Other People Think?"
As explained on the everything-list, Colin makes a confusion of levels,
and also ignores that computationalism entails that locally observable
physics is not Turing emulable. This is confirmed by physics: you
cannot locally simulate a quantum random bit in a third-person way.
If you simulate the observer and duplicate it iteratively, you will
only simulate the random bit in a first-person way.
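The point above can be illustrated with a minimal sketch (my illustration, not from the original post, and the function name is hypothetical): duplicating a simulated observer at every step yields a tree of copies that is fully deterministic in the third-person description, yet each individual copy's recorded history is one particular bit string it could not have predicted, which is the first-person sense in which the random bit is "simulated".

```python
# Sketch of the iterated self-duplication thought experiment.
# Third person: the complete tree of copies is deterministic and enumerable.
# First person: any single branch's history reads as an unpredictable bit string.

def duplicate_iteratively(steps):
    """Return all observer histories after `steps` rounds of duplication
    (the third-person view of the whole tree)."""
    histories = ['']  # one observer, with an empty memory, before any duplication
    for _ in range(steps):
        # Every observer is duplicated; one copy records '0', the other '1'.
        histories = [h + bit for h in histories for bit in '01']
    return histories

tree = duplicate_iteratively(4)
# Third-person view: 2**4 = 16 copies, generated deterministically.
assert len(tree) == 16
assert tree[0] == '0000' and tree[-1] == '1111'
# First-person view: one branch, say '0110', is just one history among 16;
# before each duplication, no copy could predict which bit it would record.
```

The deterministic generation of the full tree, contrasted with the unpredictability of any single branch's history, is the distinction between the third-person and first-person simulation of the random bit.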
Colin, have you heard about the Human Brain Project?
You received this message because you are subscribed to the Google Groups
"Everything List" group.