I'll bet it can eat people too. Just so it can't reproduce (except it can).
Brent
Well we are one step closer, machines that can metabolize and feed themselves:
http://www.robotictechnologyinc.com/index.php/EATR
On 27 Sep 2012, at 00:02, Jason Resch wrote:
On Wed, Sep 26, 2012 at 3:33 AM, Bruno Marchal marc...@ulb.ac.be
wrote:
More on this when I have more time. Someday I will give you the
enunciation of Solovay theorem, which is the key here.
Thank you. I look forward to this.
OK. Nice.
On 26 Sep 2012, at 19:30, Craig Weinberg wrote:
On Wednesday, September 26, 2012 3:47:26 AM UTC-4, Bruno Marchal
wrote:
On 25 Sep 2012, at 19:06, Craig Weinberg wrote:
On Tuesday, September 25, 2012 3:02:05 AM UTC-4, Bruno Marchal wrote:
On 24 Sep 2012, at 18:16, Craig Weinberg wrote:
On 26 Sep 2012, at 19:37, Craig Weinberg wrote:
On Wednesday, September 26, 2012 3:45:09 AM UTC-4, Bruno Marchal
wrote:
On 25 Sep 2012, at 19:03, Craig Weinberg wrote:
On Tuesday, September 25, 2012 4:43:29 AM UTC-4, Bruno Marchal wrote:
On 25 Sep 2012, at 05:45, Stathis Papaioannou
http://www.umcs.maine.edu/~chaitin/berlin.html
Metabiology: Life as Evolving Software
G J Chaitin, Federal University of Rio de Janeiro
Berlin, 13 September 2012
What is Metabiology?
A field parallel to biology, dealing with the random evolution of artificial
software (computer
On 27 Sep 2012, at 04:24, Stathis Papaioannou wrote:
On Tue, Sep 25, 2012 at 3:34 PM, Jason Resch jasonre...@gmail.com
wrote:
If it has no causal efficacy, what causes someone to talk about the
pain
they are experiencing? Is it all coincidental?
There is a sequence of physical events
On 26 Sep 2012, at 19:29, meekerdb wrote:
On 9/25/2012 9:51 PM, Jason Resch wrote:
On Sep 25, 2012, at 11:05 PM, meekerdb meeke...@verizon.net wrote:
snip
So you mean if some mathematical object implies a contradiction it
doesn't exist, e.g. the largest prime number. But then of
Hi Bruno Marchal
I was thinking of a computer as a monad,
but whether it can think or not would
have to be an assumption (that it contains
an intellect). I forgot that inanimate matter
does not have an intellect. So I have to retract
that statement. Sorry.
This may be another mistake or be
On 9/27/2012 2:06 AM, meekerdb wrote:
I'll bet it can eat people too. Just so it can't reproduce (except it
can).
Brent
Well we are one step closer, machines that can metabolize and feed
themselves:
http://www.robotictechnologyinc.com/index.php/EATR
Oh my!
The 4D/RCS is a framework in
http://www.umcs.maine.edu/~chaitin/apa.html
APA Newsletter on Philosophy and Computers, Vol. 9, No. 1 (Fall 2009), pp. 7-10
Leibniz, Complexity and Incompleteness
Gregory Chaitin
Roger Clough, rclo...@verizon.net
9/27/2012
Forever is a long time, especially near the end. -Woody Allen
On Thu, Sep 27, 2012 at 1:29 PM, Jason Resch jasonre...@gmail.com wrote:
But can you separate the consciousness from that sequence of physical events
or not? There are multiple levels involved here and you may be missing the
forest for the trees by focusing only on the atoms. Saying the
On 9/27/2012 4:17 AM, Bruno Marchal wrote:
So does comp provide any hints as to which aspects of our local
universe should be universal and which are geographical?
Yes, as the logic of probability one for observation (given by S4Grz1,
and/or the X and Z logics) already provides an
On Thu, Sep 27, 2012 at 6:06 PM, Bruno Marchal marc...@ulb.ac.be wrote:
You can approximate consciousness by belief in self-consistency. This
already has a causal efficacy, notably a relative self-speeding ability (by
Gödel's length-of-proof theorem). But belief in self-consistency is pure
3p,
On Thursday, September 27, 2012 1:01:12 AM UTC-4, Jason wrote:
On Wed, Sep 26, 2012 at 11:09 PM, Stephen P. King
step...@charter.net
wrote:
On 9/26/2012 11:29 PM, Jason Resch wrote:
On Wed, Sep 26, 2012 at 9:24 PM, Stathis Papaioannou
stat...@gmail.com
On Thursday, September 27, 2012 9:09:12 AM UTC-4, stathisp wrote:
On Thu, Sep 27, 2012 at 6:06 PM, Bruno Marchal
mar...@ulb.ac.be
wrote:
You can approximate consciousness by belief in self-consistency. This
already has a causal efficacy, notably a relative self-speeding
On Wed, Sep 26, 2012 at 1:04 PM, Craig Weinberg whatsons...@gmail.com wrote:
I meant more 'your answer to God' - the universal principle of automatic
functionality which allows you to believe that no being or creation need
exist.
Religious people think God is important and I think information
On Thu, Sep 27, 2012 at 7:49 AM, Stathis Papaioannou stath...@gmail.com wrote:
On Thu, Sep 27, 2012 at 1:29 PM, Jason Resch jasonre...@gmail.com wrote:
But can you separate the consciousness from that sequence of physical
events
or not? There are multiple levels involved here and you may
On 9/27/2012 4:37 AM, Bruno Marchal wrote:
On 26 Sep 2012, at 19:37, Craig Weinberg wrote:
in which case, how are they really arithmetic?
They are not. Arithmetical truth is already not arithmetical.
Arithmetic seen from inside is *vastly* bigger than arithmetic. This
needs a bit of model
Say that you have been captured by the [totalitarian fiend of your choice],
and are tied up in a basement somewhere. The torture has begun, and it has
become clear that it will continue to get worse until you 'become one of
them'.
Fortunately you have been supplied by your team with a
On Thursday, September 27, 2012 12:32:38 AM UTC-4, Brent wrote:
On 9/26/2012 9:27 PM, Stephen P. King wrote:
On 9/27/2012 12:19 AM, Stathis Papaioannou wrote:
On Thu, Sep 27, 2012 at 2:01 PM, Craig Weinberg
whats...@gmail.com wrote:
The problem is the assumption that they
On Thursday, September 27, 2012 4:24:37 AM UTC-4, Bruno Marchal wrote:
On 26 Sep 2012, at 19:30, Craig Weinberg wrote:
On Wednesday, September 26, 2012 3:47:26 AM UTC-4, Bruno Marchal
wrote:
On 25 Sep 2012, at 19:06, Craig Weinberg wrote:
On Tuesday, September 25,
On 9/27/2012 10:22 AM, Jason Resch wrote:
This is to equate reasoning to automatically following an
algorithm. This implies perfect predictability at some level and
thus the absence of any 1p only aspects. Additionally, the recipe
is some thing that needs explanation. How was it
On 9/27/2012 10:22 AM, Jason Resch wrote:
I think the only difference in what you are saying and what I am
saying, is I say look the zombies can do these things (by their
definition), so they must be conscious and there is the inconsistency,
whereas you say zombies cannot do these things since
On 27 Sep 2012, at 13:24, Roger Clough wrote:
Hi Bruno Marchal
I was thinking of a computer as a monad,
but whether it can think or not would
have to be an assumption (that it contains
an intellect).
I don't think you have to assume this, unless you propose some magical
theory of an
On 9/27/2012 1:19 AM, Bruno Marchal wrote:
On 26 Sep 2012, at 19:29, meekerdb wrote:
On 9/25/2012 9:51 PM, Jason Resch wrote:
On Sep 25, 2012, at 11:05 PM, meekerdb meeke...@verizon.net wrote:
snip
So you mean if some mathematical object implies a contradiction it doesn't exist,
On 27 Sep 2012, at 15:08, Stathis Papaioannou wrote:
On Thu, Sep 27, 2012 at 6:06 PM, Bruno Marchal marc...@ulb.ac.be
wrote:
You can approximate consciousness by belief in self-consistency.
This already has a causal efficacy, notably a relative self-speeding
ability (by
Gödel's length of
On 9/27/2012 5:49 AM, Stathis Papaioannou wrote:
Albeit at a low resolution, scientists have already extracted from brain
scans what people are seeing:
http://www.newscientist.com/article/dn16267-mindreading-software-could-record-your-dreams.html
We still can't observe the experience.
On 9/27/2012 6:57 AM, John Clark wrote:
If you made a computer out of something that could be conscious, you
would not be
able to control it
True, the more conscious it became the harder it would be to control, and with computers
doubling in power every 18 months we won't be able
On 9/27/2012 9:52 AM, Bruno Marchal wrote:
I object to the idea that consciousness will cause a brain or other
machine to behave in a way not predictable by purely physical laws.
But this cannot be entirely correct. Consciousness will make your brain, at the level
below the substitution
Quoting Stephen P. King stephe...@charter.net:
On 9/25/2012 11:46 AM, smi...@zonnet.nl wrote:
Hi Roger,
My idea about this is that the Moon and that we landed on it exists
in parallel with the Moon not existing or existing but we not
landing on it, or we already having a base on the Moon
On 9/27/2012 8:06 AM, Craig Weinberg wrote:
On Thursday, September 27, 2012 12:32:38 AM UTC-4, Brent wrote:
On 9/26/2012 9:27 PM, Stephen P. King wrote:
On 9/27/2012 12:19 AM, Stathis Papaioannou wrote:
On Thu, Sep 27, 2012 at 2:01 PM, Craig Weinberg whats...@gmail.com
On 9/27/2012 8:27 AM, Stephen P. King wrote:
Note that I think we agree (some forms of reasoning probably require consciousness),
which only provides another reason to doubt the consistency of the definition of
zombies. I don't think reasoning is normally assumed to require consciousness,
Very appropriate for this list. http://www.youtube.com/watch?v=SbUHxC4wiWk
Great brief lecture with a more enlightened interpretation of neuroscience
than usual. His views are very much in alignment with my own. Sort of the
antidote for Daniel
On Thursday, September 27, 2012 3:02:52 PM UTC-4, Brent wrote:
On 9/27/2012 8:06 AM, Craig Weinberg wrote:
On Thursday, September 27, 2012 12:32:38 AM UTC-4, Brent wrote:
On 9/26/2012 9:27 PM, Stephen P. King wrote:
On 9/27/2012 12:19 AM, Stathis Papaioannou wrote:
On Thu, Sep
On Thursday, September 27, 2012 9:57:11 AM UTC-4, John Clark wrote:
On Wed, Sep 26, 2012 at 1:04 PM, Craig Weinberg
whats...@gmail.com
wrote:
I meant more 'your answer to God' - the universal principle of automatic
functionality which allows you to believe that no being or
On Thu, Sep 27, 2012 at 11:30 PM, Craig Weinberg whatsons...@gmail.com wrote:
I object to the idea that consciousness will cause a brain or other
machine to behave in a way not predictable by purely physical laws.
Some people, like Craig Weinberg, seem to believe that this is
possible but it
On Fri, Sep 28, 2012 at 8:53 AM, Craig Weinberg whatsons...@gmail.com wrote:
But you don't need a living cell to transmit a signal. That is my point. Why
have a cell?
There are cells because that's the way organisms evolved. If there
were a way of evolving computer hardware and this was
On Fri, Sep 28, 2012 at 12:55 AM, Craig Weinberg whatsons...@gmail.com wrote:
Say that you have been captured by the [totalitarian fiend of your choice],
and are tied up in a basement somewhere. The torture has begun, and it has
become clear that it will continue to get worse until you 'become
On Thursday, September 27, 2012 8:40:14 PM UTC-4, stathisp wrote:
On Fri, Sep 28, 2012 at 12:55 AM, Craig Weinberg
whats...@gmail.com
wrote:
Say that you have been captured by the [totalitarian fiend of your
choice],
and are tied up in a basement somewhere. The torture
On Thursday, September 27, 2012 8:10:37 PM UTC-4, stathisp wrote:
On Fri, Sep 28, 2012 at 8:53 AM, Craig Weinberg
whats...@gmail.com
wrote:
But you don't need a living cell to transmit a signal. That is my point.
Why
have a cell?
There are cells because that's the
On Thursday, September 27, 2012 7:45:07 PM UTC-4, stathisp wrote:
On Thu, Sep 27, 2012 at 11:30 PM, Craig Weinberg
whats...@gmail.com
wrote:
I object to the idea that consciousness will cause a brain or other
machine to behave in a way not predictable by purely physical
On 9/27/2012 7:40 PM, Craig Weinberg wrote:
The perfect actor might believe it or he might just be acting. Acting
is top-down replacement, not bottom-up replacement. Bottom-up
replacement would involve replacing a part of your brain so that you
didn't notice any difference and
On 9/27/2012 7:46 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 8:10:37 PM UTC-4, stathisp wrote:
On Fri, Sep 28, 2012 at 8:53 AM, Craig Weinberg whats...@gmail.com
wrote:
But you don't need a living cell to transmit a signal. That is my point.
Why
On Fri, Sep 28, 2012 at 12:40 PM, Craig Weinberg whatsons...@gmail.com wrote:
Replacing body parts that break down with artificial ones is
well-established in the medical industry, and will become increasingly
so in future as the devices become more sophisticated.
Are you saying that you
On Thursday, September 27, 2012 11:05:20 PM UTC-4, Brent wrote:
On 9/27/2012 7:40 PM, Craig Weinberg wrote:
The perfect actor might believe it or he might just be acting. Acting
is top-down replacement, not bottom-up replacement. Bottom-up
replacement would involve replacing a part of
On Fri, Sep 28, 2012 at 12:52 PM, Craig Weinberg whatsons...@gmail.com wrote:
If physics cannot predict even in theory when the neurons will fire
then *by definition* the neurons behave contrary to physics.
If the neurons fire based on the participation of a personal identity in
response to
On Thursday, September 27, 2012 11:16:12 PM UTC-4, stathisp wrote:
On Fri, Sep 28, 2012 at 12:40 PM, Craig Weinberg
whats...@gmail.com
wrote:
Replacing body parts that break down with artificial ones is
well-established in the medical industry, and will become
On 9/27/2012 10:40 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 8:40:14 PM UTC-4, stathisp wrote:
On Fri, Sep 28, 2012 at 12:55 AM, Craig Weinberg
whats...@gmail.com wrote:
Say that you have been captured by the [totalitarian fiend of
your choice],
On 9/27/2012 10:52 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 7:45:07 PM UTC-4, stathisp wrote:
On Thu, Sep 27, 2012 at 11:30 PM, Craig Weinberg
whats...@gmail.com wrote:
I object to the idea that consciousness will cause a brain or
other
On Thursday, September 27, 2012 11:07:12 PM UTC-4, Brent wrote:
On 9/27/2012 7:46 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 8:10:37 PM UTC-4, stathisp wrote:
On Fri, Sep 28, 2012 at 8:53 AM, Craig Weinberg whats...@gmail.com
wrote:
But you don't need a living cell
When the story first broke, there were a number of articles claiming it
would be deployed in war and feed off of the dead. This caused the
creators to come out and state that EATR is vegetarian:
http://news.cnet.com/8301-17852_3-10289514-71.html
Jason
On Thu, Sep 27, 2012 at 1:06 AM, meekerdb
On Thursday, September 27, 2012 11:29:12 PM UTC-4, stathisp wrote:
On Fri, Sep 28, 2012 at 12:52 PM, Craig Weinberg
whats...@gmail.com
wrote:
If physics cannot predict even in theory when the neurons will fire
then *by definition* the neurons behave contrary to physics.
On 9/27/2012 8:28 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 11:05:20 PM UTC-4, Brent wrote:
On 9/27/2012 7:40 PM, Craig Weinberg wrote:
The perfect actor might believe it or he might just be acting. Acting
is top-down replacement, not bottom-up
On Thursday, September 27, 2012 11:30:40 PM UTC-4, Stephen Paul King wrote:
On 9/27/2012 10:40 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 8:40:14 PM UTC-4, stathisp wrote:
On Fri, Sep 28, 2012 at 12:55 AM, Craig Weinberg whats...@gmail.com
wrote:
Say that you have
On Thursday, September 27, 2012 11:56:58 PM UTC-4, Brent wrote:
On 9/27/2012 8:28 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 11:05:20 PM UTC-4, Brent wrote:
On 9/27/2012 7:40 PM, Craig Weinberg wrote:
The perfect actor might believe it or he might just be acting.
On 9/27/2012 9:01 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 11:56:58 PM UTC-4, Brent wrote:
On 9/27/2012 8:28 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 11:05:20 PM UTC-4, Brent wrote:
On 9/27/2012 7:40 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 11:56:58 PM UTC-4, Brent wrote:
Then I would say it's not distinct from 'being'. It is no longer a
choice, "I'm going to act", motivated by some particular situation.
Brent
Think of it as an 'auto-pilot' functionality. Instead of getting a brain
On Friday, September 28, 2012 12:03:09 AM UTC-4, Brent wrote:
On 9/27/2012 9:01 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 11:56:58 PM UTC-4, Brent wrote:
On 9/27/2012 8:28 PM, Craig Weinberg wrote:
On Thursday, September 27, 2012 11:05:20 PM UTC-4, Brent wrote: