On Sat, Apr 20, 2013 at 8:14 PM, Telmo Menezes <te...@telmomenezes.com> wrote:
> There is an entire field of physics, for example, dedicated to studying
> emergence in a rigorous fashion
True, and the key word is "rigorous", and that means knowing the details.
> > Cellular automata show how simple local rules can give rise to
Yes, but saying something works by cellular automata wouldn't be of any
help if you didn't know what those simple local rules were, and to figure
them out you need reductionism. In talking about art there are two
buzzwords that, whatever their original meaning, now just mean "it sucks";
the words are "derivative" and "bourgeois". Something similar has happened
in the world of science to the word "reductive"; it now also means "it sucks".
Not long ago I read an article about the Blue Brain Project; it said it was
an example of science getting away from reductionism, and yet under the old
meaning of the word nothing could be more reductive than trying to simulate
a brain down to the level of neurons.
> > when someone invokes utilitarianism
I don't see any difference between "invoking utilitarianism" and just doing
something that works, and I'm pretty sure that's better than doing
something that doesn't work.
> > a concept that can be dangerous, as History has shown us a number of
I can't think of a single case where science was harmed by doing something
that works.
> > The missing part I don't understand bugs me.
It bugs me too; I also want to know everything, but you can't always get
what you want. Hey, somebody ought to make a song about that.
> >>>If consciousness is easier than intelligence
>> >> Evolution certainly found that to be the case.
> >There is not scientific evidence whatsoever of this.
Some of our most powerful emotions like pleasure, pain, and lust come from
the oldest parts of our brain that evolved about 500 million years ago.
About 400 million years ago Evolution figured out how to make the spinal
cord, the medulla and the pons; we have these brain structures just like
fish and amphibians do, and they deal in aggressive behavior, territoriality
and social hierarchies.
The limbic system is about 150 million years old and ours is similar to
that found in other mammals. Some think the limbic system is the source of
awe and exhilaration because it is the active site of many psychotropic
drugs, and there's little doubt that the amygdala, a part of the limbic
system, has much to do with fear. After some animals developed a limbic
system they started to spend much more time taking care of their young, so
it probably has something to do with love too.
It is our grossly enlarged neocortex that makes the human brain so unusual,
and it is so recent: it only started to get large about 3 million years ago
and only started to get ridiculously large less than one million years ago.
It deals in deliberation, spatial perception, speaking, reading, writing
and mathematics; in other words, everything that makes humans so very
different from other animals. The only new emotion we got out of it was
worry, probably because the neocortex is also the place where we plan for
the future.
If nature came up with feeling first and high-level intelligence much, much
later, I don't see why the opposite would be true for our computers. It's
probably a hell of a lot easier to make something that feels but doesn't
think than something that thinks but doesn't feel.
> > People like António Damásio (my compatriot) and other neuroscientists
> confuse a machine's ability to recognise itself with consciousness.
I see no evidence of confusion in that.
> > This makes me wonder if some people are zombies.
Without the axiom that intelligent behavior implies consciousness it would
be entirely reasonable to conclude that you are the only conscious being in
the universe.
> > Computers are what they have always been, Turing machines with finite tapes.
Human brains are what they have always been: a finite number of
interconnected neurons embedded in 3 pounds of grey jello.
> > The tapes are getting bigger, that's all.
Yes, but the grey jello is not getting any bigger and that is exactly why
computers are going to win.
> > Measuring consciousness by intelligent behaviour is mysticism,
Call it any bad name you like, but the fact is that both you and I have
been measuring consciousness by intelligent behavior every minute of every
hour of our waking lives from the moment we were born; but now, if we're
confronted with an intelligent computer, for some unspecified reason you
say we're supposed to suddenly stop doing that. Why?
> >> The only consciousness I have direct experience with is my own and I
>> note that when I'm sleepy my consciousness is reduced and so is my
>> intelligence, when I'm alert the reverse is true.
> > I agree on intelligence, but I don't feel less conscious when I'm sleepy.
If so, and consciousness is an all-or-nothing matter and not on a
continuum, then you should vividly remember the very instant you went to
sleep last night. Do you?
> > I'm a bit sleepy right now.
Wow, what a temptation; with that opening, if I were in a bad mood I'd make
some petty remark like "that explains a lot", but I'm not, so I won't.
John K Clark
You received this message because you are subscribed to the Google Groups
"Everything List" group.