On Sun, Jan 1, 2012 Craig Weinberg <whatsons...@gmail.com> wrote:
> But if I dream of something very valuable, I don't get to keep that
> valuable in real life.
If you dream about a very valuable idea you do get to keep it, you just
can't keep nouns. After years of working on the problem, the chemist August
Kekulé made by far the most important discovery of his life, the benzene
ring, in a dream. And he kept that very valuable thing when he woke up.
>> Einstein didn't learn physics by himself, he needed books and teachers.
> > Not really. His theories are mainly based on profoundly elaborated
> common sense.
You are being silly. Before Einstein could push the frontiers of knowledge
he had to understand what had already been discovered, and nobody, not even
Einstein, could pick up physics and advanced mathematics by osmosis; like
everybody else, he had to study. And although some teachers thought he
failed to do as well as he could, he was not a bad student, at least not in
science and math.
> He wasn't programmed by Newton to see the universe in a Newtonian way.
Newton probably had the most powerful mind that any human has ever had, and
one of his most famous quotations is "If I have seen further it is only by
standing on the shoulders of giants".
> > the mistaking the shadow of clever computation for the ineffable and
> > forever non-simulatable experience of awareness.
If that were true the multibillion dollar video game industry would not
exist and NASA would not use computer simulations to train its astronauts.
> >>A synapse just translates one meaningless set of data into another
>> A synapse has no understanding of the significance of the process.
> >I take it then that you believe in metaphysical agency? Since we
> understand the process, and there's nobody here but us neurons, I conclude
> that neurons actually do collectively understand.
All the neurons in my head collectively understand things but a single one
does not, and one of the things we neurons collectively know is that you
behave as if you understand things too, but we neurons don't know if you
"really" understand anything at all.
> 460 nanometers is just a wavelength, it has no color at all.
Exactly, so why did you disagree when I said "the sound of broken glass is
not broken glass, the look of broken glass is not broken glass, the feel of
broken glass is not broken glass."?
> > Whatever is in our head, it isn't blue.
> > Blue has no parts.
If you really believe that (and I hope you don't) then you have surrendered
to the forces of religious irrationalism. I'm not ready to throw in the
towel and think we can learn more, but you have to try and you're not
trying if you believe that.
> Bits aren't real. [...] information isn't real
Then "real" isn't all it was cracked up to be because all the many things
that you say are not real seem to be doing just fine. It seems that very
little is "real" in your cosmology, well I admit that does simplify things,
you don't have to explain anything because there is nothing "real" to
> There is no such thing as virtual particles - all subatomic particles are
You can do better than that, remarks like that just sound foolish. The
Casimir Effect cannot be explained without virtual particles and a
baseball can't be explained without actual particles; the two types of
particles are RADICALLY different, it's hard to see how they could be more
different. To say they are the same is NOT the path to enlightenment.
> >quantum mechanics is hopelessly lost and pulling machineus ex deitina out
> of thin air to try to salvage it's inside out cosmology.
Lost? Quantum Mechanics is the most successful theory in the history of
science.
>I'm not talking about atoms, I'm talking about inanimate objects
I always thought atoms were inanimate objects.
"You're trying to sneak your 'organizations' back into this."
Sneak? Obviously if nothing is organized in a system you won't have
intelligence or consciousness or much of anything of interest.
> I have no reason to assume that a 'computer' has any interiority at all.
Do you have any reason to assume I have any interiority at all? How about
when I'm not arguing on the Internet but sleeping, or dead; has my
interiority vanished?
> I think it's either sophistry or wishful thinking to entertain the
> possibility of machine awareness.
And I think it's sophistry or wishful thinking to pretend that an intelligent
ANYTHING is not conscious. Saying he can't be conscious because he's made
of silicon not meat is as crazy as saying he can't be conscious because his
skin is a different color than mine.
> A tiny amount of substance LSD can radically alter awareness.
That's because the LSD radically alters the firing patterns of the neurons
in the brain, and this is not pie-in-the-sky philosophy; this, to use your
favorite word, is real: you can measure it in the lab.
> > You are taking protons, neutrons, stars, and the relations between them
> for granted. I'm talking about a universe made of computation. 79 eggs in a
> basket don't reflect red light better than blue light. 79 toothpicks don't
> form a shiny nucleus.
Starting from nothing but gravity, protons, neutrons, electrons and stars,
Quantum Mechanics can figure out that there exists a shiny heavy metal that
will look gold to our eyes. I'd say that was pretty damn good for a theory
that was "hopelessly lost". Do you have a theory that can do better?
> Getting your teeth ripped out one by one with someone using pliers is not
Certainly it's information, and it's not only possible to write a program
that experiences pain, it's easy to do so, far far easier than writing a
program with even rudimentary intelligence. Just write a program that tries
to avoid having a certain number in one of its registers regardless of what
sort of input the machine receives, and if that number does show up in that
register it should stop whatever it's doing and immediately change it to
another number. True, our feeling of pain is far richer than that, but our
intelligence is far far greater than current computers can produce too, and
both are along the same continuum: if your brain gets into state P, stop
whatever you're doing and use 100% of your resources to get out of state P
as quickly as you can.
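The pain program described above can be sketched in a few lines. This is a
toy model only; the register name, the particular "painful" value, and the
escape value are all illustrative assumptions, not anything from an actual
system:

```python
# Toy model of the pain program described above: one value in a register
# is treated as state P, to be escaped immediately whatever else is going on.
PAIN_STATE = 13  # the "painful" number; arbitrary and purely illustrative


class ToyMachine:
    def __init__(self):
        self.register = 0

    def receive(self, value):
        """Accept any input whatsoever, but if it puts the machine into
        state P, stop everything and change the register at once."""
        self.register = value
        if self.register == PAIN_STATE:
            self.escape_pain()

    def escape_pain(self):
        # Devote everything to getting out of state P as quickly as possible.
        self.register = 0


m = ToyMachine()
m.receive(7)           # ordinary input is simply stored
m.receive(PAIN_STATE)  # "painful" input is escaped immediately
print(m.register)      # prints 0: the machine never rests in state P
```

The point of the sketch is only that "avoid state P at all costs" is a
trivially programmable rule, which is the continuum the paragraph above
describes.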
Emotion is easy but intelligence is hard, that's certainly what Evolution
found to be true. Emotion comes from the oldest parts of the brain and is
about 500 million years old, the parts that make us smart are far younger.
"I didn't make it up, I got it from talking to AI programmers."
AGI is an example of pointless acronym inflation; real men say AI.
> >> It doesn't matter if we think it's moral to enslave a AI or not
> > why not? If you take AI seriously as awareness then on what basis do you
> treat them as less than human?
My remark was based on pure practicalities. There is not a snowball's
chance in hell of enslaving something that is a thousand times smarter and
thinks a million times faster than you do, so it's a waste of time worrying
about whether it's moral to enslave it or not. That's why I'd much rather know if
the AI thinks it's moral to keep slaves.
"I would count on the first AI that was much smarter than us to pretend
> that it wasn't until it could get in the best possible strategic position
> to exterminate"
Yes, it's entirely possible the AI will take that strategy.
"or enslave us."
Maybe, but we probably would make for poor slaves and wouldn't be of much
use to the AI; our best bet is that the AI is nostalgic and we hold some
slight sentimental value to Mr. AI, reminding him of his origins, and he
keeps us as pets.
> Maybe that has already happened? Isn't the world economy run on quant
> trading programs?
> Aren't our lives shaped by corporate financial agendas produced by
> quantitative analysis using computers?
Maybe, but it's far too late to turn back now, you can't put the toothpaste
back into the tube.
> A brain keeps doing what it does while we are deep asleep, but the mind
Incorrect, the brain does not just keep doing what it has been doing when
we sleep, it changes and changes a lot and unlike consciousness these
things can be measured in the lab.
>> Please don't give me any more of that silly computers aren't "really"
>> intelligent stuff, I'm not buying it
> > I guess you have given up on finding any real fault with my
> understanding and have moved on to just deciding that you refuse to
> consider it.
I'm tired of you saying that X behaves intelligently and is conscious
while Y behaves intelligently but is not conscious when you never even hint
at how in the world you know this or why I should treat X and Y
differently.
>Can you explain exactly how that would be true if the world was
That's like asking how you could have likes and dislikes if your skin were
green. Skin color has nothing to do with it and neither does determinism.
> Free will is nothing more or less than the feeling that one exercises
> voluntary control - over their thoughts, their actions, their lives.
Fine, I certainly feel like I have voluntary control over my thoughts and
actions because there is no shortcut and I don't know what I'm going to do
until I do it. Afterwards I say I guess that's what I decided to do and
determinism wins again, unless of course our thoughts and actions are random
and sometimes they probably are, especially if we take something like LSD.
> > No feeling of free will could arise out of determinism, even an
Why not? You don't know what you're going to do because the calculation is
not complete, and then the calculation is complete and you do stuff, and
then you say out of my own free will I decided to do this stuff.
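The "calculation is not complete" point can be illustrated with a toy
deterministic computation. The Collatz-style iteration below is a standard
example (my own choice of illustration, not anything from this thread): the
answer is fully determined from the start, yet there is no known shortcut;
you only learn what you will get by running it to completion.

```python
def steps_to_one(n):
    """Deterministic Collatz iteration: halve if even, triple-and-add-one
    if odd. Fully determined, yet with no known shortcut around running it."""
    count = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        count += 1
    return count


print(steps_to_one(27))  # prints 111: determined all along, known only afterwards
```

Determinism and "I don't know what I'll do until I do it" coexist here with
no tension at all.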
> How can determinism 'want'?
I'll tell you as soon as you tell me how green can want.
> What is the cause of causality itself?
And I'll answer that question as soon as you explain why there is something
rather than nothing. It seems to me that you are looking for all the deep
philosophical problems you can find and blaming the failure to find a
solution to all of them on causality, but you don't explain how events
without causes, that is randomness, would solve any of these puzzles.
I have to go to work (yes, on New Year's Day) and don't have time to write
more.
John K Clark
You received this message because you are subscribed to the Google Groups
"Everything List" group.