Politically, the stability of the human species seems less dependent on John's 
estimation of the real estate guy, or on fanatical, drooling followers like 
myself, than on the decisions of Joe's special friend, Comrade Xi. It's 
often wise not to focus simply on a North American-centric point of view, but 
to understand the motivations of others. On questions of war and peace, it is 
Xi who is the 800-pound gorilla in the room, and his communist party does 
risk-embracing things: destroying the Uyghurs, ruining Hong Kong, perhaps 
taking over Taiwan, or permitting the Wuhan flu to spread worldwide to ensure 
no one gains a competitive advantage.


-----Original Message-----
From: John Clark <[email protected]>
To: [email protected]
Sent: Sun, Oct 11, 2020 2:48 pm
Subject: Re: Trump is on drugs

On Sun, Oct 11, 2020 at 10:21 AM Lawrence Crowell 
<[email protected]> wrote:



>> [Me] I'm not talking about humans snuffing themselves out, although I admit 
>> that's possible; I'm talking about humans replacing parts of themselves 
>> until there is no longer anything very human about them. Some signals in the 
>> brain move as slowly as 0.01 meters per second (the slow diffusion of 
>> hormones, for example), but even the very fastest signals in the brain move 
>> at only 100 meters per second, while light moves at 300,000,000 meters per 
>> second; and in a computer made with nanotechnology the distances a signal 
>> must travel will be far shorter because the components will be much smaller. 
>> And that's without even considering quantum computers. There is just no way 
>> biology can compete with that.
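As a back-of-the-envelope check on the figures quoted above (a sketch using only the numbers given in the text, with a brain-scale path length of 0.1 m assumed for illustration):

```python
# Rough comparison of the signal speeds quoted in the paragraph above.
# All speeds are the figures from the text; the 0.1 m path is an assumption.

hormone_diffusion = 0.01   # m/s, slowest brain signalling mentioned
fast_neural = 100.0        # m/s, fastest neural signals mentioned
light = 3.0e8              # m/s, speed of light

# How many times faster light is than the fastest neural signal:
ratio = light / fast_neural
print(f"Light is {ratio:,.0f}x faster than the fastest neural signal")

# Delay across an assumed brain-scale distance:
path = 0.1  # metres
print(f"Neural delay over {path} m: {path / fast_neural * 1e3:.0f} ms")
print(f"Light-speed delay over {path} m: {path / light * 1e9:.2f} ns")
```

The six-orders-of-magnitude gap in one-way delay (milliseconds versus fractions of a nanosecond) is the point being made, before even accounting for smaller components shrinking the path itself.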

> I have serious doubts about a lot of these hyper-tech ideas that border on 
> science fiction.
These ideas may be technology fiction, but they are not science fiction. I'm 
not talking about backward time travel or faster-than-light spaceships; those 
things are probably physically impossible and would require a major 
breakthrough in science that would upend nearly everything we think we know 
about how the world works. I'm just talking about an improvement in technology. 
We just need to be able to place atoms where we want them to go (and we don't 
even need to get close to Heisenberg's limit). Everything else follows from 
that.

> These ideas sort of give me a sense of why there were so many of those 1950s 
> science fiction and horror films about mad doctors or scientists hell-bent on 
> bizarre quests. I think to the average person these sorts of ideas probably 
> sound little different.

That is certainly true today, and it's why cryonics is not more popular than 
it is. So I guess I'm not the average person. To tell the truth, when I was a 
kid I usually identified more with the mad scientist in those 1950s movies 
(Forbidden Planet was my favorite) than with the purported hero, and I thought 
Lex Luthor had more fun than Superman.

> One has to remember that while we can pursue a better understanding of the 
> universe, few people want their humanity taken away or to become robots.

What people want is not terribly relevant in this case; I'm sure the dinosaurs 
didn't want an asteroid crashing into the Yucatán 66 million years ago, but it 
happened nevertheless.

> For some practical reasons I also think there are limits on these things. 

That might be the most comfortable thing for some people to believe, but I see 
no reason to think it's actually true. Modern humans have only been around for 
about three hundred thousand years, and you think that's as smart as things 
can get? A machine can approach our level of intelligence but never reach it? 
If humanity manages to avoid destruction by Trump and other existential 
threats, do you think the human species will remain unchanged on a geological 
time scale? With the twin factors of the computer revolution and genetic 
engineering, I don't think the human race will remain stable even for the 
remainder of this century.

John K Clark
-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/everything-list/CAJPayv1sb1ekd-xwG%3D%2BBbW9yfdZM_OH-1wvTPGV6Qakzc74ycw%40mail.gmail.com.

