Re: [Vo]:Centre for the Study of Existential Risk

2012-11-27 Thread Terry Blanton
On Mon, Nov 26, 2012 at 1:26 PM, Terry Blanton hohlr...@gmail.com wrote:
 CSER is being launched at Cambridge University to protect us from Skynet:

And now we have reassurance from the Pentagon:

http://www.wired.com/dangerroom/2012/11/human-robot-kill

The Pentagon wants to make perfectly clear that every time one of its
flying robots releases its lethal payload, it’s the result of a
decision made by an accountable human being in a lawful chain of
command. Human rights groups and nervous citizens fear that
technological advances in autonomy will slowly lead to the day when
robots make that critical decision for themselves. But according to a
new policy directive issued by a top Pentagon official, there shall be
no SkyNet, thank you very much.

more

I feel reassured, do you?



Re: [Vo]:Centre for the Study of Existential Risk

2012-11-27 Thread Alain Sepeda
Between the demographic fears, debunked decades ago (sorry, but after 9
billion, humanity's population will decline), the Gibbs objection, "it might
be done or used wrongly, so just don't do it," the various fears about
health and the environment... and the existential fears that reach the DoD
or the UN, fed by debunked worries or SciFi...

I think we (in the West) have a problem... a mental problem...

It is starting to destroy more than it saves, to waste resources instead of
conserving them, to kill instead of curing.

Maybe, like many, I forget that humanity has been even more foolish in
earlier times... I hope so.
We have old societal mental diseases (Malthusianism, apocalypticism,
neo-animism, hypochondria), but now they are global.

It reminds me of an article about post-modernism, arguing that a culture's
worldview determines its kind of religion:

Archaic culture feels that humans are controlled by nature, and suffer from
it... Its religion is based on obeying nature, or lords...
Modern culture feels that humans can control nature; parapsychology develops
as a symptom of that self-confidence.
Post-modern culture feels so powerful that it is afraid of hurting
nature... A guilt-ridden animism develops.

I hope LENR can make us more positive.






Re: [Vo]:Centre for the Study of Existential Risk

2012-11-27 Thread Mark Gibbs
On Tue, Nov 27, 2012 at 5:57 AM, Alain Sepeda alain.sep...@gmail.com wrote:

  the Gibbs objection, "it might be done or used wrongly, so just don't do it"


You should read what I write more carefully: I didn't say CF/LENR shouldn't
be used but that possible unintended consequences should be considered.

[mg]


Re: [Vo]:Centre for the Study of Existential Risk

2012-11-27 Thread Jed Rothwell
Mark Gibbs mgi...@gibbs.com wrote:


 You should read what I write more carefully: I didn't say CF/LENR
 shouldn't be used but that possible unintended consequences should be
 considered.


That seems sensible to me.

Perhaps you should write about some of the unintended benefits.

We can avoid unintended problems by anticipating them. The article raised
the issue of lax regulations in third world countries. Even countries such
as India and China now have sophisticated anti-pollution laws. Their cars
are much cleaner than U.S. cars were in 1960. So, even poor countries will
abide by sensible environmental regulations, if potential problems are
explained clearly, and if cost-effective ways to avoid problems are
engineered into the system from the start.

As I pointed out, it is easy to keep tritium or any other radioactive
materials sealed in a cold fusion cell. Tritium today is safely sealed in
emergency exit signs, in a higher concentration than it is likely to be
found in a cold fusion cell. The cell can be recycled in a sophisticated
factory where the tritium is captured. India has thousands of
state-of-the-art factories. I saw them lined up for miles along the highway
going out of Chennai. Every major Japanese and European company has assembly
plants there. It would take only a dozen or so factories to recycle every cold
fusion device in the country, assuming the devices last 10 years. The
country can easily afford that. Furthermore, there is not likely to be any
economic benefit to opening up the cells and recycling them manually, the
way the Chinese recycle computers.
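A rough back-of-the-envelope check of that "dozen or so factories" estimate,
as a minimal Python sketch. The household count, cells per household, and
per-factory throughput below are illustrative assumptions, not figures from
the post; only the 10-year lifetime comes from the text above.

# All inputs except the 10-year lifetime are assumed for illustration.
households = 250_000_000        # assumed: rough number of Indian households
cells_per_household = 1         # assumed: one cold fusion cell per household
device_lifetime_years = 10      # from the post: devices last 10 years

# Cells reaching end of life each year under steady-state replacement.
cells_retired_per_year = households * cells_per_household / device_lifetime_years

per_factory_per_day = 6_000     # assumed: cells one plant can recycle per day
operating_days = 300            # assumed: operating days per year
per_factory_per_year = per_factory_per_day * operating_days

factories_needed = cells_retired_per_year / per_factory_per_year
print(f"{cells_retired_per_year:,.0f} cells/year -> {factories_needed:.1f} factories")
# 25,000,000 cells/year -> 13.9 factories, i.e. roughly "a dozen or so"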

- Jed


Re: [Vo]:Centre for the Study of Existential Risk

2012-11-27 Thread Alain Sepeda
OK, I exaggerated; it was a bit of a strawman... sorry.


I was thinking of the less subtle fearmongers, on every subject that reaches
the media and even the politicians...

Managing risk is fine, but we should not forget the gains while avoiding the
drawbacks...
People somewhat forget that today's situation is not so nice, that some
people die from lack of many things, and far more than die from pollution,
catastrophes, or technological accidents...

Fighting drawbacks, pollution, and risk is mostly no-regret, but today there
is a tendency to avoid any progress because of unvalidated fears... A
protection principle is good, a warning principle too; the precautionary
principle is not.

A better approach, besides avoiding known risks and solving known problems,
is to learn so you can react to the unexpected, instead of trying to
anticipate everything... which does not work...

In some places I've heard that LENR could destroy the atmosphere by
consuming its oxygen...
The fear of a robot invasion is crazy too, while it seems more rational to
me to observe adults and kids, and see how dependent we are, and how
technology changes our approach to problem solving, making us powerful but
also weak... Mobile phones don't kill with their radio waves, but by
stealing our brains on the road... We have enough real risks and problems
without preparing for every imagined one.


LENR, like gas, might be dangerous (because of the hydrogen, the heat, the
steam)...
It reminds me of a nuclear waste storage site where they focused so strongly
on radiation that they forgot that some of the chemicals were simply toxic,
even stone cold.

In the Grenoble area I have discussed this with a local safety expert, and
she told me that old factories were nearly ignored by safety regulators,
while new ones were over-regulated, and nuclear plants above all...


Today's risk management seems crazy: focused on the fear of new things and
of technology, and not on old or natural things.

LENR, I agree, will probably cause strange effects... some good, some bad,
many funny or crazy... Let us prepare to adapt; we are programmed to.

Sorry for the bad mood; I'm a bit fed up with our state of permanent fear...





Re: [Vo]:Centre for the Study of Existential Risk

2012-11-27 Thread Terry Blanton
On Tue, Nov 27, 2012 at 4:34 PM, Alain Sepeda alain.sep...@gmail.com wrote:

 Sorry for the bad mood; I'm a bit fed up with our state of permanent fear...

Recommended reading for you:

http://www.amazon.com/State-Fear-Michael-Crichton/dp/0061782661



[Vo]:Centre for the Study of Existential Risk

2012-11-26 Thread Terry Blanton
CSER is being launched at Cambridge University to protect us from Skynet:

http://www.dailymail.co.uk/news/article-2238152/Cambridge-University-open-Terminator-centre-study-threat-humans-artificial-intelligence.html

And rightly so considering the latest advances in Deep Thought (aka
deep learning):

http://www.nytimes.com/2012/11/24/science/scientists-see-advances-in-deep-learning-a-part-of-artificial-intelligence.html



Re: [Vo]:Centre for the Study of Existential Risk

2012-11-26 Thread James Bowery
The problem isn't a threat from artificial intelligence; it is, and has
always been, a threat from natural intelligence.

The robocalypse will be controlled by malign natural intelligence.





Re: [Vo]:Centre for the Study of Existential Risk

2012-11-26 Thread Jed Rothwell
I love the name of this organization. It sounds like it should have
something to do with existential philosophy. Centre is so very British &
cosmopolitan.


- Jed



Re: [Vo]:Centre for the Study of Existential Risk

2012-11-26 Thread Terry Blanton
On Mon, Nov 26, 2012 at 3:06 PM, Jed Rothwell jedrothw...@gmail.com wrote:
 I love the name of this organization. It sounds like it should have
 something to do with existential philosophy. Centre is so very British &
 cosmopolitan.

And it will be called Caesar or Seize Her!