Ya, stevio, that was hilarious. I don't often find postings here humorous,
but I was laughing out loud. Thanks, that was great.
Lol, wow.

But ya, Alan and Linas are right; the sad fact is that those really would
be "selling points", and the government would love to have the cooperation
of AGI people. Ben is pretty safe, as they rarely go after high-profile
people, and it's in the interest of their own AGI program to have an active
civilian program from which to recruit, such as kids fresh out of college,
perhaps with big ideas and tight lips.

Usually, due to the depopulation agenda, anything dangerous that the
government either can't or won't handle is labeled a conspiracy, and the
people who worry about it are "theorists" -- radiation leaks from the
various nuclear reactors, aliens that abduct and experiment on people, etc.

The government is in charge of steering its own people, so it mostly picks
on its own, to keep them in line, like with the TSA and all that.
"Terrorists" are just a distraction from the fact that they do most of the
terrorism themselves, using various fall guys who are mind-controlled or
paid off beforehand -- most work(ed) for the CIA, e.g. Osama.

Also, this whole AGI-explosion stuff is akin to the fear that the
atmosphere would catch fire from the first nuclear bombs: it's just not
going to happen; there are simply too many roadblocks. Like nuclear power,
AGI power will likely creep in slowly as developed nations get their own
AGIs to help manage their economies or whatever.

Also, as Matt mentions, "artificial intelligence" has been on a downward
trend in search results, so many people are pretty numb to it; it hardly
elicits an emotional response after years of not-so-great results. It is
still more popular than "machine intelligence", but until we have humanoid
robots walking around taking people's jobs, the public isn't going to see
AI or AGI as a serious threat.


On a personal front, in case of whatever potential calamity, I have things
backed up on paper, as it has a longer shelf life than hard drives, and I
am learning to sail, so if the SHTF I can always sail away.



On Fri, Jan 4, 2013 at 12:50 AM, Linas Vepstas <[email protected]> wrote:

>
>
> On 3 January 2013 17:01, Steve Richfield <[email protected]> wrote:
>
>> Matt and Ben,
>>
>> Regarding:
>>
>>
>> http://www.dailymail.co.uk/news/article-2238152/Cambridge-University-open-Terminator-centre-study-threat-humans-artificial-intelligence.html?ITO=1490&ns_mchannel=rss&ns_campaign=1490
>>
>> WOW. The next OBVIOUS steps are that this Terminator group publishes a
>> study showing how AGI would predictably lead to the end of the human race
>> (which after all is their existential purpose), then governments will clamp
>> down on it so hard that no one dare talk about it, lest they be locked up
>> for the good of society - or worse - read on.
>
>
>> Ben, did you notice that YOU (as the leader of AGI) are now at the top of
>> the short-list of the four most likely ways to kill off the entire human
>> race? This apparent threat can be delayed, probably for years, for less
>> than a dollar, depending on the caliber of the bullet they select. It won't
>> take a government assassin - there are plenty enough crazies around to do
>> the job.
>>
>
> One word: Unabomber.
>
> OK, more words:
>
> 1) Public perception is misplaced: terminator will be built by the
> military, not by open source enthusiasts.
>
> 2) Realistic AGI designs will still require access to supercomputers
> and/or clouds. Neither Ben, nor anyone else here has this access.  However,
> Google, Apple, Microsoft, Amazon, Rackspace and a few others do.  I'm
> guessing that Siri, Watson, and the google car all started as skunk works.
> I fully believe that there's AGI skunk-works in progress, at these
> companies, right now.
>
> 3) NSA does this shit too. Some years ago (4-5?) news came out about
> bidding for their 3rd-generation propaganda-modeling machine.  The
> generation 2 machine modeled a virtual world with one virtual human for
> every million real ones. The virtual human would read the news, and then
> decide to act: maybe go to work, maybe join a political party, maybe
> participate in street protest, maybe become a terrorist. For generation 3,
> they wanted to refine this to one virtual for every 10K real people (on a
> global, planet-wide, all-country basis).  Think of it as climate modelling,
> but for the political weather. For all I know, it's a waste of taxpayer
> money. But that doesn't matter: these guys do this stuff.  And, obviously,
> it has AGI potential.
>
> 4) If your worst-case scenario materializes, there is a place to run for
> shelter: work on military contracts. Get a security clearance. Get a job
> with one of the big military contractors.  Work on X-37B or whatever. They
> protect their own, and they are more powerful than governments.  Ain't no
> judge or jury gonna find you guilty of anything if you were just doing your
> job, following orders, working on some black-ops technology.  Ain't no
> civilian who is even gonna have the vaguest clue of what you are working
> on, except for some barking-dog-crazy conspiracy theorists, e.g. the folks
> over at infowars. (they now have a paper magazine, because fire-walls
> cannot block paper! Seriously! It says so on the cover!)
>
> -- Linas
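The propaganda-modeling machine Linas describes in point 3 is basically an
agent-based simulation: sample one virtual human per N real people, feed
each one the news, and let it pick an action. A minimal sketch of that idea
in Python -- every class name, probability, and the news-sentiment input
here is made up for illustration, not anything from a real system:

```python
import random

class VirtualHuman:
    """One simulated person standing in for ~1 million real ones."""

    ACTIONS = ["go_to_work", "join_party", "protest", "radicalize"]

    def __init__(self, discontent=0.1):
        self.discontent = discontent  # 0.0 (content) .. 1.0 (furious)

    def read_news(self, sentiment):
        # Negative news (sentiment < 0) nudges discontent up, positive down.
        self.discontent = min(1.0, max(0.0, self.discontent - 0.1 * sentiment))

    def act(self, rng):
        # The angrier the agent, the likelier the disruptive actions.
        weights = [1.0 - self.discontent,       # go_to_work
                   0.3,                          # join_party
                   self.discontent,              # protest
                   self.discontent ** 2]         # radicalize
        return rng.choices(self.ACTIONS, weights=weights, k=1)[0]

def simulate(population=7_000_000_000, ratio=1_000_000, days=30, seed=42):
    """Run the toy model and tally how many agent-days went to each action."""
    rng = random.Random(seed)
    agents = [VirtualHuman(rng.random() * 0.3) for _ in range(population // ratio)]
    tally = {a: 0 for a in VirtualHuman.ACTIONS}
    for _ in range(days):
        sentiment = rng.uniform(-1.0, 1.0)  # stand-in for a real news feed
        for agent in agents:
            agent.read_news(sentiment)
            tally[agent.act(rng)] += 1
    return tally

print(simulate())
```

At the generation-2 ratio of one agent per million people you'd simulate
about 7,000 agents for the whole planet; the generation-3 goal of one per
10K pushes that to about 700,000, which is presumably why they wanted a
bigger machine.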



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com