I think of uploading to the network as similar to a spider's web.
A spider catches and consumes its prey, but in return the prey becomes
the material that later makes up the next web.


On Sun, Jan 6, 2013 at 1:27 PM, Alan Grimes <[email protected]> wrote:

> [note, I'm cc'ing kurzweilAI.net because it's rude to talk about someone
> behind their back, at least I'm not being doubly-rude here].
>
> I just finished reading the Big Weenie's new book. I was fairly happy with
> the first 280 pages of the thing. It seemed like he was changing his spiel
> to advocating a cloud based meta-cortex; very similar to what I've been
> advocating recently, actually, minus the cloud part... (I think that the
> idea of people putting ALL their information on the cloud was enough to
> give certain key Google execs the biggest boner of their lives, hence he
> was hired.)
>
> I think he just amused himself finding several dozen random quotations for
> each chapter and subsection and let his ghost-writer do the rest. Or that's
> how it felt.
>
> He spent most of his ink talking about himself. That said, I still agree
> with most of his AI ideas, fuzzy and pop-sci as they were. Writers of
> pop-sci books should burn in hell for keeping people away from the real
> inf0z. U guys suck.
>
> On pages 281 and 282 he decided to burn all the credibility he had built
> up through the rest of the book. Basically, what he said was outrageous
> enough that I'd propose burning him at the stake.
>
> LOOK, ASSHOLE, YOUR LAW OF ACCELERATING RETURNS IS NOT AS IMPORTANT AS MY
> EXISTENCE AS A HUMANOID LIFEFORM. YOU CAN HAVE YOUR COMPUTRONIUM. YOU CAN
> UPLOAD INTO YOUR COMPUTRONIUM. YOU CAN CONVERT THE ODD MOON HERE, A GAS
> GIANT THERE, A NEBULA OR TWO, EVEN A BLACK HOLE IF YOU CAN FIND A WAY TO
> EXPLOIT ONE. KNOCK YOURSELF OUT!!! BUT STOP THERE. LEAVE THE TERRESTRIAL
> PLANETS TO THE PEOPLE, FLUFFY BUNNIES, AND FURRY LITTLE MEOW-MONSTERS WHO
> NEED THEM!!!
>
> At no point in my life have I ever granted him license to speak about my
> destiny. =|
>
> He doesn't explicitly propose an uploading paradigm; you kinda have to
> piece it together from paragraphs scattered throughout the book. While I
> remain anti-uploading, the outlines he drew were slightly less unappealing
> than the standard picture. His proposal for emulation, based on a
> medium-high-level model of the brain, while still flawed, could actually
> surpass the baseline's capabilities as an AI. The traditional proposal of
> low-level simulation would always be much worse.
>
> The bulk of his uploading argument was on pages 240-247, though he didn't
> present it as an uploading proposal. I kinda passed over it because he
> didn't say anything outrageous in it. My objection to this thought
> experiment is much more subtle: at some point during the replacement
> procedure a paradox occurs, where you are at the point of replacing the
> exact same part of the brain that would normally perceive some kind of
> benefit from the procedure. The paradox is that the benefit-receiving
> part of the brain would be destroyed without ever receiving any benefit.
>
> My counter-proposal is to apply the concept of mind coalescence and
> mind-meld with an external AI, avoiding unnecessary damage to the original
> brain. If the procedure is successful, then the mind will have a much
> firmer foundation of redundancy from which to start chucking obsolete
> parts, when and if it decides to do so.
>
> This does not completely resolve the paradox; it only provides you with
> the tools to address it in a way that you might find acceptable.
>
> This is not to say that after my proposal you'd necessarily go live like a
> fairy in computronium. No, you still have the full breadth and depth of
> human choices. Your >>> choice <<< of mentality does not at all imply any
> choices about your physicality; there you have all of your tastes,
> preferences, and desires to guide you.
>
> --
> E T F
> N H E
> D E D
>
> Powers are not rights.
>
>
>
> -------------------------------------------
> AGI
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/5037279-a88c7a6d
> Modify Your Subscription: https://www.listbox.com/member/?&id_secret=5037279-8beb0005
> Powered by Listbox: http://www.listbox.com
>



