Still that physics envy? A one-page solution to AGI that somehow eluded
millions of researchers and software developers for the last 70 years.

OK, here it is. It involves self-replicating nanotechnology and 10^48
bit-copy operations over 3 billion years of evolution. Don't want to wait?
Then you have to use modern software development to reproduce something of
the complexity of our DNA, equal to about 300 million lines of code when
both are compressed.
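As a back-of-envelope check on the scale of those numbers (all constants
here are rough assumptions of mine, not figures from the text):

```python
# Rough arithmetic behind the evolution numbers above.
# 3.1e9 base pairs and 3.156e7 seconds/year are assumed round figures.

seconds = 3e9 * 3.156e7        # ~3 billion years, in seconds
copy_rate = 1e48 / seconds     # average bit-copy throughput of evolution

# Haploid human genome at 2 bits per base pair, uncompressed.
genome_mb = 3.1e9 * 2 / 8 / 1e6

print(f"evolution's average copy rate: {copy_rate:.1e} bits/s")
print(f"raw (uncompressed) genome:     {genome_mb:.0f} MB")
```

Even spread over 3 billion years, 10^48 copies works out to roughly 10^31
bit-copies per second, which is why engineering the result directly, rather
than waiting for it, is the only practical option.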

That's certainly doable, but just shrinking transistors won't get us to 20
watts to run a 10-petaflop neural network. You need a whole new technology
to get anywhere near the Landauer limit, one that encodes bits in heavier
and slower atoms or molecules instead of electrons.
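To make the gap concrete, here's a minimal sketch of the arithmetic,
assuming room temperature and a ~10 pJ/flop figure for current silicon (my
estimate, not from the text):

```python
import math

# Landauer limit: minimum energy to erase one bit at temperature T.
k_B = 1.380649e-23                    # Boltzmann constant, J/K
T = 300.0                             # room temperature, K
e_landauer = k_B * T * math.log(2)    # ~2.9e-21 J per bit

power = 20.0                          # watts (brain-scale budget)
ops_per_s = 1e16                      # 10 petaflops
e_budget = power / ops_per_s          # energy allowed per operation

e_gpu = 1e-11                         # assumed ~10 pJ/flop, current silicon

print(f"Landauer limit:   {e_landauer:.2e} J/bit")
print(f"20 W / 10 PFLOPS: {e_budget:.2e} J/op "
      f"({e_budget / e_landauer:.0e}x Landauer)")
print(f"current silicon:  {e_gpu:.2e} J/op "
      f"({e_gpu / e_budget:.0f}x over budget)")
```

The 20-watt target sits several orders of magnitude below today's silicon
but still around 10^6 above the Landauer floor, so the limit itself isn't
the obstacle; the device technology is.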

On Wed, Dec 30, 2020, 8:02 PM Alan Grimes via AGI <agi@agi.topicbox.com>
wrote:

> While we are waiting to learn the fate of western civilization, which
> seems like it will be decided on January 6...   I mean in the end, God
> wins, but the open question is whether he'll be declaring this planet a
> loss and starting somewhere else. It looks like that question will be
> answered on the 6th. =\
> 
> Anyway, all the cofe (<< letter optimal spelling. =P)  table chit-chat
> about AI is really starting to wear on me... Ok, the word "starting"
> there was a total lie... Anyway...
> 
> Let's consider the corollary to Moore's law that attempts to approximate
> the cost of a successful AGI project as a function of time.
> 
> COST =   K * e^(-t * S)
> 
> Where K and S are scaling constants...
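[A quick inline sanity check on this curve, with assumed constants (a $1
trillion cost at t = 0 in 1951, halving every five years; neither figure is
from the post):

```python
import math

K = 1e12                  # assumed cost of an AGI project in 1951, dollars
S = math.log(2) / 5       # assumed: cost halves every five years
t = 70                    # years from 1951 to 2021

cost = K * math.exp(-t * S)
print(f"projected 2021 cost: ${cost:,.0f}")
```

That comes out in the tens of millions of dollars, the same order of
magnitude as the $100 million estimate below.]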
> 
> Let's say someone in the future sketched out the outline of an AGI on a
> napkin, crumpled it up, and tossed it into a pocket of anti-time where
> the thing ended up somewhere you could recover it.
> 
> Even though what comes through is just a sketch, it would save you
> potentially billions of dollars and decades of time by allowing you to
> focus in on exactly what needs to be done. Even still, the machine needs
> to be built, programmed, raised, edjoomakayted, deployed, etc... So the
> cost would STILL be on the order of $100 million. Add back in the trial
> and error, and the silly human academic politics, and all the other
> nonsense, and you are looking at a billion-dollar program.
> 
> Let's look at the silly extremes to understand the mechanisms behind this
> proposed cost curve:
> 
> Ooga Booga the cave man would need to pay for a technological
> civilization, science, math, neural science, a semiconductor foundry,
> operating systems and compilers, data-sets, algorithms... So several
> trillions of dollars of value.
> 
> While I hope we don't have street bums in 2060, let's take a street bum,
> dumpster-diving for hardware, stealing electricity, downloading free
> libraries, and then setting it all up in just such a way as hadn't been
> done previously, and it miraculously works... So basically all the
> capital investments of our civilization will have paid off to the point
> where it becomes ridiculously inexpensive.
> 
> In terms of what can actually happen, we have the key date of 1951 when
> Turing fired off the starting gun on the entire field, roughly marking
> the point at which a practical technological roadmap became visible.
> While you could find examples of computers going all the way back to
> Babbage, Turing was the one who most clearly expressed the mathematical
> foundations of computing and paved the way to the first truly general
> computers.
> 
> Since then, computers have become vastly superhuman in just about any
> specific capacity you could name. For some reason, there are still news
> articles written about computers beating humans in some specific, well
> defined domain. Some even call this progress. =| People with enough
> experience have come to realize that this isn't even the correct problem.
> 
> What is needed is a qualitative leap. Consider the point in time when
> video games went from painted still images to real-time 3D rendering.
> The new machines had no capabilities that the old ones didn't, on a
> theoretical level, but were now powerful enough that a qualitative leap
> was possible. In this case, we need a qualitative leap in how software
> comes to be: we need a kernel with which the computer can learn its
> software, and not just the simple functions that neural networks are
> starting to do. Shakespeare used the word "apprehension" to describe
> this quality. I think this word means the ability to capture the
> semantic structure, either physical or abstract, of a thing by creating
> terms and concepts. Do that in the general case and you're done.
> 
> --
> The vaccine is a LIE.
> #EggCrisis
> The Great Reset
> Powers are not rights.
> 

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tabc940322ac5ad2c-Mf8d8e69b21acc2de0501f2a5