Vinge's argument: once humans can produce superhuman AI, then so can it, only faster. A singularity in mathematics is a point where a function (like intelligence over time) goes to infinity. That can't happen in a universe with finite computing power and finite memory. Or by singularity do you mean the point where AI makes humans irrelevant or extinct?
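To make that concrete (my own illustration, not Vinge's): the usual toy model behind the "singularity" metaphor is hyperbolic growth, which blows up at a finite time, unlike exponential growth, which never does.

```latex
% Hyperbolic growth: intelligence I(t) diverges at a finite time T
I(t) = \frac{c}{T - t}, \qquad \lim_{t \to T^{-}} I(t) = \infty
% Contrast: exponential growth I(t) = c\,e^{kt} is large but finite
% for every finite t -- no singularity in the mathematical sense.
```

With finite memory and computing power, I(t) is bounded above, so the real curve has to flatten out before T, which is the point being made here.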
On Thu, Jun 14, 2018, 5:56 PM Steve Richfield via AGI <agi@agi.topicbox.com> wrote:

> Matt,
>
> My own view is that a human-based singularity is MUCH closer. The problem
> is NOT a shortage of GFLOPS or suitable software, but rather, a repairable
> problem in our wetware. Sure, a silicon solution might eventually be
> faster, but why simply wait until then?
>
> Apparently, I failed to successfully make this point to the people who
> were paying Singularity's bills.
>
> *Steve*
>
> On Thu, Jun 14, 2018 at 12:47 PM, Matt Mahoney via AGI <agi@agi.topicbox.com> wrote:
>
>> The singularity list (and SL4) died years ago. The singularity has been
>> 30 years away for decades now. I guess we got tired of talking about it.
>
> --
> Full employment can be had with the stroke of a pen. Simply institute a six
> hour workday. That will easily create enough new jobs to bring back full
> employment.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T5ada390c367596a4-M3be8dd330fb882077e7f7442
Delivery options: https://agi.topicbox.com/groups