My understanding of a singularity is as follows: a galaxy of objects involuntarily 
functioning together as a single synchronous, resonating, independent 1. I think 
white light may be an example of such a singularity.

In a dastardly move, I predict it is the point where the answer to any sum always 
equals 1 (logical, or binary?). This prediction frustrated my prof to no end, but 
if one works long enough with the notion of a pattern of 1, it does become a 
tantalizing thought. My logic dictates that if a holistic pattern is not equal 
to logical 1 (at the least), how can any entity or construct function as a 
singularity? Is this not relevant to unified field theory?

I contend that mankind can more easily construct a machine singularity than 
achieve a human version of one. However, such a design would first require 
constructing THE standard: machine DNA as a building block. Yes, I think there 
must exist a factual 'THE'. We need to construct our working version of 
reasoning, a "white light".

Further, I think sufficient AGI-related computational work has already been 
completed worldwide to achieve this first objective within 6 months. Knowledge 
re-usability should be 99.9%. Thereafter, the rest of the entity might take as 
little as 5 years of concentrated effort to complete.

Last, I think competitive forces would actively tend to frustrate such a global 
achievement. Hence, the point made earlier on culture is strongly supported. 
But culture also manifests as a pattern of near-involuntary habit. Do we have 
what it takes to adapt this developmental habit to a fully recursive model? 
Should a "new" development group be formed to represent this fully recursive 
mindset?

If that is not possible, then discussing the Singularity would be synonymous 
with sitting in a tiny room, attempting to fly weather balloons to the ceiling.

We need to let go of our humanity, while embracing it fully. I think "we" need 
to take our lead from those among us who can achieve singularity within the 
self. As such, achieving the singularity is first a People problem, not one of 
Technology, Process, or Organization.

Rob
________________________________
From: Steve Richfield via AGI <[email protected]>
Sent: Friday, 15 June 2018 4:39 AM
To: AGI
Subject: Re: [agi] The Singularity Forum


In the space of real world "problems", I suspect the distribution of difficulty 
follows the Zipf function, like pretty much everything else does.

The curious thing about the Zipf function is the structure of its extreme tail 
- it is finite, it drops off fast, and it doesn't encompass much of the total 
area. It would be REALLY nice to know where WE are in the Zipf function of real 
world problem solving ability.
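Steve's rank-frequency claim can be sketched numerically. In its simplest form a Zipf distribution assigns rank k a probability proportional to 1/k^s; the rank count N = 1000 and exponent s = 1.1 below are illustrative assumptions, not numbers from the thread, but they show the shape he describes: a heavy head and a long tail that individually carries little of the total mass.

```python
# Minimal sketch of a Zipf rank-frequency distribution: p(k) ~ 1/k**s.
# N and s are illustrative choices, not values anyone in the thread gave.

def zipf_weights(n, s=1.0):
    """Unnormalized Zipf weights for ranks 1..n."""
    return [1.0 / k**s for k in range(1, n + 1)]

def zipf_pmf(n, s=1.0):
    """Normalized Zipf probabilities for ranks 1..n."""
    w = zipf_weights(n, s)
    total = sum(w)
    return [x / total for x in w]

pmf = zipf_pmf(1000, s=1.1)

# The head (top 10 ranks) holds far more probability mass than the
# extreme tail (last 100 ranks), even though the tail has 10x as many ranks.
head_mass = sum(pmf[:10])
tail_mass = sum(pmf[900:])
print(f"head (top 10 ranks): {head_mass:.3f}, tail (last 100 ranks): {tail_mass:.3f}")
```

"Where WE are" in this picture is then a question of which rank range of problem difficulty human (or AGI) ability can reach.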

It is my own suspicion that we have the capability of doing nearly all of the 
real world problem solving, but our culture holds us back. Any AGI would be 
stuck with the choice of remaining mired in the same mess - or destroying it. 
Either way wouldn't be seen as a "win".

Steve

On 5:03PM, Thu, Jun 14, 2018 Mark Nuzz via AGI 
<[email protected]> wrote:

The Singularity analogy was never intended to imply infinite power. Rather, it 
represents a point at which understanding and predictability break down and 
become impossible.

On Jun 14, 2018 3:59 PM, "Matt Mahoney via AGI" 
<[email protected]> wrote:
Vinge: when humans produce superhuman AI then so can it, only faster. A 
singularity in mathematics is a point where a function (like intelligence over 
time) goes to infinity. That can't happen in a universe with finite computing 
power and finite memory. Or by singularity do you mean when AI makes humans 
irrelevant or extinct?
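The mathematical sense Matt refers to can be made concrete with the classic hyperbolic-growth toy model sometimes attached to singularity arguments: dx/dt = x^2, whose solution x(t) = x0 / (1 - x0*t) diverges at the finite time t_s = 1/x0. The initial value x0 = 1 and sample times below are illustrative assumptions.

```python
# Sketch of a mathematical singularity: the solution of dx/dt = x**2
# blows up at a finite time t_s = 1/x0, even though t_s itself is finite.

def hyperbolic(t, x0=1.0):
    """Closed-form solution of dx/dt = x**2 with x(0) = x0; valid for t < 1/x0."""
    return x0 / (1.0 - x0 * t)

# As t approaches the blow-up time t_s = 1.0, x(t) grows without bound.
for t in [0.0, 0.5, 0.9, 0.99, 0.999]:
    print(t, hyperbolic(t))
```

Matt's point is that no physical system with finite computing power and memory can actually follow such a curve all the way to t_s, which is why "singularity" has to mean something weaker in practice.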

On Thu, Jun 14, 2018, 5:56 PM Steve Richfield via AGI 
<[email protected]> wrote:
Matt,

My own view is that a human-based singularity is MUCH closer. The problem is 
NOT a shortage of GFLOPS or suitable software, but rather, a repairable problem 
in our wetware. Sure, a silicon solution might eventually be faster, but why 
wait until then?

Apparently, I failed to make this point to the people who were paying 
Singularity's bills.

Steve

On Thu, Jun 14, 2018 at 12:47 PM, Matt Mahoney via AGI 
<[email protected]> wrote:
The singularity list (and SL4) died years ago. The singularity has been 30 
years away for decades now. I guess we got tired of talking about it.



--
Full employment can be had with the stroke of a pen. Simply institute a six-hour 
workday. That will easily create enough new jobs to bring back full employment.

Artificial General Intelligence List<https://agi.topicbox.com/latest> / AGI / 
see discussions<https://agi.topicbox.com/groups/agi> + 
participants<https://agi.topicbox.com/groups/agi/members> + delivery 
options<https://agi.topicbox.com/groups> 
Permalink<https://agi.topicbox.com/groups/agi/T5ada390c367596a4-M00e4f9762213fd0109bb4794>
