On Friday, September 9, 2022 at 2:10:21 PM UTC-5 jessem wrote:

> On Fri, Sep 9, 2022 at 8:26 AM smitra <[email protected]> wrote:
>
>> So, I think insect-level AGI will cause a rapid transition to a machine 
>> civilization. This will lead to a new biology of machines with 
>> insect-level intelligence that ends up wiping out all life on Earth 
>> through pollution, similar to the Great Oxygenation Event:
>>
>
> Are you assuming insect-level AGI would also be small like insects and 
> could self-replicate just as rapidly using commonly-found materials as 
> "nutrients"? If we had insect-level AGI but they were larger and easier to 
> spot, and also took much longer than an insect to self-replicate (and 
> perhaps required external infrastructure or uncommon materials to do so), 
> it seems hard to imagine a scenario in which humanity wouldn't be able to 
> prevent them from going into runaway self-replication mode.
>
> I think the possibility of relatively "dumb" self-replicating machines, 
> even if large and relatively slow like Eric Drexler's concept of a 
> "clanking replicator" (see 
> http://wfmh.org.pl/enginesofcreation/EOC_Chapter_4.html ), could disrupt 
> society for a different reason--they could spell the end of capitalism, or 
> at least radically change its nature. If there were commercially available 
> machines that could replicate themselves, those who owned them could make 
> copies for just the cost of raw materials and energy. If owners were 
> competing to sell those copies, competition would tend to drive prices 
> down to the materials/energy cost or barely above it, basically destroying 
> profits for any good that isn't forced into artificial scarcity by 
> intellectual property laws; the same would go for any other goods the 
> machines are capable of replicating. If self-replicating machines could 
> also extract resources (fully automated mining facilities, say), then 
> profit would still be possible as long as raw materials returned > raw 
> materials invested (akin to 'energy return on energy invested' in energy 
> economics). But if companies were making profits by just setting up mining 
> machines and then sitting back and doing nothing, this would probably 
> cause political instability in both democracies and autocratic systems, 
> since either the people or the politicians would likely prefer to be the 
> ones reliably getting back more than their initial investment with no work 
> needed. Perhaps instead of totally ending capitalism, we might end up with 
> a hybrid system: some sort of intellectual property laws would still be in 
> place so companies and individuals could profit from those, but actual 
> production machinery would mostly be publicly owned, and people (along 
> with retail companies) could order up any good from a database of designs, 
> receiving something like a basic income in raw materials and energy 
> (funded by mining and energy-generation facilities that could also be 
> publicly owned).
>
> Arthur C. Clarke, in his 1962 nonfiction book Profiles of the Future, 
> commented on how a self-replicating machine that could also replicate 
> other goods, which he simply called a "Replicator", would disrupt our 
> current economic system:
>
> "The advent of the Replicator would mean the end of all factories, and 
> perhaps all transportation of raw materials and all farming. The entire 
> structure of industry and commerce, as it is now organized, would cease to 
> exist. Every family would produce all that it needed on the spot — as, 
> indeed, it has had to do throughout most of human history. The present 
> machine era of mass-production would then be seen as a brief interregnum 
> between two far longer periods of self-sufficiency, and the only valuable 
> item of exchange would be matrices, or recordings, which had to be inserted 
> into the Replicator to control its creations.
>
> "No one who has read thus far will, I hope, argue that the Replicator 
> would itself be so expensive that nobody could possibly afford it. The 
> prototype, it is true, is hardly likely to cost less than 
> £1,000,000,000,000 spread over a few centuries of time. The second model 
> would cost nothing, because the Replicator's first job would be to produce 
> other Replicators. It is perhaps relevant to point out that in 1951 the 
> great mathematician, John von Neumann, established the important principle 
> that a machine could always be designed to build any describable machine -- 
> including itself. The human race has squalling proof of this several 
> hundred thousand times a day. 
>
> "A society based on the Replicator would be so completely different from 
> ours that the present debate between Capitalism and Communism would become 
> quite meaningless. All material possessions would be literally cheap as 
> dirt. Soiled handkerchiefs, diamond tiaras, Mona Lisas totally 
> indistinguishable from the original, once-worn mink stoles, half-consumed 
> bottles of the most superb champagnes – all would go back into the hopper 
> when they were no longer required. Even the furniture in the house of the 
> future might cease to exist when it was not actually in use."
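The software analogue of the von Neumann principle quoted above (a machine that carries a description of itself) is a quine, a program whose output is exactly its own source code. A minimal Python example, offered only as an illustration and not as anything from Clarke or von Neumann:

# The two lines below print themselves exactly (this comment is not part of
# the reproduced output): the string holds a description of the program,
# and the program uses that description to rebuild itself.
s = 's = %r\nprint(s %% s)'
print(s % s)

Von Neumann's constructor design rests on the same split between a stored description and the machinery that copies and interprets it.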
>
> Clarke's book was probably a major influence on Gene Roddenberry's vision 
> of a post-scarcity future in Star Trek; see Roddenberry's comments quoted 
> at https://arthurcclarke.org/site/how-arthur-c-clarke-helped-save-star-trek/ 
> where he specifically references Profiles of the Future. For a more 
> cyberpunk depiction of how fully automated self-replicating machinery 
> could lead to a transition to a new kind of economic system, I recommend 
> Cory Doctorow's recent sci-fi novel "Walkaway".
>
> Jesse
>

All of that would require an enormous amount of energy. That is one thing 
that would put a limit on this.
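A back-of-envelope check on that energy limit. The per-machine build energy below is a pure assumption; only the ~6e20 J/year figure for world primary energy consumption is meant as a real order of magnitude:

# Rough check of how quickly exponential replication runs into the world's
# energy budget. ENERGY_PER_COPY_J is an assumption, not a sourced figure.

ENERGY_PER_COPY_J = 3e10        # assumed build energy per machine (roughly a car's worth)
WORLD_ANNUAL_ENERGY_J = 6e20    # approximate global primary energy per year

machines = 1
total_energy = 0.0
doublings = 0
while total_energy < WORLD_ANNUAL_ENERGY_J:
    total_energy += machines * ENERGY_PER_COPY_J  # energy to build this generation
    machines *= 2                                 # each machine builds one copy
    doublings += 1

print(f"{doublings} doublings (~{machines:.1e} machines) before construction "
      f"energy alone exceeds a year of world primary energy")

Under these made-up assumptions the exponential hits the global energy supply after a few dozen doublings, at a population far smaller than anything insect-scale, which is exactly the kind of limit being pointed to here.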

LC

>>
>> https://en.wikipedia.org/wiki/Great_Oxidation_Event
>>
>> And as I pointed out earlier, I think this is a universal phenomenon that 
>> all intelligent life is subject to. The whole point of being intelligent 
>> is to let as much of the work be done for you by entities that are 
>> dumber than you. But in that process, which leads to faster and faster 
>> economic growth, it's inevitable that at some point you are going to 
>> create autonomous systems that will grow exponentially. The point where 
>> the transition to artificial life starts is going to be close to the 
>> minimum intelligence level needed for exponential growth.
>>
>> If you make it hotter and hotter in some closed space, a fire will break 
>> out; this is going to happen close to the minimum temperature required 
>> for ignition, not at some extremely high temperature. Nature shows us 
>> that the minimum amount of intelligence required for efficient 
>> self-maintenance and reproduction that yields exponential growth is 
>> very low.
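A toy illustration of that "ignition at the minimum threshold" argument; the mapping from capability to replication rate is entirely invented:

# Toy first-crossing model: if some capability level creeps upward and
# exponential growth becomes possible as soon as the per-generation
# replication factor R exceeds 1, runaway growth starts near that minimum
# threshold, not at some much higher capability level.

def replication_factor(capability):
    # Invented mapping: R rises smoothly and crosses 1.0 at capability 10.0.
    return capability / 10.0

population = 1.0
capability = 0.0
for step in range(500):
    capability += 0.1                  # capability improves slowly over time
    population *= replication_factor(capability)
    population = max(population, 1.0)  # a seed population is always around
    if population > 1e6:               # runaway growth ('ignition') detected
        print(f"takeoff at capability {capability:.1f} (threshold was 10.0)")
        break

The takeoff lands just above the threshold because the first capability level that supports any growth at all is the one that gets exploited, which is the same logic as the fire analogy.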
>>
>> Saibal
>>
>>
>>
>>
>>
>> On 08-09-2022 14:09, John Clark wrote:
>> > This is an interview with the great computer programmer John Carmack. He
>> > thinks the time when computers can do everything, not just some things,
>> > as well as or better than humans is much closer than most people
>> > believe; he thinks there is a 60% chance it will happen by 2030. Like
>> > me, Carmack is much more interested in intelligence than consciousness
>> > and has no interest in the "philosophical zombie" argument. As far as
>> > the future history of the human race is concerned, the following
>> > quotation is particularly relevant:
>> > 
>> > "___It seems to me this is the highest leverage moment for a single
>> > individual potentially_ _in the history of the world._ [...]   _I am
>> > not a mad man in saying that the code for artificial General
>> > intelligence is going to be tens of thousands of lines of code, not
>> > millions of lines of code. This is code that conceivably one
>> > individual could write, unliker writing a new web browser or operating
>> > system._"
>> > 
>> > The code for AGI will be simple [1]
>> > 
>> > John K Clark    See what's on my new list at  Extropolis [2]
>> > 
>> > Links:
>> > ------
>> > [1] https://www.youtube.com/watch?v=xLi83prR5fg
>> > [2] https://groups.google.com/g/extropolis
>>
