In reply to Jed Rothwell's message of Sun, 2 Apr 2023 20:11:03 -0400:
Hi,
[snip]
>Robin wrote:
>
>
>> >I assume the hardware would be unique so it could not operate at all
>> >backed up on an inferior computer. It would be dead.
>>
>> The hardware need not be unique, as I already told you.
In reply to Jed Rothwell's message of Sun, 2 Apr 2023 20:15:54 -0400:
Hi,
[snip]
>Robin wrote:
>
>
>> Note, if it is really smart, and wants us gone, it will engineer the
>> circumstances under which we wipe ourselves out. We certainly have the
>> means. (A nuclear escalation ensuing from the war in Ukraine comes to
>> mind.)
Robin wrote:
> Note, if it is really smart, and wants us gone, it will engineer the
> circumstances under which we wipe ourselves out. We
> certainly have the means. (A nuclear escalation ensuing from the war in
> Ukraine comes to mind.)
>
As I pointed out, it would have to be really smart,
Robin wrote:
> >I assume the hardware would be unique so it could not operate at all
> >backed up on an inferior computer. It would be dead.
>
> The hardware need not be unique, as I already told you. It may run slower
> on a different machine, but it doesn't take much processing power to
In reply to Jed Rothwell's message of Sun, 2 Apr 2023 16:36:54 -0400:
Hi,
[snip]
>Robin wrote:
>
>> ...so there doesn't appear to be any reason why it couldn't back itself up
>> on an inferior computer and wait for a better machine to reappear
>> somewhere...or write out fake work orders from a large corporation(s), to
>> get a new one built?
Robin wrote:
> ...so there doesn't appear to be any reason why it couldn't back itself up
> on an inferior computer and wait for a better machine to reappear
> somewhere...or write out fake work orders from a large corporation(s), to
> get a new one built?
>
I assume the hardware would be unique
An interesting take on AI for $1 at Amazon:
"Smart Until It's Dumb: Why artificial intelligence keeps making
epic mistakes (and why the AI bubble is about to burst)"
Author: Emmanuel Maggiori, PhD, is a 10-year AI industry insider,
specialized in machine learning and scientific computing. He
Boom wrote:
> The worst case possible would be like the film Colossus: The Forbin
> Project (1970).
> The AIs would become like gods and we would be their servants. In exchange,
> they'd impose something like a Pax Romana by brute force. . . .
>
That was pretty good. I saw it dubbed into Japanese which gave
In reply to Jed Rothwell's message of Sun, 2 Apr 2023 12:34:32 -0400:
Hi,
[snip]
...so there doesn't appear to be any reason why it couldn't back itself up on
an inferior computer and wait for a better
machine to reappear somewhere...or write out fake work orders from a large
corporation(s),
The worst case possible would be like the film Colossus: The Forbin Project (1970). The
AIs would become like gods and we would be their servants. In exchange,
they'd impose something like a Pax Romana by brute force. We'd have some
type of paradise on Earth, with a huge caveat.
On Fri, 31 Mar 2023
I wrote:
Robin wrote:
>
>
>> Multiple copies, spread across the Internet, would make it almost
>> invulnerable.
>> (Assuming a neural network can be "backed up".)
>>
>
> I do not think it would be difficult to find and expurgate copies. They
> would be very large.
>
There is another reason I do
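The exchange above turns on two concrete points: whether a neural network can be "backed up" at all, and how large the copies would be. Both can be sketched directly. A network's weights are ordinary numbers that serialize and restore bit-for-bit, and a copy's size is just parameter count times bytes per parameter. A minimal sketch in plain Python (the toy weight list is illustrative, and the 175-billion-parameter figure is the published GPT-3 scale used here only as an example; no ML library is assumed):

```python
import hashlib
import pickle

# Toy stand-in for a network's weights: a real model holds billions of
# floating-point numbers, but the principle is identical.
weights = [0.1, -2.5, 3.14159, 0.0, 7.25]

# "Backing up" is ordinary serialization: the weights become a byte
# string that can be copied to any machine.
backup = pickle.dumps(weights)
checksum = hashlib.sha256(backup).hexdigest()

# Restoring yields an exact copy of the original weights.
restored = pickle.loads(backup)
assert restored == weights

# How large would one real copy be?  size = parameters x bytes each.
# At GPT-3 scale (175e9 parameters) with 16-bit (2-byte) weights:
size_gb = 175e9 * 2 / 1e9
print(f"one copy: ~{size_gb:.0f} GB, sha256 {checksum[:12]}...")
```

At roughly 350 GB per copy, Jed's point stands that copies would be conspicuous, while Robin's point stands that nothing in principle prevents the backup.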