On 2020-02-23 20:02, Matt Mahoney wrote:

Elon Musk takes seriously Yudkowsky's theory that the first AGI to achieve human-level intelligence will launch a singularity. OpenAI's founders believe that too, which is why they are racing to be first. Musk worries that their secrecy risks getting the design wrong, producing an unfriendly singularity. Meanwhile, Microsoft is late to the game, as it has been with every technological advance since Apple introduced the GUI, and now it has to buy its way in again. Nobody at Microsoft believes in singularities.

To be honest, I don't either.


The singularity is nonsense, but the concept isn't required. Machine superintelligence is likely to result in fast progress in some areas, power imbalances and big changes. IOW, it is an important development, and no doubt most of the participants basically understand this.

--
__________
 |im |yler http://timtyler.org/

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tab97f0f82cc3442a-M6a0f011f746e0d4ffda9bdbd
Delivery options: https://agi.topicbox.com/groups/agi/subscription