“Advances in technology, along with an enhanced understanding of biology and
emergent systems, will rouse a new order of attentive competence unattainable
to any who lack appropriate technological augmentation. The greater part of
living humanity could be relegated to a taxonomic out-caste,
;)
--
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T86b555c591599ac6-Mb9deaf949ef4f59948bce062
Delivery options: https://agi.topicbox.com/groups/agi/subscription
Thank you Matt. I always enjoy reading your posts and comments.
On Thu, Jan 14, 2021, 7:23 AM Matt Mahoney wrote:
> Most people don't think about AGI. Very few of those who do believe that
> we need to worry about an unfriendly singularity.
>
> A self-improving AGI needs to acquire both
On Sunday, January 17, 2021, at 5:50 AM, Mohammadreza Alidoust wrote:
> I think they did not cite your projects because they only cite published
> papers
Haha, so you could make true AGI and still go unnoticed.
As for my AGI work, I have a huge guide on the way soon, maybe within a month (it's only
Dear Dr. Immortal,
(sorry if I mentioned your name incorrectly)
85% is a great number. Congratulations!
Have you published your work anywhere (books, papers, ...)? I think
they did not cite your projects because they only cite published papers.
For example for my works they implied that my
> Look, if we don't build AGI, we are going to die.
Hurry up then!
--
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T86b555c591599ac6-M337361d887d665af87a4fb91
Delivery options: https://agi.topicbox.com/groups/agi/subscription
Yes, an AGI *in a box* can attain more "data" by improving its intelligence.
Say it has a dataset in the box: "The cat ate food. The dog ate food.
The cat meows." Now the AGI adds the ability to translate words (cat = dog) by
seeing that they share some contexts (ate food), so maybe they share others.
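That context-sharing trick can be sketched in a few lines of Python. The corpus is the toy dataset from above; the helper names and the simple set-intersection measure are my own illustration, not anyone's actual system:

```python
from collections import defaultdict

def context_sets(sentences):
    """Map each word to the set of other words it co-occurs with."""
    ctx = defaultdict(set)
    for s in sentences:
        words = s.lower().rstrip(".").split()
        for w in words:
            ctx[w] |= set(words) - {w}
    return ctx

def shared_contexts(ctx, a, b):
    """Contexts two words have in common -- evidence they may be similar."""
    return ctx[a] & ctx[b]

corpus = ["The cat ate food.", "The dog ate food.", "The cat meows."]
ctx = context_sets(corpus)

# cat and dog co-occur with the same words (the, ate, food) ...
overlap = shared_contexts(ctx, "cat", "dog")

# ... so contexts seen only with "cat" become candidate new "facts" about
# "dog" -- here, that the dog might also meow.
novel = ctx["cat"] - ctx["dog"]
```

Because the overlap is large relative to the context sets, the system hypothesizes "The dog meows" without ever having seen that sentence, which is exactly the sense in which a boxed AGI can manufacture new data from old.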