Sure, you can make a super duper HQ (RTS pun, good game, great songs;
https://www.youtube.com/watch?v=3KAVfhqH0zg) GPT-9 that predicts really,
really well, but it won't ever nag about AGI all day like I do. And that's a problem.
On the bright side, it won't nag about the common words like the, could,
they, after, before, if, would, their, and, can, went, because it already handles
those well. It talks about rare-but-not-too-rare features, which is good, but it
doesn't stick to just one domain. How do you expect to make a computationally
monstrous, intelligent BERT-9 talk about AGI most of the day and not about cats?
Sure, in the dataset words like clothing, food, and computer may be up there in
probability, but remember: we want agents working on their own domains. You do
supermagnets, I do computers, she does farming, you do truck delivery; we share
the work. If I think about clothing all day and you think about computers all
day, then I don't think about the most common rare-ish words. There's the proof.
Furthermore, we already talk about the rare words that we believe are the gold;
we do not and cannot predict the most common words as often as their frequency
calls for. We need to inject reward onto features to change the prediction, just
like Blender by Facebook does, even though that skews the predictions away from
how common things actually are in the data.
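The "inject reward onto features" idea can be sketched as logit biasing at decoding time: add a bonus to the raw scores of tokens in your target domain before the softmax, so the model talks about them more than corpus frequency alone would suggest. This is a hypothetical illustration of the general idea, not a claim about how Blender is actually implemented; the token names and reward values are made up.

```python
import math

def softmax(logits):
    # Standard softmax over a dict of token -> logit, with max-subtraction
    # for numerical stability.
    m = max(logits.values())
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    z = sum(exps.values())
    return {tok: e / z for tok, e in exps.items()}

# Hypothetical raw next-token logits from some language model.
logits = {"the": 4.0, "cats": 2.0, "AGI": 1.0, "clothing": 1.5}

# Reward injection: a bonus added to the logits of tokens in the
# agent's domain, pulling generation toward that domain at the cost
# of matching the true corpus frequencies.
domain_reward = {"AGI": 3.0}
biased = {tok: v + domain_reward.get(tok, 0.0) for tok, v in logits.items()}

before = softmax(logits)
after = softmax(biased)
print(f"P(AGI) before: {before['AGI']:.3f}, after: {after['AGI']:.3f}")
```

With the bonus applied, "AGI" goes from a long-shot token to one competing with "the" — which is exactly the trade-off described above: the model now over-predicts the domain word relative to how common it really is.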
------------------------------------------
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T5e30c339c3bfa713-Me00a91c39df9b0d71b82ee73