John says a compressor sometimes compresses horribly or not at all, and sometimes aces it, but it can't ace everything all the time. Here's my answer: given 100 characters of context that are random, yes, even a smart brain will fail, because the input is random; and given 100 characters that are all 'a', e.g. 'aaaaaaa....aaaaa', it will ace it and compress it maximally. And no, it can't ace it every time, because sometimes it sees the 'aaa' and sometimes it sees total randomness, e.g. '5!0fIs8'. What matters here is that brains can solve many problems by predicting solutions based on the context/problem given. They may not be perfect (being perfect would require knowing where every particle of your RAM is, while storing that knowledge in that very same RAM), but they can get close to perfect. For any given problem, whether it's 'js62nf' or 'aaaaa', the predictor is trying its hardest to predict accurately; there is nothing sad about 'sometimes it can compress perfectly, sometimes not at all'. AGI works.
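You can see this asymmetry directly with an off-the-shelf compressor. A minimal sketch, assuming Python's zlib as the stand-in predictor (the 100-character size is just the example from the text):

```python
import os
import zlib

repetitive = b"a" * 100          # all-'a' input: maximal redundancy
random_bytes = os.urandom(100)   # random input: no pattern to exploit

# The repetitive string shrinks to a handful of bytes; the random one
# cannot be compressed and may even grow slightly from format overhead.
print(len(zlib.compress(repetitive)))
print(len(zlib.compress(random_bytes)))
```

Same compressor, same effort, opposite outcomes, which is exactly the point: the result depends on how much pattern the input actually contains.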

And to Jim et al.: AI takes a context/problem and predicts an answer/solution using past similar experiences; it's a predictor that sees patterns. This is the only way to take advantage of the universe and come out on top: by finding patterns and exploiting the fact that things are not random but repeat. The first thing you notice in a text dataset is that the same letters/words/phrases re-occur, and that is what allows compression/prediction, which is AI. All deeper patterns are rooted in exact matches. For example, translation works through shared contexts: of all the things 'cat' and 'dog' predict, they share 80% of their predictions ('dog ran', 'dog play', 'cat ran', 'cat play'), and only two are not shared ('cat meowed', 'dog barked'), and even those two are similar. So 'cat' and 'dog' likely share other contexts as well, and I can predict dog>meowed even if I've never seen it, because cat and dog share so many contexts/predictions that the contexts one has and the other lacks are probably still valid after either word. Translation uses exact matches, and it also tells you how similar 'cat' is to 'dog', not just how likely 'meow' is after the word 'dog'.

Also, similar things clump together in text and images: one paragraph is about dogs, or about rockets, so all its words will be related to that topic. You'll see 'dog saw a dog' and 'my dog loves cats' and 'my cat saw a dog' and 'my cat...', so it is easy to predict what follows 'cat': it's 'meow', because you already know it, or it's 'cat' again! And having seen cat>meow more often than cat>play tells you to predict it more often.
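The cat/dog logic above can be sketched in a few lines of Python. The toy corpus, the Jaccard overlap as the similarity measure, and the "borrowing" step are my illustrative assumptions, not a fixed algorithm:

```python
from collections import Counter, defaultdict

# Toy corpus echoing the examples in the text above (an assumption for
# illustration; any text would do).
corpus = ("dog ran . dog play . cat ran . cat play . cat meowed . "
          "my dog loves cats . cat meow . cat meow").split()

# Exact-match statistics: count what follows each word.
follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

# Shared contexts: how much do cat's and dog's continuations overlap?
cat_next, dog_next = set(follows["cat"]), set(follows["dog"])
shared = len(cat_next & dog_next) / len(cat_next | dog_next)

# Because they overlap, continuations seen only after 'cat' (like
# 'meowed') become plausible predictions after 'dog' too.
borrowed_for_dog = cat_next - dog_next

# And raw frequency says what to predict most often after 'cat':
# cat>meow was seen twice, cat>play only once.
print(follows["cat"].most_common(1)[0][0])  # prints 'meow'
```

With this toy data, cat and dog share 2 of 5 distinct continuations (overlap 0.4), and 'meowed' is exactly the kind of unshared context the overlap lets you transfer from cat to dog.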

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tbdfca102d702de94-M6abf9204bc54904f4afe3ef4
Delivery options: https://agi.topicbox.com/groups/agi/subscription
