You can just take a look if you don't want to run it. There is no gradient 
descent or the like; it is fully understood AI.

My new code is ready: it is 101 lines of Python and can compress 100MB (enwik8) 
to 21.4MB, though I only tested this version up to 1MB and got 251,148 bytes 
(which is really good; Shelwien's compressor, which achieves 21.4MB on the full 
file, gets 256,602 at 1MB by comparison). To use it: run it, and at the bottom 
it tells you how many bytes the compressed output is. Currently I don't have 
the binary output or the evaluation fixed up, so bear with me. Paste the long 
number you get at the top of the code after the "0.", then lower the 50,000 to 
e.g. 49,000 and run it again, also changing the input2 file dataset to only 
"<mediawiki ", and it will regenerate it all back into out.txt. Place both 
files I give you in the same folder, BTW.
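The workflow described above (one long number after "0.", a symbol count you 
lower, and a seed like "<mediawiki ") looks like arithmetic coding driven by a 
shared predictive model. Here is a minimal sketch under that assumption; the 
order-0 adaptive model, the use of exact Fractions, and the function names are 
my placeholders, not the actual 101-line code:

```python
from fractions import Fraction
from collections import Counter

def model(history, alphabet):
    # Order-0 adaptive model with Laplace smoothing (assumption: the
    # real code mixes several layered context models instead).
    counts = Counter(history)
    total = len(history) + len(alphabet)
    return {s: Fraction(counts[s] + 1, total) for s in alphabet}

def encode(text, alphabet):
    # Narrow the interval [low, low+width) once per symbol; the final
    # number inside the interval is the "long number after the 0."
    low, width = Fraction(0), Fraction(1)
    hist = []
    for ch in text:
        probs = model(hist, alphabet)
        cum = Fraction(0)
        for s in sorted(alphabet):
            if s == ch:
                low += width * cum
                width *= probs[s]
                break
            cum += probs[s]
        hist.append(ch)
    return low + width / 2  # any point inside the final interval works

def decode(code, n, alphabet):
    # Rerun the same model and pick whichever symbol's sub-interval
    # contains the code; n plays the role of the 50,000 symbol count.
    out, hist = [], []
    low, width = Fraction(0), Fraction(1)
    for _ in range(n):
        probs = model(hist, alphabet)
        cum = Fraction(0)
        for s in sorted(alphabet):
            hi = cum + probs[s]
            if low + width * cum <= code < low + width * hi:
                out.append(s)
                low += width * cum
                width *= probs[s]
                hist.append(s)
                break
            cum = hi
    return "".join(out)
```

Because encoder and decoder run the identical model, regenerating from the 
number plus a short seed is exactly the decode loop above.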

There's still room to improve it. My exponential function is for now a simple 
else-if ladder, and I only did layers. My "length of a set of predictions = 
roof" rule is also still a bit wobbly: some candidate lengths get different 
changes to the roof. And the global weights feel a bit static, but they seem 
fine for now.
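On the else-if exponential: one smooth curve could replace the ladder and 
remove the hand-picked breakpoints. A hypothetical sketch — `layer_weight`, 
the growth constant, and the per-layer mixing are all my placeholders, not 
your actual layer scheme:

```python
import math

def layer_weight(order, growth=1.5):
    # One exponential curve instead of an else-if ladder: deeper
    # (higher-order) context layers get exponentially more weight.
    # The growth constant is a made-up placeholder to tune.
    return math.exp(growth * order)

def mix(predictions_per_layer):
    # predictions_per_layer: list of {symbol: prob} dicts, where the
    # list index is the layer/context order. Returns a normalized mix.
    mixed, total = {}, 0.0
    for order, preds in enumerate(predictions_per_layer):
        w = layer_weight(order)
        for s, p in preds.items():
            mixed[s] = mixed.get(s, 0.0) + w * p
        total += w
    return {s: p / total for s, p in mixed.items()}
```

The advantage over the ladder is that every order gets a weight, including 
ones you never wrote a branch for, and the whole curve is tuned by one number.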

Yes, I'm linking the files; I'm not bothering this time to attach them to a new 
email, I'm using the forum thread... 
https://aidreams.co.uk/forum/general-project-discussion/releasing-full-agievolution-research/195/
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T4db5f3886a465e8a-Mdc462a69612f55988c8f9488
Delivery options: https://agi.topicbox.com/groups/agi/subscription
