So if we look at the Hutter Prize website or the Large Text Compression Benchmark website, we see that the way to climb the ranks is compressed size, and there are restrictions on the speed and memory allowed to achieve it.
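The argument that follows proposes folding those three measurements (compressed size, memory used, and runtime) into a single score rather than treating speed and memory as mere cutoffs. As a purely illustrative sketch, here is one way such a combined evaluation could be measured and scored in Python; the specific formula, the `memory_budget` and `time_baseline` parameters, and the use of peak traced memory as a stand-in for model size are all my assumptions, not something either benchmark actually does:

```python
import time
import tracemalloc
import zlib

def evaluate(compress_fn, data, memory_budget, time_baseline):
    """Measure the three quantities discussed in the post:
    compressed size, peak memory (a rough proxy for model size),
    and wall-clock time -- then combine them into one score.
    The combination below is hypothetical: halving memory or time
    halves the score, mirroring the post's "half the memory means
    twice the wisdom" intuition."""
    tracemalloc.start()
    t0 = time.perf_counter()
    compressed = compress_fn(data)
    elapsed = time.perf_counter() - t0
    _, peak_mem = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    # Lower is better on all three axes, so lower combined score is better.
    score = len(compressed) * (peak_mem / memory_budget) * (elapsed / time_baseline)
    return len(compressed), peak_mem, elapsed, score

# Example run with zlib standing in for a real predictor-based compressor.
size, mem, secs, score = evaluate(
    zlib.compress,
    b"the quick brown fox " * 1000,
    memory_budget=100 * 2**30,  # the post's hypothetical 100 GB per agent
    time_baseline=1.0,          # arbitrary reference runtime in seconds
)
```

Under this toy formula, an algorithm that matches another's compressed size while using half the memory (or half the time) ends up with half the score, which is the ranking behavior the post argues for.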
But the real test should finally be based on speed, MODEL SIZE, and compression size. First it is necessary to put the typical predictor evaluation to death and say goodbye to it (splitting your dataset into train and test sets is wrong here!). Instead, you get the true compression by dumping the model and folding it into the arithmetic-coded output, which tells you with certainty the true size of the model that was needed to do this: you get the model size at the end of compression or decompression. You also have how long it took to run on a specific CPU (the same one used for all entries).

What we should do now is take these three measurements and combine them. First we set a limit on how much memory each agent on Earth gets. At first there may be 100 AGIs running, each with 100GB dedicated to it. So if your algorithm B has the same compression score as algorithm A but uses half the memory to do so, then it has twice the wisdom, hence better compression/generated output. Once an agent's 100GB is full, it must delete its worst memories and replace them with the latest updates to its agenda to become smarter. Now speed: if algorithm B is twice as fast, it makes twice as many prediction outputs/updates to its 100GB brain.

These three items combined give us a truer compression size / accuracy of the generated prediction, because algorithm B has more data in its 100GB (more insights) and more speed (twice the data flowing into and updating its 100GB). By the time algorithm A has spent enough to output its first prediction, algorithm B will already give a better answer on its first pass.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T6761a13445e5864b-Mf69278d54e8c8d59dfba5750
Delivery options: https://agi.topicbox.com/groups/agi/subscription
