On Fri, Jun 11, 2021, 9:32 PM stefan.reich.maker.of.eye via AGI <
[email protected]> wrote:

> I am probably going to port the ZPAQ decompressor to Java. This is the
> last decompressor everyone ever needs, seeing as any other format can be
> converted to ZPAQ with a constant-sized header. (Am I wrong?)
>

I have implemented LZ77, BWT, and various context mixing algorithms in
ZPAQL, but it would be difficult to implement PPM because PPM predicts
bytes, not bits. Still, ZPAQL is very flexible: I wrote a model that
compresses the digits of pi as a program that computes them.
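To make the byte-vs-bit distinction concrete, here is a minimal sketch (in Python, not ZPAQL, and not any actual ZPAQ component) of the bit-wise prediction style ZPAQ uses: each model emits P(next bit = 1), the probabilities are mixed in the logistic domain, and the mixer weights are trained online. The two toy models (order-0 and order-1 bit counters) and all names here are illustrative assumptions.

```python
# Illustrative sketch of bit-wise prediction with logistic mixing
# (a toy, not ZPAQL): each model predicts P(next bit = 1) and a
# trainable mixer combines them in the logistic (stretch) domain.
import math

def stretch(p):
    # Map probability (0,1) to the logistic domain.
    return math.log(p / (1.0 - p))

def squash(x):
    # Inverse of stretch: map back to a probability.
    return 1.0 / (1.0 + math.exp(-x))

class BitPredictor:
    """Mix a toy order-0 and order-1 bit model with trained weights."""
    def __init__(self):
        self.c0 = [1, 1]        # order-0 counts of 0s and 1s
        self.c1 = {}            # order-1 counts, keyed by previous bit
        self.prev = 0           # previous bit (context)
        self.w = [0.5, 0.5]     # mixer weights
        self.lr = 0.01          # mixer learning rate

    def predict(self):
        p0 = self.c0[1] / (self.c0[0] + self.c0[1])
        n = self.c1.setdefault(self.prev, [1, 1])
        p1 = n[1] / (n[0] + n[1])
        self.inputs = [stretch(p0), stretch(p1)]
        return squash(sum(w * x for w, x in zip(self.w, self.inputs)))

    def update(self, bit):
        p = self.predict()
        err = bit - p           # gradient of the coding cost
        for i in range(2):
            self.w[i] += self.lr * err * self.inputs[i]
        self.c0[bit] += 1
        self.c1[self.prev][bit] += 1
        self.prev = bit

# On an alternating bit stream the order-1 model learns the pattern,
# and the mixer learns to trust it.
bp = BitPredictor()
for bit in [0, 1] * 200:
    bp.update(bit)
p_next = bp.predict()   # previous bit is 1, so the next bit should be 0
```

After training, `p_next` is close to 0: the mixer has learned to weight the order-1 model, which has seen a 0 follow every 1. PPM cannot be decomposed this way so directly, because its models emit a distribution over 256 byte values per step rather than one bit probability.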


> In the process I just realized how different that decompressor is from the
> compressor. It's completely non-symmetric. The compressor is a neural
> network [I'm simplifying]... and the decompressor is a Turing machine.
> Which means the compressor is wildly inferior in relation to the powers of
> the decompressor and thus optimizable. Which is actually a given since
> there is no machine-based optimization process of the ZPAQ compressor
> algorithm in progress. Or is there? Where are the live stats?
>

Compression is not that different from decompression. Both use identical
bit predictors. The difference is that the compressor reads the model from
a config file (or creates one), writes it to the archive, and runs it to
compress. The postprocessor has to be the inverse of the preprocessor, of
course. The development tool zpaqd tests both at compress time to make sure
the original data would be restored.
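The compress-time round-trip test can be sketched like this (a Python toy, not the actual zpaqd code; the delta transform and all function names are assumptions for illustration): run the preprocessor, run the postprocessor on its output, and refuse the pair if the original bytes do not come back.

```python
# Sketch of zpaqd-style compress-time verification (illustrative only):
# the postprocessor must exactly invert the preprocessor, so we run both
# and compare before trusting the pair.

def preprocess(data: bytes) -> bytes:
    """Hypothetical example transform: byte-wise delta coding."""
    prev = 0
    out = bytearray()
    for b in data:
        out.append((b - prev) & 0xFF)
        prev = b
    return bytes(out)

def postprocess(data: bytes) -> bytes:
    """Inverse transform: a running sum undoes the delta."""
    prev = 0
    out = bytearray()
    for d in data:
        prev = (prev + d) & 0xFF
        out.append(prev)
    return bytes(out)

def verify_roundtrip(data: bytes) -> bytes:
    """Transform, then check the inverse restores the original,
    the way zpaqd tests a preprocessor/postprocessor pair."""
    coded = preprocess(data)
    if postprocess(coded) != data:
        raise ValueError("postprocessor does not invert the preprocessor")
    return coded

coded = verify_roundtrip(b"AAAABBBB")   # runs of equal bytes become zeros
```

The check is cheap insurance: a buggy transform pair fails loudly at compress time instead of producing an archive that silently decompresses to the wrong data.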

But forward compatibility is a new idea, if you don't count self-extracting
archives.
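The mechanism behind that forward compatibility can be sketched in a few lines (a toy format, not real ZPAQ; the op names and tuple layout are invented for illustration): the decoder is a fixed interpreter, and each archive carries the program that reconstructs its data, so a new encoder can ship a new transform without any decoder update.

```python
# Toy sketch of forward compatibility via an embedded program
# (not the ZPAQ format): the archive is (program, payload), and the
# fixed decoder just interprets whatever program it finds.

def decode(archive):
    """Run the archive's own transform program over its payload."""
    program, data = archive
    for op, arg in program:
        if op == "add":      # toy ops the interpreter understands
            data = bytes((b + arg) & 0xFF for b in data)
        elif op == "xor":
            data = bytes(b ^ arg for b in data)
    return data

# An "old" decoder reads a "new" archive correctly because the
# transform travels with the data.
archive = ([("add", 1), ("xor", 0x20)], bytes([0x40, 0x41]))
```

Here `decode(archive)` yields `b"ab"`: add 1 gives 0x41, 0x42, then xor 0x20 gives 0x61, 0x62. A self-extracting archive achieves the same end by shipping native code; embedding a program for a small sandboxed virtual machine does it portably and safely.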


------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ta8a0e64036790330-Mc034e62ff585d08e73efa7c8
Delivery options: https://agi.topicbox.com/groups/agi/subscription
