The top entry on the large text benchmark, nncp, uses a transformer. It is
closed source but there is a paper describing the algorithm. It doesn't
qualify for the Hutter prize because it takes 3 days to compress 1 GB on a
GPU with 10K cores.

The winning entry, fx-cmix, is open source. It is a variation of cmix,
which uses the PAQ architecture that I developed. It has a lot of
independent bit predictors whose predictions are combined using a simple
2-layer neural network. A prediction p is stretched as x = ln(p/(1-p)).
The output prediction is p = squash(sum_i x_i w_i), where w is the weight
vector and squash(x) = 1/(1+e^-x) is the inverse of stretch. Each weight is
then updated as w_i = w_i + L x_i (y - p), where y is the actual bit, p was
the prediction, and L ≈ .001 is the learning rate.
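
To make that mixing step concrete, here is a minimal, self-contained C++
sketch of the stretch/squash/update cycle. It is not the actual cmix or
fx-cmix code; the three model predictions and the short bit stream are
made-up toy values used only for illustration.

// A minimal sketch of PAQ-style logistic mixing, not the actual cmix/fx-cmix
// code. The three model predictions and the bit stream below are made-up
// toy values, used only to illustrate the stretch/squash/update cycle.
#include <cmath>
#include <cstdio>
#include <vector>

double stretch(double p) { return std::log(p / (1.0 - p)); }   // logit of p
double squash(double x) { return 1.0 / (1.0 + std::exp(-x)); } // inverse of stretch

int main() {
    const double L = 0.001;                      // learning rate
    std::vector<double> w = {0.0, 0.0, 0.0};     // one mixing weight per model

    // Toy data: each row holds three models' probabilities that the next bit is 1.
    std::vector<std::vector<double>> preds = {
        {0.9, 0.6, 0.7}, {0.2, 0.4, 0.3}, {0.8, 0.7, 0.9}};
    std::vector<int> bits = {1, 0, 1};           // the bits actually observed

    for (size_t t = 0; t < bits.size(); ++t) {
        std::vector<double> x(w.size());
        double dot = 0.0;
        for (size_t i = 0; i < w.size(); ++i) {
            x[i] = stretch(preds[t][i]);         // x_i = ln(p_i / (1 - p_i))
            dot += x[i] * w[i];
        }
        double p = squash(dot);                  // mixed probability of a 1 bit
        int y = bits[t];
        for (size_t i = 0; i < w.size(); ++i)
            w[i] += L * x[i] * (y - p);          // w_i = w_i + L x_i (y - p)
        std::printf("bit %d, mixed prediction p = %.3f\n", y, p);
    }
    return 0;
}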

You can find the software, algorithm descriptions and benchmark results at
https://mattmahoney.net/dc/text.html

For more about data compression in general, including the PAQ algorithms,
see
https://mattmahoney.net/dc/dce.html


On Sun, May 12, 2024, 9:14 PM John Rose <[email protected]> wrote:

> On Sunday, May 12, 2024, at 10:38 AM, Matt Mahoney wrote:
>
> All neural networks are trained by some variation of adjusting anything
> that is adjustable in the direction that reduces error. The problem with
> KAN alone is you have a lot fewer parameters to adjust, so you need a lot
> more neurons to represent the same function space. That's even with 2
> parameters per neuron, threshold level and steepness. The human brain has
> another 7000 parameters per neuron in the synaptic weights.
>
>
> I bet that in some of these so-called “compressor” apps Matt always looks
> at, there is some serious NN structure tweaking going on. They’re open
> source, right? Do people obfuscate the code when submitting?
>
>
> Well it’s kinda obvious, but transformations like this:
>
> (Universal Approximation Theorem) => (Kolmogorov-Arnold Representation
> Theorem)
>
> There are going to be more of them.
>
> Automating or not, I’m sure researchers are on it.
