That sounds like the cool way to do it :). I do it the easy way and just use
a binary key store... and I get my compression by sharing sections of the keys.
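For the curious, a minimal sketch of what a key store that shares key
sections could look like, done as a plain binary trie in Python. The names
here are illustrative, not anyone's actual implementation:

# A binary trie: keys that share a prefix share nodes, which is where
# the "compression" of the key set comes from.

class BinaryKeyStore:
    def __init__(self):
        self.root = {}

    def put(self, key_bits, value):
        node = self.root
        for b in key_bits:           # key_bits: a string of '0'/'1'
            node = node.setdefault(b, {})
        node['value'] = value

    def get(self, key_bits):
        node = self.root
        for b in key_bits:
            if b not in node:
                return None
            node = node[b]
        return node.get('value')

store = BinaryKeyStore()
store.put('10110', 'A')
store.put('10111', 'B')    # shares the '1011' section with 'A'
print(store.get('10111'))  # -> B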
--
Artificial General Intelligence List: AGI
However you do it, if there's no repetition there's no possible compression...
it's a losing game unless you find where the repetition is.
Counting 1's and 0's gets you a log over the bit counts (order-0 entropy at
best), but you lose topological position, and it's only good for, say,
getting the area of a circle for computing pi.
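To make that concrete, a quick order-0 entropy calculation in Python (a
sketch; the function name is made up) shows the bound you get from counts
alone, and that order is invisible to it:

from math import log2

# Ideal coded size, in bits, if all you model is the 0/1 counts.
def order0_bits(s):
    n = len(s)
    return sum(k * -log2(k / n) for k in (s.count('0'), s.count('1')) if k)

print(order0_bits('01' * 32))            # ~64 bits: looks random to counts
print(order0_bits('0' * 32 + '1' * 32))  # same counts, same size: order lost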
Have you tried randomizing for it with Occam's razor as the heuristic?
That seems like a good idea to me. Someone's obviously done it, right? How did
it go?
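In case it helps, here is roughly what "randomize with Occam's razor as the
heuristic" could mean in code: sample random candidate patterns and score
each by a two-part MDL cost, pattern bits plus correction bits. Everything
here (the scoring, the names) is a guess at the idea, not a known
implementation:

import random
from math import log2

data = '0110' * 16  # toy bit string with a repeating pattern

# Two-part cost: bits to state the pattern, plus a penalty per mismatch
# when the pattern is tiled across the data (a crude MDL stand-in).
def mdl_cost(pattern, data):
    tiled = (pattern * (len(data) // len(pattern) + 1))[:len(data)]
    mismatches = sum(a != b for a, b in zip(tiled, data))
    return len(pattern) + mismatches * (1 + log2(len(data)))

best = min(
    (''.join(random.choice('01') for _ in range(random.randint(1, 8)))
     for _ in range(2000)),
    key=lambda p: mdl_cost(p, data))
print(best, mdl_cost(best, data))  # short patterns that fit the data win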
--
Artificial General Intelligence List: AGI
On Fri, Jan 31, 2020, 3:49 AM wrote:
> That sounds like the cool way to do it :). I do it the easy way and just
> use a binary key store... and I get my compression by sharing sections of
> the keys.
>
Any benchmark results? I would be interested if it improves compression.
Compression is a highly experimental process. Most of the stuff I tried
either didn't work or resulted in tiny improvements.
On Fri, Jan 31, 2020 at 1:29 PM Matt Mahoney wrote:
>
>
> On Fri, Jan 31, 2020, 2:11 PM wrote:
>
>> However you do it, if there's no repetition there's no possible
>> compression... it's a losing game unless you find where the repetition is.
>> Counting 1's and 0's gets you a log over the bit counts, but you lose
>> topological position.
On Friday, January 31, 2020, at 2:04 PM, Matt Mahoney wrote:
> Compression is a highly experimental process. Most of the stuff I tried
> either didn't work or resulted in tiny improvements.
Last I checked I did one thing and shaved 76MB off the 100MB enwik8. Yes I
studied it, but whoever figured...
On Fri, Jan 31, 2020, 3:24 PM wrote:
> On Friday, January 31, 2020, at 2:04 PM, Matt Mahoney wrote:
>
> Compression is a highly experimental process. Most of the stuff I tried
> either didn't work or resulted in tiny improvements.
>
> Last I checked I did one thing and shaved 76MB off the 100MB enwik8.
Hate to say it, but even if you managed to compress it to all hell, what's
driving the computer to speak is more important than it madly rambling the
contradictions that formed the insane, useless model.
--
Artificial General Intelligence List: AGI
Gotta hold off on that BWT, it's losing patterns, Matt!! I don't feel good
about it. And the PPM & context mixing is what I already do... what are you
saying?? I mix together many partial matches.
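For what it's worth, "mixing many partial matches" in the PAQ/cmix sense
usually means logistic mixing: each context model outputs a bit probability,
and a mixer combines them in the stretched domain, shifting weight toward
whichever model has been right. A bare-bones sketch (names and learning rate
are arbitrary):

from math import exp, log

def squash(x):  return 1 / (1 + exp(-x))   # logistic
def stretch(p): return log(p / (1 - p))    # its inverse

class Mixer:
    def __init__(self, n_models, lr=0.02):
        self.w = [0.0] * n_models
        self.lr = lr

    def mix(self, probs):
        # Combine model predictions in the stretched (logit) domain.
        self.x = [stretch(min(max(p, 1e-6), 1 - 1e-6)) for p in probs]
        self.p = squash(sum(w * x for w, x in zip(self.w, self.x)))
        return self.p

    def update(self, bit):
        # Move weight toward models that predicted this bit well.
        err = bit - self.p
        self.w = [w + self.lr * err * x for w, x in zip(self.w, self.x)]

m = Mixer(2)
p = m.mix([0.9, 0.3])  # two partial-match models disagree
m.update(1)            # actual bit was 1; the 0.9 model gains weight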
--
Artificial General Intelligence List: AGI
"10 years or more to develop a good compressor" Do you mean leaving the
computer generating the network that long? :)
--
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T409fc28ec41e6e3a-M982e67268f298c2a70fd7079
On Fri, Jan 31, 2020, 5:05 PM wrote:
> Gotta hold off on that BWT, it's losing patterns, Matt!! I don't feel good
> about it. And the PPM & context mixing is what I already do... what are you
> saying?? I mix together many partial matches.
>
You can read about BWT and PPM in my book to understand...
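For anyone following along, the transform itself is tiny. A toy BWT in
Python (with an explicit end marker; real implementations use suffix arrays)
shows it doesn't so much lose patterns as rearrange them, sorting characters
by their context so equal contexts land adjacent:

def bwt(s):
    s += '\x00'  # unique end-of-string marker so the transform inverts
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return ''.join(r[-1] for r in rotations)  # last column of sorted rotations

print(repr(bwt('banana')))  # 'annb\x00aa': the a's and n's cluster into runs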
On Friday, January 31, 2020, at 8:29 PM, Matt Mahoney wrote:
> All of this complexity is in keeping with Legg's proof that powerful
> predictors are necessarily complex. You end up writing lots of code to handle
> special cases and obscure file types to squeeze out just a little more
> compression.