You're writing a PPM compressor. That's different from mixing bit
predictions. Read about PPMd in my book. You want to use the longest
possible context and estimate the probability of a novel value (the
zero frequency problem); that escape probability is divided among the
symbols of the next smaller context. The escape estimate can be
learned.
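To make that concrete, here is a minimal sketch of longest-context prediction with escape. It is not PPMd; the names are illustrative, and it uses the PPMC-style escape estimate (unique symbols / (total counts + unique symbols)) rather than a learned one.

```python
# Illustrative PPM-style prediction with escape (PPMC-like escape
# estimate). Not PPMd; structure and names are assumptions.

from collections import defaultdict

def predict(contexts, history, max_order):
    """Return (probs, leftover). Start at the longest matching context;
    symbols novel at that order get their mass from the escape
    probability passed down to shorter contexts.
    `contexts` maps a context string to a {symbol: count} dict."""
    remaining = 1.0              # probability mass not yet assigned
    probs = defaultdict(float)
    seen = set()                 # symbols already coded at a longer order
    for order in range(min(max_order, len(history)), -1, -1):
        ctx = history[len(history) - order:]
        counts = contexts.get(ctx)
        if not counts:
            continue
        novel = {s: c for s, c in counts.items() if s not in seen}
        if not novel:
            continue
        n = sum(novel.values())        # total counts at this order
        u = len(novel)                 # total unique symbols
        escape = u / (n + u)           # PPMC escape estimate
        for s, c in novel.items():
            probs[s] += remaining * (1 - escape) * c / n
        seen.update(novel)
        remaining *= escape            # mass handed to shorter contexts
    return probs, remaining  # leftover goes to an order -1 uniform model
```

The escape probability is exactly the "zero frequency" mass: it is what the longer context gives up to the next smaller context for symbols it has never seen.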

Also get on encode.su. Lots of compression developers there can give you
advice.


On Fri, Feb 28, 2020, 11:52 PM <[email protected]> wrote:

> matt,
> let's use 3 terms: 'total unique symbols', 'total counts', and
> 'individual counts'.
>
> Ex. we have 5 prediction sets to mix, and set_1 is:
> a 5, b 2, c 3, d 8, e 1
>
> total unique symbols=5
> total counts=19
> individual counts=5, 2, 3, 8, 1
>
> I know, I know, how much we mix in this set will be determined by its
> 'total counts' (19), but what else determines its weight?
>
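Using the numbers in the quoted example, weighting each set by its total counts can be sketched as follows. The weighting rule is the one suggested in the question; the second set and the normalization are assumptions for illustration.

```python
# Illustrative sketch: mix prediction sets, each weighted in proportion
# to its total counts. set_2 is a hypothetical second set.

def mix(sets):
    """`sets` is a list of {symbol: count} dicts. Each set's weight is
    its total counts over the grand total; each set contributes its
    normalized frequencies scaled by that weight."""
    grand_total = sum(sum(s.values()) for s in sets)
    mixed = {}
    for s in sets:
        total = sum(s.values())       # this set's 'total counts'
        weight = total / grand_total  # weight grows with evidence
        for sym, c in s.items():
            mixed[sym] = mixed.get(sym, 0.0) + weight * c / total
    return mixed

set_1 = {'a': 5, 'b': 2, 'c': 3, 'd': 8, 'e': 1}  # total counts = 19
set_2 = {'a': 1, 'b': 1}                          # hypothetical set
p = mix([set_1, set_2])
```

Note that with this particular weighting, mixing reduces to pooling the raw counts, which is why count-only weights are not enough by themselves; a good mixer also learns weights from how well each set has predicted so far.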

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tcfc4df5e57c62b43-M7f89fa25cef6657504909680
Delivery options: https://agi.topicbox.com/groups/agi/subscription
