You give compression as an example of a powerful computational primitive.

Are you interested in the similarity between Matt Mahoney's ZPAQ virtual
machine
http://mattmahoney.net/dc/zpaq.html
which implements predictive compression by mixing a swarm of predictors,
each an expert in a different domain, and Miller and Drexler's idea of
"Agoric Computing"
http://e-drexler.com/d/09/00/AgoricsPapers/agoricpapers.html
where they propose routinely combining untrusted agents into reliable
services?
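
For concreteness, here's a minimal sketch (in Python) of the logistic
mixing that ZPAQ-family compressors do. The Mixer class and the toy
predictors are my own invention, not ZPAQ's actual code, though
stretch/squash follow Mahoney's terminology. Each model outputs
P(next bit = 1), the mixer combines them in the stretched (logit)
domain, and the weights get a small gradient update once the actual
bit is known:

    import math

    def stretch(p):
        # logit: map a probability in (0,1) onto the whole real line
        return math.log(p / (1.0 - p))

    def squash(x):
        # inverse of stretch: map the real line back into (0,1)
        return 1.0 / (1.0 + math.exp(-x))

    class Mixer:
        """Logistic mixing of bit predictors, PAQ/ZPAQ style."""
        def __init__(self, n, lr=0.01):
            self.w = [0.0] * n   # one weight per predictor
            self.lr = lr

        def predict(self, probs):
            # clamp away from 0 and 1 so stretch() stays finite
            self.x = [stretch(min(max(p, 1e-6), 1.0 - 1e-6))
                      for p in probs]
            self.p = squash(sum(w * x for w, x in zip(self.w, self.x)))
            return self.p

        def update(self, bit):
            # gradient step on coding loss: reward whoever leaned right
            err = bit - self.p
            self.w = [w + self.lr * err * x
                      for w, x in zip(self.w, self.x)]

    # toy demo: one informed predictor, one that always shrugs
    mixer = Mixer(2, lr=0.1)
    for bit in [1, 1, 0, 1, 1, 1, 0, 1]:
        p = mixer.predict([0.8, 0.5])  # model 1 "knows" ones dominate
        mixer.update(bit)
    print(mixer.w)  # weight shifts onto the informed predictor

A model that keeps being right accumulates weight; a model that only
ever says 0.5 stretches to zero and simply drops out of the sum.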

I think your focus has been on eliminating the software complexity caused
by history.
But there's another, slightly different focus: reducing the size of the
trusted or kernel infrastructure by creating ways to delegate out to a
service provided by untrusted, potentially enormous code. The
enthusiasts (such as Miller and Drexler) imagine these swarms as
actually enormous and long-lived; I think it's fairly likely that one
or a few strategies per market eventually mostly defeat everyone else,
so that the system as a whole stays fairly small.
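
Here's a toy simulation of that market dynamic (my own sketch, not
anything from the agorics papers): each expert stakes a fixed
prediction, its weight gets multiplied by the probability it assigned
to what actually happened, and after a few hundred rounds the
best-matched expert holds essentially all of the weight:

    import random

    def run_market(experts, data):
        """Each expert stakes P(next bit = 1); its weight is multiplied
        by the probability it assigned to what actually happened."""
        w = [1.0] * len(experts)
        for bit in data:
            w = [wi * (p if bit == 1 else 1.0 - p)
                 for wi, p in zip(w, experts)]
            total = sum(w)
            w = [wi / total for wi in w]  # renormalize: market shares
        return w

    random.seed(0)
    data = [1 if random.random() < 0.8 else 0 for _ in range(200)]
    experts = [0.5, 0.8, 0.95]  # each always predicts this P(bit = 1)
    print(run_market(experts, data))
    # the 0.8 expert ends up holding essentially all of the weight

The gap between strategies compounds exponentially round over round,
which is why I expect the surviving population per market to be small.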

Either way, the point is that you don't necessarily have to commit to
LZ77 versus LZW (or whatever) up front; you can run several and let the
results compete.
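
For instance, here's one crude way to defer the choice (a sketch;
compress_best is my name for it, not any standard API, and since
Python's standard library doesn't ship LZW the second contender here
is LZMA): run both codecs, keep whichever output is smaller, and spend
one byte recording the winner:

    import lzma
    import zlib

    def compress_best(data: bytes) -> bytes:
        """Run several codecs, keep the smallest output, and spend one
        byte recording which codec won."""
        candidates = [
            (b"Z", zlib.compress(data, 9)),  # DEFLATE (LZ77 + Huffman)
            (b"X", lzma.compress(data)),     # LZMA
        ]
        tag, best = min(candidates, key=lambda c: len(c[1]))
        return tag + best

    def decompress(blob: bytes) -> bytes:
        tag, body = blob[:1], blob[1:]
        if tag == b"Z":
            return zlib.decompress(body)
        return lzma.decompress(body)

The same trick scales from two codecs to a swarm; the header just gets
a little wider.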

Johnicholas