On Tue, Sep 22, 2020 at 08:28:19PM +0200, Vincent Habchi wrote:
> I agree wholeheartedly, but since you mention modern compression algorithms,
> isn’t there another one (or more) which would be yet more efficient than xz?
In terms of compressed file size, I'm not sure anything is going to beat xz
by much, if at all.

If we are going to consider other compression algorithms, zstd would be the
main one to think about, and the main benefit would be much faster
decompression speed in exchange for slightly larger packages than xz. Some
data:

- https://fedoraproject.org/wiki/Changes/Switch_RPMs_to_zstd_compression#Comparison_of_compression_algorithms_and_levels
- https://lists.archlinux.org/pipermail/arch-dev-public/2019-March/029520.html

Whether that's worth bundling a new utility, I'm not sure -- I don't know
what fraction of our package installation time is decompression right now.
Also, bz2 is particularly slow at decompression, so even xz is likely an
improvement there.

Dan

-- 
Dan R. K. Ports  https://drkp.net/
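P.S. For a very rough sense of the decompression gap, here's a stdlib-only
Python sketch: lzma stands in for xz and bz2 for bzip2 (zstd has no stdlib
binding, so it's omitted; the third-party zstandard package would cover it).
The payload is synthetic and the numbers are only illustrative, not a claim
about real package data:

```python
import bz2
import lzma
import time

# Synthetic repetitive payload standing in for package contents --
# illustrative only, real packages will compress and decompress differently.
data = b"some repetitive package payload " * 100_000

for name, mod in [("xz", lzma), ("bz2", bz2)]:
    blob = mod.compress(data)
    start = time.perf_counter()
    out = mod.decompress(blob)
    elapsed = time.perf_counter() - start
    assert out == data  # round-trip sanity check
    print(f"{name}: {len(blob)} bytes compressed, "
          f"decompressed in {elapsed * 1000:.1f} ms")
```

On typical inputs bz2's decompressor is the slowest of the common formats,
which is the point above: even moving to xz should help there.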
