Control: tag -1 wontfix

Hi Горбешко,

Quoting Горбешко Богдан (2018-10-04 13:02:59)
> the pandoc binary is extremely large. It's the largest file in my 
> /usr/bin, exceeding even blender's binary by almost a factor of two.
> 
> From my experience, ghc is not good at making small binaries, and 
> even stripping doesn't do much. However, UPX does its job great on 
> binaries produced by ghc. I tried compressing pandoc in --best mode 
> and got it down to 14% of the original size (from 141M to 20M); 
> however, the compression took more than an hour on my system.
> 
> If you are afraid of performance decreasing that may arise because of 
> UPXing, you can make pandoc a virtual package, pointing by default to 
> a non-compressed real package, but providing a compressed real package 
> as well, for those who care about disk space.
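(For scale: the figures quoted above amount to compressing the binary to
roughly 14% of its original size, not "14% compression". A quick shell
check of the arithmetic, using only the 141M and 20M numbers from the
report:)

```shell
# Sanity-check the ratio from the report above: 20M out of 141M.
# (The `upx --best pandoc` run itself is as reported; not re-run here.)
orig_mb=141
compressed_mb=20
ratio=$(( compressed_mb * 100 / orig_mb ))
echo "${ratio}% of original size"
```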

I agree that the binary is big, but I disagree with shipping a 
compressed binary - even as an alternative only.

The reason Pandoc is big is that it is statically linked.  If Blender 
were statically linked with FFmpeg, Boost, Cairo, Mesa, GDAL, GTK+, 
HDF4, HDF5, Lapack, etc., it would be much, much larger than Pandoc.

Providing a compressed binary would just shift the burden elsewhere, and 
providing it as an alternative shifts the burden to the distribution 
mirrors.

The proper solution here, I guess (but I am no expert in Haskell, so may 
be wrong), is to switch to shared linking, so that 5 Haskell binaries 
will not consume 5x the disk space of the parts reused among them.
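(As a sketch of what shared linking could look like with upstream
Haskell tooling - field names assumed from cabal-install's cabal.project
documentation; Debian's Haskell packaging uses its own machinery, so
this is illustrative only:)

```
-- cabal.project fragment (assumed syntax): link executables against
-- shared libHS*.so libraries instead of embedding a static copy of
-- every Haskell dependency in each binary.
shared: True
executable-dynamic: True
```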


 - Jonas

-- 
 * Jonas Smedegaard - idealist & Internet-arkitekt
 * Tlf.: +45 40843136  Website: http://dr.jones.dk/

 [x] quote me freely  [ ] ask before reusing  [ ] keep private
