"Tom Spindler (moof)" <[email protected]> writes:

>> PS: The xz compression for the debug set takes 36 minutes on my machine.
>> We should do something about it. Matt to use -T for more parallelism?
>
> On older machines, xz's default settings are pretty much unusable,
> and USE_XZ_SETS=no (or USE_PIGZGZIP=yes) is almost a requirement.
> On my not-exactly-slow i7 6700K, build.sh -j4 parallel is just fine
> until it hits the xz stage; gzip is many orders of magnitude faster.
> Maybe if xz were cranked down to -2 or -3 it'd be better at not
> that much of a compression loss, or it defaulted to the higher
> compression level only when doing a `build.sh release`.
(I have not really been building current, so I am unclear on the xz details.)

I'd like us to keep somewhat separate the notions of:

  - someone is doing build.sh release
  - someone wants minimum-size sets at the expense of a lot of CPU time

I regularly do build.sh release, rsync the releasedir bits to other machines, and use them to install. Perhaps I should be doing "distribution" instead, but sometimes I want the ISOs. Sometimes I do builds just to see if they work, e.g. when being diligent about testing changes.

(Overall, the notion of staying with gzip in most cases, with a tunable for extreme savings, sounds sensible, but I am too unclear on the details to really weigh in on it.)
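For anyone who wants to weigh in with numbers, the time/size tradeoff Tom describes is easy to measure locally. A rough sketch, assuming a POSIX shell and standard gzip/xz; the input file name here is a placeholder, not an actual set from releasedir:

```shell
#!/bin/sh
# Compare gzip against xz at a low and the default preset.
# -T0 tells xz to use all available CPUs (threaded compression).
set -eu

f=sample.bin
# Create a compressible test input if one does not already exist.
[ -f "$f" ] || { i=0; while [ "$i" -lt 200 ]; do cat /etc/services; i=$((i + 1)); done > "$f"; }

for cmd in "gzip -9" "xz -2 -T0" "xz -6 -T0"; do
    start=$(date +%s)
    $cmd -c "$f" > "$f.out"
    end=$(date +%s)
    printf '%-10s %10s bytes  %3ds\n' "$cmd" "$(wc -c < "$f.out")" "$((end - start))"
done
```

On a multi-core machine this makes the gap visible quickly: xz -2 -T0 usually lands between gzip and xz -6 in size while costing far less time than the default preset single-threaded.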
