I would probably compress on the agent before even transferring the large 
data to the master. This avoids load on the master a) from the transfer 
and b) from the compression itself.
And if the artifacts get really huge, consider storing them independently 
of Jenkins (S3, a Maven-style repo, whatever matches your use case).
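Since the zip step exposes no compression tuning, a plain sh step on the agent is the simplest way to control the speed/ratio trade-off before anything reaches the master. A minimal sketch (the paths and the compression level are illustrative, and it assumes gzip is on the agent's PATH; swap in xz or bzip2 the same way):

```shell
# Sketch: compress build output on the agent before archiving, so the
# master only ever sees the compressed archive.
set -e
mkdir -p build
printf 'example artifact data\n' > build/output.bin   # stand-in for real output
# Pipe through the compressor directly so the level is tunable
# (gzip levels run 1 = fastest .. 9 = smallest; -6 is the default).
tar -cf - -C build . | gzip -6 > artifacts.tar.gz
ls -l artifacts.tar.gz
```

In a pipeline you would run this inside the agent's `sh` step and then `archiveArtifacts 'artifacts.tar.gz'`, so only the compressed file is copied to the master.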


On Monday, December 2, 2019 at 20:56:02 UTC+1, Tim Black wrote:
> Our projects produce large artifacts that now need to be compressed, and 
> I'm considering my alternatives. The zip step 
> <https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/#zip-create-zip-file>
> would be a nice non-plugin solution but I'm curious what compression 
> technique this uses. The documentation page linked above doesn't show any 
> options that pertain to compression tuning.
> I've also seen the Compress Artifacts Plugin 
> <https://github.com/jenkinsci/compress-artifacts-plugin>, but I can't 
> tell from its docs either whether the algorithm is tunable. Also, I'd 
> rather not depend on another plugin.
> If neither of the above works, I'll simply use the sh step to call xz, 
> gzip, bzip2, or the like, on my Linux-based master.
> Thanks for your consideration,
> Tim Black
