Thanks Björn. We're currently on a single master, and I will definitely 
take performance into consideration when we scale. We're looking into 
installing and integrating with Artifactory in the coming months, which 
should help with managing artifacts, but I suspect there will still be the 
question of "who does the compression".

Many of our artifacts are already compressed entities, but I have confirmed 
that zipping everything (using the Jenkins zip step) shrinks them by more 
than half, so I'm definitely on the right track.
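For anyone following along, here is a minimal sketch of what I'm doing, using 
the zip step from the Pipeline Utility Steps plugin so the compression happens 
on the agent before the archive is transferred; the label, directory, and 
archive name are illustrative, not from my actual job:

```groovy
// Sketch: compress build output on the agent with the Pipeline Utility
// Steps `zip` step, then archive the single zip. Names are illustrative.
pipeline {
    agent { label 'linux' }
    stages {
        stage('Package') {
            steps {
                // zip runs on the agent, so the master only receives the
                // already-compressed archive (archive: true archives it)
                zip zipFile: 'artifacts.zip', dir: 'build/output', archive: true
            }
        }
    }
}
```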

On Monday, December 2, 2019 at 11:09:19 PM UTC-8, Björn Pedersen wrote:
> Hi,
> I would probably try to compress on the agent before even trying to 
> transfer the large data to the master. This avoids load on the master a) 
> due to transfer and b) due to compression.
> And if the artifacts get really huge, consider storing them independently 
> of Jenkins (S3, a maven-style repo, whatever matches your use-case).
> Björn
> On Monday, December 2, 2019 at 8:56:02 PM UTC+1, Tim Black wrote:
>> Our projects produce large artifacts that now need to be compressed, and 
>> I'm considering my alternatives. The zip step 
>> <>
>> would be a nice non-plugin solution but I'm curious what compression 
>> technique this uses. The documentation page linked above doesn't show any 
>> options that pertain to compression tuning.
>> I've also seen the Compress Artifacts Plugin 
>> <>, but I can't 
>> tell from its docs either whether the algo is tunable. Also I'd rather not 
>> depend on another plugin.
>> If neither of the above works, I'll simply use an sh step to call xz, 
>> gzip, bzip2, or the like, from my Linux-based master.
>> Thanks for your consideration,
>> Tim Black
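The sh-step fallback mentioned in the original post could look something like 
the sketch below; the file name and compression levels are illustrative, and 
`-k` keeps the original so you can compare sizes:

```shell
# Sketch: compress an artifact on the agent with plain shell tools.
# The artifact here is a synthetic file just for demonstration.
set -e
dd if=/dev/zero of=artifact.bin bs=1024 count=256 2>/dev/null
gzip -9 -k artifact.bin   # max compression, keep the original
xz -9 -k artifact.bin
ls -l artifact.bin artifact.bin.gz artifact.bin.xz
```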

You received this message because you are subscribed to the Google Groups 
"Jenkins Users" group.