Hello Jenkins users,
I'm wondering if there are any good best practices for sharing files between 
jobs. Here is my current setup:

A. Upstream job A pulls files from Team Foundation Server's drop location 
(they are easily over 100 MB; eventually we want to compile the MS project in 
Jenkins, but we can't right now because of project dependencies).
B. Job A runs code analysis, tests, etc. on slave #1 and then archives the 
binaries to the Jenkins master.
C. Job B deploys those binary files from the workspace on slave #1 to 
another environment. (Because the files are copied from the workspace, job B 
is triggered as a build step rather than a post-build step, to ensure the 
workspace isn't overwritten in the meantime. I also had to use the NodeLabel 
Parameter plugin to force it to run on the same slave.)

From my research, the best practice seems to be for the slave job to copy 
files from the Jenkins master rather than reading them directly from the 
upstream job's workspace. My concern with that approach is the file size and 
the amount of network traffic it would generate.
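For context, the copy-from-master approach I'm considering would look roughly like this: Jenkins exposes archived artifacts of the last successful build at a stable permalink, so a downstream job can fetch them over HTTP. A minimal sketch (the server URL, job name, and artifact path below are placeholders, not my actual setup):

```shell
#!/bin/sh
# Jenkins serves archived artifacts at a stable permalink:
#   <JENKINS_URL>/job/<JOB>/lastSuccessfulBuild/artifact/<PATH>
JENKINS_URL="http://jenkins.example.com"   # placeholder master URL
JOB="job-A"                                # placeholder upstream job name
ARTIFACT="build/output.zip"                # placeholder archived file

# Build the download URL for the artifact from the last successful build
URL="$JENKINS_URL/job/$JOB/lastSuccessfulBuild/artifact/$ARTIFACT"
echo "Would download: $URL"
# curl -f -O "$URL"   # uncomment to actually fetch (add credentials if needed)
```

With 100 MB+ binaries, every downstream build would pull that much data over the network from the master, which is exactly the overhead I'm worried about.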

Any feedback/advice/suggestions?

-- 
You received this message because you are subscribed to the Google Groups 
"Jenkins Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/groups/opt_out.