Package up the file as an artifact? Have job A retrieve it and then pass it on to C? If that seems like too much, you could just copy the file from B's workspace into A's workspace and then inject its contents as an environment variable.
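As a minimal sketch of the artifact approach: job B writes the IP address into a properties file and archives it; job A then fetches that file (e.g. with the Copy Artifact plugin) and triggers C using the Parameterized Trigger plugin's "Parameters from properties file" option. The file name, variable name, and IP value below are illustrative, not from the original thread.

```shell
#!/bin/sh
# Build step in job B: capture the dynamically discovered IP address.
# "10.0.0.5" stands in for whatever command actually discovers it.
DEPLOY_IP="10.0.0.5"

# Write it in KEY=VALUE form so the Parameterized Trigger plugin can
# read it back as a build parameter for job C.
echo "DEPLOY_IP=${DEPLOY_IP}" > deploy.properties

# Archive deploy.properties as a build artifact; job A can then copy it
# into its own workspace and pass DEPLOY_IP along when it triggers C.
```

This keeps everything inside Jenkins, so no shared network folder is needed.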
My approach is to use the Build Flow plugin. Much greater flexibility as far as conditional job execution. It even offers try/catch-like functionality.

-b

On Wednesday, December 19, 2012 2:28:10 PM UTC-5, Ken Beal wrote:
> Hi,
>
> I have three jobs. A invokes B, waits for it to complete, and then invokes C. B gets some data dynamically. This needs to be given to C.
>
> (The real names for these jobs are "Monitor", "Deploy", and "Test"; we're building a deployment pipeline, post-build.)
>
> Job B gets some new data, in our case an IP address. This needs to be sent to C so that it can perform its tests on the right deployment.
>
> One way to do this is using the Parameterized Trigger Plugin, defining A to invoke B, and B to invoke C. However, we want to configure it conceptually with a "master" job invoking each downstream job, so that e.g. one could invoke B by itself, without it requiring running C after it runs (e.g., for debugging purposes).
>
> We currently have a workaround where A creates a unique network folder, passing that as a parameter to B and C; when B completes, it writes the IP address to a file at the network location, and then when C starts, it reads that file from there.
>
> However, I'd prefer to not depend on an external network resource if I can help it. Is there a method to have B return data to A which can then be passed to C as normal Jenkins parameters?
>
> Thanks,
> Ken
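For the record, a Build Flow sketch of what the "master" job A could look like. This is an assumption about how you'd wire it up, not tested against your setup; the job names come from the thread, the `DEPLOY_IP` parameter name is made up, and the exact accessor for a downstream build's exported variables can differ between plugin versions.

```groovy
// Build Flow DSL for the orchestrating job A.
// Run B ("Deploy") and wait for it to finish.
def deploy = build("Deploy")

// Read the value B exported (accessor may vary by Build Flow plugin
// version; some versions expose it via deploy.build.properties instead).
def ip = deploy.environment.get("DEPLOY_IP")

// Trigger C ("Test") with the IP as a normal Jenkins parameter.
build("Test", DEPLOY_IP: ip)
```

Because B and C are only chained inside this flow, either one can still be run on its own for debugging, which was the constraint in the original question.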
