I have a pretty complex setup right now and I think pipelining might 
simplify it, but I'm not sure.

It's a combination of embedded C and C# where layers get bundled together 
and culminate in a Windows program that contains data for the embedded devices.

What happens is that all of the embedded builds must be built against the 
latest revision number; the binaries from all of those get assembled into 
multiple tarballs (or tarball-like archives), and then those tarballs get 
assembled into one big tarball that's fed into the C# build.

Ideally, if an embedded project has already been built with the current 
revision number, then another build isn't required - it's superfluous.
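For illustration, the skip check I have in mind is roughly this (a minimal 
sketch; the stamp file name and the example revision value are made up, and 
in practice REV would come from the VCS):

```shell
#!/bin/sh
# Skip an embedded build if a per-project stamp file already records
# the current revision number. Names here are placeholders.
REV="1234"                      # current revision (would come from the VCS)
STAMP="last_built_rev.txt"      # assumed per-project stamp file

if [ -f "$STAMP" ] && [ "$(cat "$STAMP")" = "$REV" ]; then
  echo "up-to-date"             # already built at this revision; do nothing
else
  echo "building"
  # ... run the actual embedded build here ...
  printf '%s' "$REV" > "$STAMP" # record the revision we just built
fi
```

Run from an empty workspace it builds and writes the stamp; run again at 
the same revision it skips.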

All of this is on one machine with a basic two-node setup, but the embedded 
builds cannot run simultaneously at all.
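To make the shape of what I'm after concrete, here is a rough sketch of the 
flow as a Declarative Pipeline - purely hypothetical, with placeholder job 
and script names, and using sequential nested stages so the embedded builds 
never overlap:

```groovy
// Sketch only: "embeddedA"/"embeddedB", the shell script, and
// "windows-app" are all placeholder names.
pipeline {
    agent any
    stages {
        stage('Embedded builds') {
            // Nested sequential stages keep the embedded builds from
            // running at the same time (a 'lock' on a shared resource
            // would be another way to serialize them).
            stages {
                stage('embeddedA') { steps { build job: 'embeddedA' } }
                stage('embeddedB') { steps { build job: 'embeddedB' } }
            }
        }
        stage('Assemble tarballs') {
            steps {
                sh './assemble-tarballs.sh'   // placeholder script
            }
        }
        stage('C# build') {
            steps {
                build job: 'windows-app'      // placeholder downstream job
            }
        }
    }
}
```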

So is pipelining what I want, or is it just a bunch of projects that oversee 
everything, or is it done some other way?

-- 
You received this message because you are subscribed to the Google Groups 
"Jenkins Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/jenkinsci-users/58504ef9-ac66-4de4-b871-a10245ff9272%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
