This does sound broadly like something that GoCD is designed to handle -
ensuring consistent and reproducible artifact and/or material inputs. Using
the server to mediate (store and fetch) artifacts between stages or
pipelines is also intended usage.

To confirm: "local build directory" in your description is inside the
normal agent working directory that GoCD creates inside pipelines/, rather
than somewhere else on the agent file system?

1) do the DLLs get put/copied/fetched into a location that is *inside* a
Git material repo clone? e.g. <working-dir>/test-repo where "test-repo" is a
Git material with an alternate checkout location, or <working-dir> itself if
the Git material is cloned directly there
2) if NOT, and they are inside the agent working area but OUTSIDE the
clone, does the pipeline that packages the DLLs clean its workspace from
previous runs every time it executes, i.e. have you enabled this for the
stage?

[image: screenshot of the stage settings "Clean Working Directory" option]
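
For reference, if you use the YAML config plugin, the two things I'm asking
about look roughly like this (a sketch with made-up names; the UI setting
in the screenshot corresponds to clean_workspace):

  pipelines:
    project1:
      materials:
        test-repo:                   # hypothetical material name
          git: https://example.com/project1.git
          destination: test-repo     # alternate checkout location inside <working-dir>
      stages:
        - package:
            clean_workspace: true    # wipe the stage's working dir before every run
            jobs: ...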

If "no" to both questions - I possibly know a possibly root cause, as I've
seen it myself. :-/

-Chad

On Tue, Dec 17, 2024 at 1:08 PM Josh <jos...@pracplay.com> wrote:

>
> we've used gocd for many years and it's a great product.
>
> been having an occasional issue that is increasing as we increase
> deployment frequency.
> sometimes pipelines briefly get "stuck" on old dlls, meaning that
> sometimes a downstream pipeline will fail to run because it's been packaged
> with older dlls. it's only between upstream and downstream pipelines, never
> within the same pipeline. this occurs infrequently, but perhaps as much as
> 1 out of every 5 builds.
>
> the workaround is to run a script on all the agents that periodically
> refreshes and rebuilds all the pipelines manually.  not sure why this
> works, but it always does.
>
> haven't been able to figure out the cause; i'm wondering if it's a
> misunderstanding about artifacts, or otherwise misconfigured artifacts?
>
> here's the situation:
>
>    - we have a pipeline template that 8 or 9 pipelines use
>    - the template (and thus every pipeline) has 4 stages: prep, build,
>    test and package
>    - prep stage: doesn't do much, mostly just analysis
>    - build stage: pulls code from the repo, builds it, and uploads all the
>    built binaries as artifacts to gocd at #{project-name}/build
>    - test stage: fetches those artifacts stored in #{project-name}/build,
>    puts them in the local build directory, and then runs tests; saves a
>    test artifact (not used in build/packaging)
>    - package stage: fetches the artifacts stored in #{project-name}/build,
>    puts them in the local build directory, and packages them up (config
>    sketched below)
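>
> in yaml-config terms the relevant bits of our template look roughly like
> this (a simplified sketch; real names and commands differ):
>
>   stages:
>     - build:
>         jobs:
>           build:
>             tasks:
>               - exec:
>                   command: ./build.sh           # stand-in for our real build
>             artifacts:
>               - build:
>                   source: bin/                  # all built binaries
>                   destination: "#{project-name}/build"
>     - package:
>         jobs:
>           package:
>             tasks:
>               - fetch:                          # same pipeline, earlier stage
>                   stage: build
>                   job: build
>                   source: "#{project-name}/build"
>                   destination: local-build      # the "local build directory"
>               - exec:
>                   command: ./package.sh         # stand-in for our real packaging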
>
> as i say, most of the time it works great, but occasionally binaries from
> a previously built upstream pipeline (older version) get mixed into a newer
> pipeline build, and while it compiles, running something with the
> mismatched versions generates a runtime exception.
>
> as i'm describing this, i believe the cause might be that since we have
> multiple agents, a given agent might not always be scheduled to build every
> pipeline stage.
>
> so e.g. if project2 is downstream of project1:
>
> agentA builds project1.verX
> agentB builds project2.verX
>
> [project 2 changes]
>
> agentA builds project2.verY
> agentA still has project1.verX binaries locally, so these get built
> against project2.verY
>
> then when the binaries get packaged up, you get the version mismatch.
>
> it seems like what maybe should happen is that pipelines should also fetch
> artifacts from all their upstream pipeline dependencies (vs just fetching
> from their own upstream stages, as i described above).
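>
> for a single known upstream that's presumably just a fetch task naming the
> upstream pipeline (sketch; the upstream also has to be a dependency
> material of the fetching pipeline):
>
>   - fetch:
>       pipeline: project1            # the upstream pipeline
>       stage: build
>       job: build
>       source: "project1/build"      # i.e. the upstream's #{project-name}/build
>       destination: local-build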
>
> however I'm not certain how to do this with pipeline templates, since we
> could have multiple upstream pipelines to fetch from?
>
> so i wanted to add an arbitrary # of 'fetch artifact' tasks to a build
> stage's pipeline, and then pass all its upstream pipelines as parameters...
> how can i make the pipeline properly fetch all of:
>
>    - zero upstream pipelines
>    - one upstream pipeline
>    - multiple upstream pipelines
>
> ?
>
> Hopefully this makes sense.
>
> My Idea:
>
>    - Is there a way i can somehow create an 'upstream-pipeline-list'
>    parameter, have each pipeline list its upstreams in CSV fashion, and then
>    have gocd fetch EACH of these upstream pipeline builds prior to actually
>    building the stage?
>
> To me, putting #{upstream-pipeline-list} in a single 'fetch artifact' task
> doesn't seem right, since the task seems to take only one source location,
> not multiple.
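>
> the only way i can picture it working is a script-style task that loops
> over the list and pulls each upstream's artifacts through the gocd files
> api. an untested sketch (assumes each upstream is also a dependency
> material named after the pipeline, and that API_USER/API_TOKEN are made
> available to the job somehow):
>
>   - exec:
>       command: /bin/bash
>       arguments:
>         - -c
>         - |
>           # hypothetical param: comma-separated upstream pipeline names
>           IFS=',' read -ra UPSTREAMS <<< "#{upstream-pipeline-list}"
>           for up in "${UPSTREAMS[@]}"; do
>             [ -z "$up" ] && continue   # covers the zero-upstream case
>             # gocd sets GO_DEPENDENCY_LOCATOR_<MATERIAL> to pipeline/counter/stage/counter
>             var="GO_DEPENDENCY_LOCATOR_$(echo "$up" | tr 'a-z' 'A-Z' | tr '-' '_')"
>             # appending .zip to a directory path downloads it zipped
>             curl -sSf -u "$API_USER:$API_TOKEN" \
>               "$GO_SERVER_URL/files/${!var}/build/$up/build.zip" \
>               -o "local-build/$up.zip"
>           done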
>
> But I misunderstood this before regarding resources, so I figured it was
> worth asking.
>
> Or maybe there's some other even more obvious thing I'm missing, outside
> of a monorepo (we can't use a monorepo here, at least not presently). What
> is the 'GOCD WAY' to handle this properly?
>
> appreciate any assistance
>
> -j
>
>
