> I am not a fan of pointing to a moving target with the "include"
> statement:
>
>     include:
>       - https://salsa.debian.org/salsa-ci-team/pipeline/raw/master/salsa-ci.yml
>       - https://salsa.debian.org/salsa-ci-team/pipeline/raw/master/pipeline-jobs.yml
>
> "master" will change, and that can break CI jobs where nothing in the
> local repo has changed.
It does have pros and cons.

The good: additional build/verification steps, or even automatic deployment, can be added by the Salsa team at some point without requiring changes to each repository.

The bad: as you mentioned, a moving target can cause inadvertent build failures and other issues that are out of the maintainers' hands.

The ugly: pulling in external scripts always carries a certain risk. They may go away at some point or cause potentially dangerous side effects.

However, I do think that a standardised CI pipeline is very useful. Consider that the buildd infrastructure also uses a standardised build process that packages cannot simply opt out of. If this process is replicated on Salsa, with an external script or not, people quickly get a glimpse of what would happen on buildd. Having to manually adapt the CI script every time something changes in the buildd process is a heavy burden and will easily lead to people "forgetting" to update their scripts, which rather defeats the purpose.

Also, consider that the Salsa CI pipeline is not an absolute source of truth, but a tool for developers and maintainers to quickly spot issues with their packages. If an autobuild fails, it's not the end of the world; it just means you have to go check what's going on.
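For what it's worth, the moving-target concern can be mitigated on the consumer side without giving up the shared pipeline: GitLab raw URLs accept any ref, so a repository can point its include at a tag or commit instead of master. A rough sketch of what that could look like (the ref name below is hypothetical; you would substitute a real tag or commit SHA from the salsa-ci-team/pipeline repository):

```yaml
# debian/salsa-ci.yml -- pin the shared pipeline to a fixed ref
# instead of tracking master. "some-tag" is a placeholder, not a
# real tag in the pipeline repository.
include:
  - https://salsa.debian.org/salsa-ci-team/pipeline/raw/some-tag/salsa-ci.yml
  - https://salsa.debian.org/salsa-ci-team/pipeline/raw/some-tag/pipeline-jobs.yml
```

The trade-off is obvious: you get reproducible CI runs, but improvements made by the Salsa team no longer arrive automatically and someone has to bump the ref deliberately.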