That is the case, yes. And that validation is expecting to find pipeline {
... } at the top-level of the Jenkinsfile - pretty much everything else
depends on that.

A.

On Thu, Apr 13, 2017 at 1:28 PM, Kenneth Brooks <[email protected]>
wrote:

> Got a chance to play around with some of the options for DRYing out my
> pipelines.
>
> We are already using shared libraries (#1 below) very extensively but that
> is for truly enterprise wide shared functionality.
> I think having teams write those just for steps isn't ideal. They don't
> want the extra overhead of yet another place to maintain code.
>
> I took a stab at #2 below.
>
> I was able to define the steps as a closure and then pull them into the
> pipeline.
> Jenkinsfile:
> /* Step Implementations */
>
> buildSteps = {
>     sh 'mvn clean compile'
> }
>
> /* Stage Implementations */
>
> featureBuildStage = {
>     agent { label "java-1.8.0_45 && apache-maven-3.2.5 && node4" }
>     steps {
>         sh 'mvn clean compile'
>     }
> }
>
> /* Load branch specific Declarative Pipeline */
>
> if (env.BRANCH_NAME.startsWith("develop")) {
>   evaluate(readTrusted('develop-pipeline.groovy'))
> } else if (env.BRANCH_NAME.startsWith("master")) {
>   evaluate(readTrusted('master-pipeline.groovy'))
> } else if (env.BRANCH_NAME.startsWith("PR-")) {
>   evaluate(readTrusted('pull-request-pipeline.groovy'))
> } else {
>   evaluate(readTrusted('feature-pipeline.groovy'))
> }
>
> feature-pipeline.groovy:
> pipeline {
>     stages {
>
>         stage('Feature Build') {
>             agent { label "java-1.8.0_45 && apache-maven-3.2.5 && node4" }
>             steps buildSteps  // This works, pulling in the buildSteps
> closure defined in Jenkinsfile
>         }
>
>         stage('Feature Build') featureBuildStage //This doesn't work
>     }
> }
>
> Using a step closure works fine.
>
> Trying to use the Stage closure doesn't.
> This is the one I really think would be useful. Then teams could re-use
> the same stage across multiple pipelines.
>
> I get the following stack trace:
>
> java.lang.ArrayIndexOutOfBoundsException: 1
>       at 
> org.codehaus.groovy.runtime.dgmimpl.arrays.ObjectArrayGetAtMetaMethod.invoke(ObjectArrayGetAtMetaMethod.java:41)
>       at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
>       at 
> org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.invoke(PojoMetaMethodSite.java:51)
>       at 
> org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:56)
>       at 
> org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
>       at 
> org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
>       at 
> org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
>       at 
> com.cloudbees.groovy.cps.sandbox.DefaultInvoker.getArray(DefaultInvoker.java:51)
>       at 
> com.cloudbees.groovy.cps.impl.ArrayAccessBlock.rawGet(ArrayAccessBlock.java:21)
>       at 
> org.jenkinsci.plugins.pipeline.modeldefinition.ClosureModelTranslator.methodMissing(jar:file:/var/jenkins_home/plugins/pipeline-model-definition/WEB-INF/lib/pipeline-model-definition.jar!/org/jenkinsci/plugins/pipeline/modeldefinition/ClosureModelTranslator.groovy:130)
>       at Script1.run(Script1.groovy:16)
>       at 
> org.jenkinsci.plugins.pipeline.modeldefinition.ClosureModelTranslator.resolveClosure(jar:file:/var/jenkins_home/plugins/pipeline-model-definition/WEB-INF/lib/pipeline-model-definition.jar!/org/jenkinsci/plugins/pipeline/modeldefinition/ClosureModelTranslator.groovy:216)
>       at 
> org.jenkinsci.plugins.pipeline.modeldefinition.ClosureModelTranslator.methodMissing(jar:file:/var/jenkins_home/plugins/pipeline-model-definition/WEB-INF/lib/pipeline-model-definition.jar!/org/jenkinsci/plugins/pipeline/modeldefinition/ClosureModelTranslator.groovy:168)
>       at Script1.run(Script1.groovy:15)
>       at 
> org.jenkinsci.plugins.pipeline.modeldefinition.ModelInterpreter.call(jar:file:/var/jenkins_home/plugins/pipeline-model-definition/WEB-INF/lib/pipeline-model-definition.jar!/org/jenkinsci/plugins/pipeline/modeldefinition/ModelInterpreter.groovy:59)
>
>
> I think this is because you are attempting to do some pre-loading/validation
> inside ModelInterpreter on the direct source of whatever is inside pipeline
> (inside feature-pipeline.groovy in this case).
>
> Thoughts?
>
> -K
>
>
>
> On Wednesday, April 12, 2017 at 4:53:23 PM UTC-4, Patrick Wolf wrote:
>
>> Feel free to open a JIRA ticket, but I'm not a huge fan of this because it
>> is counter to the KISS principle we wanted with Declarative, and it breaks
>> the Blue Ocean editor. We have discussed having multiple "stages" blocks but
>> rejected that because it quickly becomes needlessly complex without adding
>> any use-case coverage. IMO, having multiple "stages" makes much more sense
>> than having multiple "pipelines"; otherwise you would have to recreate all
>> of the agent, environment, libraries, options, parameters, etc. for each
>> pipeline, which leads to wanting those sections to be DRY as well, and
>> Declarative pretty much falls apart completely.
>>
>> BTW, it is already possible to have multiple 'pipeline' closures in a
>> single Jenkinsfile, but they will be treated as parts of a whole Pipeline,
>> and this cannot be used in the editor. Because the Jenkinsfile is treated
>> as one continuous Pipeline, anything outside of the 'pipeline' closures is
>> interpreted as Scripted Pipeline. This means you can use 'if' blocks around
>> the separate 'pipeline' blocks instead of using 'load' if you choose, but
>> keeping them in separate files makes maintenance easier, I think.
>>
>> if (BRANCH_NAME.startsWith("develop")) {
>>     pipeline { .... }
>> }
>>
>>
>> Also, it's worth noting that 'readTrusted' probably works better than
>> 'load' because it takes the committer into account and it doesn't require
>> a workspace:
>>
>> https://jenkins.io/doc/pipeline/steps/workflow-multibranch/#code-readtrusted-code-read-trusted-file-from-scm
>>
>> As for DRY stages, there are several ways to accomplish this with Pipeline.
>>
>> 1. Shared Library and Resources - This is the preferred method of
>> creating DRY routines.
>>
>> You create a global variable that has all of the steps you want (with
>> appropriate variable replacement for environment variables). You could have
>> a build.groovy global variable in the /vars directory that does all of your
>> build steps. Then the steps in your stage can be a single line.
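>>
>> A minimal sketch of what such a global variable could look like (the file
>> name build.groovy and the Maven goals here are illustrative, not something
>> defined in this thread):

```groovy
// vars/build.groovy in the shared library.
// Defining call() lets pipelines invoke this as a plain `build()` step.
def call() {
    // The common build logic lives in one place; every pipeline reuses it.
    sh 'mvn clean compile'
}
```

>> With that in place, a stage's steps block shrinks to: steps { build() }.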
>>
>> Alternatively, you can store shell scripts in the /resources of your
>> shared library and run those in your steps without having to duplicate
>> anything:
>>
>> https://gist.github.com/HRMPW/92231e7b2344f20d9cc9d5f2eb778a54
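>>
>> A hedged sketch of the /resources pattern (the path scripts/build.sh is
>> illustrative), using the standard libraryResource step from the shared
>> library plugin:

```groovy
// Load resources/scripts/build.sh from the shared library as text,
// write it into the workspace, and run it without duplicating it anywhere.
def script = libraryResource 'scripts/build.sh'
writeFile file: 'build.sh', text: script
sh 'bash build.sh'
```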
>>
>> 2. You can define your steps directly in the Jenkinsfile at the top level,
>> either as strings or methods, and simply call that method from within each
>> pipeline.
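>>
>> A sketch of the method flavor (the helper name mavenBuild is illustrative):
>> define the method at the top of the Jenkinsfile and call it from a steps
>> block.

```groovy
// Jenkinsfile: a top-level helper method reused by any stage in this file.
def mavenBuild() {
    sh 'mvn clean compile'
}

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Calls the shared helper instead of repeating the shell step.
                mavenBuild()
            }
        }
    }
}
```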
>>
>> 3. You can define your steps in a configuration file, as properties or
>> YAML, and load those files using the Pipeline Utility Steps plugin.
>> https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Utility+Steps+Plugin
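>>
>> For instance, a sketch with readYaml from that plugin (the file name and
>> key are illustrative, not from this thread):

```groovy
// Suppose pipeline-config.yml in the repository contains:
//   buildCommand: mvn clean compile
def config = readYaml file: 'pipeline-config.yml'
// Run whatever command the configuration file specifies.
sh config.buildCommand
```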
>>
>> To sum up, I think having different stages is worth discussing (it is not
>> going to be implemented in the short term) but there are already many
>> existing ways to make Pipelines DRY.
>>
>> On Tuesday, April 11, 2017 at 8:43:49 AM UTC-7, Kenneth Brooks wrote:
>>>
>>> TL;DR up front:
>>> *As a user, I want a pipeline that performs specific pipeline stages
>>> based on the branch. Recommendation: put the when{} condition outside
>>> the pipeline{} block.*
>>> *As a user, I want to declare my stages but have the implementation be
>>> separate, so that I can reuse them in multiple pipelines.*
>>>
>>> Currently the Declarative syntax can perform a stage conditionally
>>> using 'when', but not a whole pipeline.
>>> This makes the pipeline fairly inflexible and much harder to read
>>> through.
>>>
>>> Take for example:
>>>
>>> pipeline {
>>>
>>>    stages {
>>>      stage('Build') {
>>> when { branch "develop || master || feature"} // not the real syntax, I know
>>>        steps { /* do some build stuff */ }
>>>      }
>>>
>>>      stage('Scan') {
>>>        when { branch "master"}
>>>        steps { /* run static code analysis or other code scanning */}
>>>      }
>>>
>>>      stage('Pull Request Build') {
>>>        when { branch "PR-*"}
>>>        steps { /* do a merge build stuff */ }
>>>      }
>>>
>>>      stage('Dev Deploy') {
>>>        when { branch "develop || master"}
>>>        steps { /* deploy to dev */ }
>>>      }
>>>
>>>      stage('Pull Request Deploy') {
>>>        when { branch "PR-*"}
>>>        steps { /* deploy to special PR sandbox */}
>>>      }
>>>   }
>>> }
>>>
>>>
>>> In this simple example, the following will happen, but it is extremely hard 
>>> to follow.
>>>
>>> Feature -> Build
>>> Master -> Build, Scan, Dev Deploy
>>> Develop -> Build, Dev Deploy
>>> Pull Request -> Pull Request Build, Pull Request Deploy
>>>
>>> I would suggest we allow the when to be placed at the pipeline level
>>> somehow:
>>>
>>> pipeline('master') { // Just for naming
>>>   when { branch "master" }
>>>   stages {
>>>     stage('Build'){
>>>       steps { /* do some build stuff */ }
>>>     }
>>>     stage('Scan'){
>>>       steps { /* run static code analysis or other code scanning */}
>>>     }
>>>     stage('Dev Deploy'){
>>>       steps { /* deploy to dev */ }
>>>     }
>>>   }
>>> }
>>>
>>> pipeline('develop') { // Just for naming
>>>   when { branch "develop" }
>>>   stages {
>>>     stage('Build'){
>>>       steps { /* do some build stuff */ }
>>>     }
>>>     stage('Dev Deploy'){
>>>       steps { /* deploy to dev */ }
>>>     }
>>>   }
>>> }
>>>
>>> pipeline('pull request') { // Just for naming
>>>   when { branch "PR-*" }
>>>   stages {
>>>     stage('Pull Request Build') {
>>>       steps { /* do a merge build stuff */ }
>>>     }
>>>     stage('Pull Request Deploy') {
>>>       steps { /* deploy to special PR sandbox */}
>>>     }
>>>   }
>>> }
>>>
>>> pipeline('feature') { // Just for naming
>>>   when { branch != "master || PR-* || develop" } // just do a build for any 
>>> 'other' branches, which would then include developer feature branches
>>>   stages {
>>>     stage('Build') {
>>>       steps { /* do some build stuff */ }
>>>     }
>>>   }
>>> }
>>>
>>>
>>> That, to me, is much cleaner. It is very easy to see exactly what each 
>>> pipeline is doing.
>>> This brings one downside: the stage is repeated.
>>> stage('Build') and stage('Dev Deploy') are the same implementation, but I
>>> have to write them twice.
>>> I could create a global library, but that has two other downsides: it is
>>> no longer declarative syntax in the global library, and the library is
>>> loaded externally, so I have to go to a whole other file to see the
>>> implementation.
>>>
>>> To keep things DRY, I would also like to see the stages treated as a
>>> definition and an implementation:
>>> define the stages external to the pipeline, but pull them into each
>>> pipeline.
>>>
>>> This could be done optionally (as you'll see on the Pull Request stages,
>>> which keep their implementation inline).
>>>
>>> Here is what I believe the combination of the two would look like:
>>>
>>>
>>> pipeline('master') { // Just for naming
>>>   when { branch "master" }
>>>   stages {
>>>     stage('Build')
>>>     stage('Scan')
>>>     stage('Dev Deploy')
>>>   }
>>> }
>>>
>>> pipeline('develop') { // Just for naming
>>>   when { branch "develop" }
>>>   stages {
>>>     stage('Build')
>>>     stage('Dev Deploy')
>>>   }
>>> }
>>>
>>> pipeline('pull request') { // Just for naming
>>>   when { branch "PR-*" }
>>>   stages {
>>>     stage('Pull Request Build') {
>>>       steps { /* do a merge build stuff */ }
>>>     }
>>>     stage('Pull Request Deploy') {
>>>       steps { /* deploy to special PR sandbox */}
>>>     }
>>>   }
>>> }
>>>
>>> pipeline('feature') { // Just for naming
>>>   when { branch != "master || PR-* || develop" } // just do a build for any 
>>> 'other' branches, which would then include developer feature branches
>>>   stages {
>>>     stage('Build')
>>>   }
>>> }
>>>
>>> /* Stage definitions below */
>>> stage('Build'){
>>>   steps { /* do some build stuff */ }
>>> }
>>>
>>> stage('Scan'){
>>>   steps { /* run static code analysis or other code scanning */}
>>> }
>>>
>>> stage('Dev Deploy'){
>>>   steps { /* deploy to dev */ }
>>> }
>>>
>>>
>>> Is there a way to do this with the current declarative syntax?
>>>
>>> If not, what is the best way to get this into the declarative syntax? Open 
>>> jira enhancement requests?
>>>
>>>
>>> What we've resorted to in the meantime (which still doesn't solve the DRY
>>> part) is to have a Jenkinsfile that does the if logic and then loads a
>>> specific pipeline (which has its own demons, because 'load' evaluates the
>>> file immediately and holds onto a heavyweight executor the whole time).
>>>
>>>
>>> if (env.BRANCH_NAME.startsWith("develop")) {
>>>     load 'develop-pipeline.groovy'
>>> } else if (env.BRANCH_NAME.startsWith("master")) {
>>>     load 'master-pipeline.groovy'
>>> } else if (env.BRANCH_NAME.startsWith("PR-")) {
>>>     load 'pull-request-pipeline.groovy'
>>> } else {
>>>     load 'feature-pipeline.groovy'
>>> }
>>>
>>>
>>> Thanks,
>>>
>>> Ken
>>>
>>>
>>> --
> You received this message because you are subscribed to the Google Groups
> "Jenkins Users" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To view this discussion on the web visit https://groups.google.com/d/
> msgid/jenkinsci-users/9c13526d-4c5a-482d-81b9-570bd20c6c1c%40googlegroups.
> com
> <https://groups.google.com/d/msgid/jenkinsci-users/9c13526d-4c5a-482d-81b9-570bd20c6c1c%40googlegroups.com?utm_medium=email&utm_source=footer>
> .
>
> For more options, visit https://groups.google.com/d/optout.
>

