I believe that this is a bug.  What do I need to do to get either comments 
or action on this?

On Thursday, August 1, 2019 at 5:05:02 PM UTC-4, Louis Elston wrote:
>
> Studying and playing with pipelines.  I see that you can use Declarative 
> in the Pipeline Script window, but it still stores it in the config.xml 
> file.  I have also played with combining Declarative and non-Declarative 
> (Scripted) code in the same script.
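>
> For reference, the kind of mix I mean looks roughly like this (the stage 
> name, build command, and file name are just placeholders):
>
>     pipeline {
>         agent any
>         stages {
>             stage('Build') {
>                 steps {
>                     // ordinary Declarative step
>                     bat 'build.cmd'   // placeholder command
>                     // non-Declarative (Scripted) code embedded via script {}
>                     script {
>                         def ver = readFile('version.txt').trim()
>                         echo "Building version ${ver}"
>                     }
>                 }
>             }
>         }
>     }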
>
> I am trying to understand the Blue Ocean interface, the word "MultiBranch" 
> is throwing me a little.  We do not create test branches and then merge 
> them back into the master.  In the repository, we have branches for each 
> release of the product, and we rarely go back to previous 
> branches\versions.  So, if I am working on branchV9 right now, do I also 
> need a Jenkinsfile in the Master branch, or in any of the previous 
> version branches?
>
> I have been playing with Blue Ocean (which only does MultiBranch 
> pipelines).  I am on a Windows system, Jenkins 2.176.2, and have all the 
> latest Blue Ocean plugins as of today (1.18.0).  I am accessing a local Git 
> repository (not GitHub), and am running into the following...
>
> If I try to use “c:\GitRepos\Pipelines1\.git”, I get "not a valid 
> name"...
>
> [screenshots attached: 1.PNG, 2.PNG, 3.PNG, 4.PNG]
>
> Why is this happening?
>
> On Monday, July 29, 2019 at 11:40:56 AM UTC-4, Louis Elston wrote:
>>
>> 07/17/19 – wrote this…
>>
>> We are currently using Windows \ Jenkins 2.107.1 (no pipeline), and I am 
>> researching going to pipeline. We have a nightly build job that fetches 
>> from repositories, and submits to and waits on other jobs. I see 9 jobs 
>> running at the same time on the same Master node (we only have a master). 
>> I am not clear on if we should have one Jenkinsfile or multiple 
>> Jenkinsfiles. It will not be a multibranch pipeline, as we do not create 
>> test branches and then merge back to a master. In the repository we have 
>> product1.0 branch, product2.0 branch etc., and build only one branch (the 
>> latest one). While I do like the Blue Ocean editor, it is only for 
>> MultiBranch pipelines.
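>>
>> As a rough sketch (the job names below are made up), a single Jenkinsfile 
>> could do that orchestration with the built-in build step, which submits an 
>> existing job and waits for it to finish:
>>
>>     pipeline {
>>         agent any
>>         stages {
>>             stage('Fetch') {
>>                 steps {
>>                     // get the product sources for the nightly build
>>                     checkout scm
>>                 }
>>             }
>>             stage('Downstream jobs') {
>>                 steps {
>>                     // 'Component-Build' and 'Installer-Build' are hypothetical job names
>>                     build job: 'Component-Build', wait: true
>>                     build job: 'Installer-Build', wait: true
>>                 }
>>             }
>>         }
>>     }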
>>
>> Looking for directions and\or examples on how to convert existing Jenkins 
>> non-pipeline systems, to pipeline.  I did find this…
>> https://wiki.jenkins.io/display/JENKINS/Convert+To+Pipeline+Plugin. It 
>> does help a little in that it gives you some converted steps, but cannot 
>> convert all the steps, and will give comments in the pipeline script 
>> "//Unable to convert a build step referring to...please verify and convert 
>> manually if required." There is an option "Recursively convert downstream 
>> jobs if any" and if you select that, it appears to add all the downstream 
>> jobs to the same pipeline script, and really confuses the job parameters. 
>> There is also an option to "Commit JenkinsFile" (if doing declarative). I 
>> will play with this some more, but it is not the be-all and end-all of 
>> converting to pipeline, and I am still not sure whether I should have 
>> one or more scripts.
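>>
>> For the steps it cannot convert, the manual equivalent often looks simple 
>> enough; for example, a classic "Execute Windows batch command" build step 
>> maps to a bat step (the command below is just an illustration):
>>
>>     // classic freestyle "Execute Windows batch command" build step,
>>     // rewritten as a pipeline step:
>>     bat 'msbuild Product.sln /p:Configuration=Release'   // hypothetical command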
>>
>> Added 07/26/19 - Let’s see if I have my research to date correct…
>>
>> A Declarative pipeline (Pipeline script from SCM) is stored in a 
>> Jenkinsfile in the repository. Every time this Jenkins job is 
>> executed, a fetch from the repository is done (to get the latest version of 
>> the Jenkinsfile).
>>
>> A Pipeline script is stored as part of the config.xml file in the 
>> Jenkins\Jobs folder (it is not stored in the repository, or in a separate 
>> Jenkinsfile in the jobs folder). There is a fetch from the repository only 
>> if you put one in yourself (you do not need a repository fetch just to get 
>> the Pipeline script).
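>>
>> In other words, with a Pipeline script typed into the job, any checkout is 
>> explicit; a minimal Scripted example (the local URL, branch name, and build 
>> command are assumptions based on our setup) might look like:
>>
>>     node {
>>         stage('Checkout') {
>>             // only fetched because this step is written in explicitly
>>             git url: 'file:///c:/GitRepos/Pipelines1', branch: 'branchV9'
>>         }
>>         stage('Build') {
>>             bat 'build.cmd'   // placeholder for the real build
>>         }
>>     }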
>>
>> Besides our nightly product build, we also have other jobs. I could 
>> create a separate Declarative Jenkinsfile for each of them (JenkinsfileA, 
>> JenkinsfileB, etc.) and store them in the repository as well (in the same 
>> branch as the main Jenkinsfile), but that would mean that every one of 
>> those additional jobs, just to get its particular Jenkinsfile, would also 
>> need to do a repository fetch (basically fetching\cloning the repository 
>> branch for each job, and having multiple copies of the repository branch 
>> unnecessarily downloaded to the workspace of each job).
>>
>> That does not make sense to me (unless my understanding of things to date 
>> is incorrect). Because the main product build does require a fetch every 
>> time it is run (to get any possible developer check-ins), I do not see a 
>> problem using a Declarative Jenkinsfile for that job. For the other jobs 
>> (if we do not leave them, for the time being, in the classic (non-pipeline) 
>> format), they will be Pipeline scripts.
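>>
>> (One thing I have seen that might lessen the unnecessary cloning, though I 
>> have not tried it: Declarative has an options { skipDefaultCheckout() } 
>> directive that stops the automatic clone of the branch into the job's 
>> workspace, so a job whose work does not need the sources would not have to 
>> download them. Sketch only, the stage contents are placeholders.)
>>
>>     pipeline {
>>         agent any
>>         options {
>>             // do not automatically clone the whole branch into this job's workspace
>>             skipDefaultCheckout()
>>         }
>>         stages {
>>             stage('Run') {
>>                 steps {
>>                     bat 'run_task.cmd'   // placeholder for the job's real work
>>                 }
>>             }
>>         }
>>     }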
>>
>> Is there any way (or are there plans) to do a Declarative pipeline 
>> without having to store it in the repository and do a fetch every time 
>> (lessening the need to become a Groovy developer)? The Blue Ocean script 
>> editor appears to be an easier tool to use to create pipeline scripts, but 
>> it is only for MultiBranch pipelines (which we don’t do).
>>
>> Serialization (restarting a job): is that only for when a node goes down, 
>> or can you restart a pipeline job (Declarative or Scripted) from any point 
>> if it fails?
>>
>> I see that there are places to look to see which Jenkins plugins have 
>> been ported to pipeline, but is there anything that can be run against 
>> the classic jobs that you have, to determine up front which jobs are going 
>> to have problems being converted to pipeline (unsupported plugins)?
>>
>
