Re: [Wikitech-l] What does a build process look like for a MediaWiki extension repository?

2017-06-15 Thread Bryan Davis
On Thu, Jun 15, 2017 at 10:58 AM, Jon Robson  wrote:
>
> What's the best wiki page to get an overview of how deployment to the beta
> cluster/production works? I'd like to tinker with these and see if I can
> get one of those steps running npm jobs.

The beta cluster process is to update the repos and then run
https://wikitech.wikimedia.org/wiki/Wikimedia_binaries#scap_sync. The
composer packages are not managed by that process. Instead,
composer-managed assets used in the beta cluster and in production are
manually curated in the mediawiki/vendor.git repo via gerrit patches.
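
For concreteness, a hedged sketch of what one of those manual
vendor.git patches could look like from a contributor's side (the clone
URL matches the repo named above; the package name, version, and use of
the git-review tool are assumptions made for the sketch):

    git clone https://gerrit.wikimedia.org/r/mediawiki/vendor
    cd vendor
    # hypothetical package and version, for illustration only
    composer require example/library:1.2.3
    git add -A
    git commit -m "Add example/library 1.2.3"
    git review   # assumes git-review is set up for this gerrit host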

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Cloud Services  Boise, ID USA
irc: bd808    v:415.839.6885 x6855


Re: [Wikitech-l] What does a build process look like for a MediaWiki extension repository?

2017-06-15 Thread Jon Robson
> For PHP deps we've got composer dependency installation for extensions, so
> it seems like there's an opportunity to do other build steps in this
> stage...

Definitely. If we can hook into the existing composer build step, that
seems like it would make the most sense, e.g. via the post-update-cmd and
post-install-cmd events.
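
As a minimal sketch, hooking into those events could look like this in
an extension's composer.json (post-install-cmd and post-update-cmd are
documented composer events; the npm "build" script is a hypothetical
stand-in for whatever the extension's front-end build actually does):

    {
        "scripts": {
            "post-install-cmd": "npm install && npm run build",
            "post-update-cmd": "npm install && npm run build"
        }
    }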

What's the best wiki page to get an overview of how deployment to the beta
cluster/production works? I'd like to tinker with these and see if I can
get one of those steps running npm jobs.



On Thu, 15 Jun 2017 at 03:01 Joaquin Oltra Hernandez <
jhernan...@wikimedia.org> wrote:

> Thanks for the comprehensive responses.
>
> *I can certainly still see the possible benefit of having a full fledged
> build step for core, skins, and extensions. It is something that should
> be thought about a bit before diving right into an implementation though.
> One thing to consider is whether what would be best is a packaging step
> that leads to a tarball or similar artifact that can be dropped into a
> runtime environment for MediaWiki or if instead it would be better to
> have a unified post-deploy build step that operates across MediaWiki core
> and the entire collection of optional extensions and skins deployed to
> create a particular wiki.*
>
> Totally agree, it is something that needs careful consideration. Even if
> the choice is to have a per-extension packaging step that produces a
> deployable, it would be great to have shared conventions across repos to
> run it (something like a *scripts/build{.sh,.bat}* that internally
> performs the specific build steps of the project).
>
> If that exists, then we can build into core the build step that coordinates
> those sub-build steps where needed.
>
> *One of the awesome features of working on a PHP codebase is the quick
> cycle of making a change and seeing it live in your test environment.
> Today that is mostly a matter of saving an edit and hitting refresh in
> a browser. It would be sad to lose that, so the build system that is
> devised should also provide a path that allows a git clone to be a
> viable wiki.*
>
> That is indeed nice, but that is already the case for many extensions and
> repos that have build steps for front-end code (see grunt and makefiles
> in extensions); it is just that we run them ad hoc on developers'
> machines and use git's master as the deploy tarball.
>
> This means that the deploy tarball has built assets that depend on who
> built and committed something, and on whatever tools they had on their
> local system (node, npm, grunt and node_modules libraries), instead of
> on a reproducible environment like our CI machines.
>
> For whatever reason, the reality is that the front-end world has moved to
> Node-based tooling and build steps, so all the great tools that are well
> maintained are run in a build step (unless you run Node.js on your server
> and plug it in there). That's why many projects use grunt and tools from
> npm for linting, optimizing images, and other tasks.
>
> I think that coming up with a standard build process would allow us to do
> away with the ad hoc way of building things into the repository on master,
> and allow us to painlessly introduce some very interesting improvements to
> front-end tooling.
>
>
> On Thu, Jun 8, 2017 at 6:07 PM David Barratt  wrote:
>
> > Symfony is going to start recommending the use of `make` starting with
> > version 4, so it might be something worth exploring:
> > http://fabien.potencier.org/symfony4-best-practices.html#makefile
> >
> > (I have no opinion on the matter)
> >
> > On Wed, Jun 7, 2017 at 5:48 PM, Bryan Davis  wrote:
> >
> > > On Wed, Jun 7, 2017 at 2:29 PM, Brion Vibber  wrote:
> > > > On Wed, Jun 7, 2017 at 10:18 AM, Joaquin Oltra Hernandez <
> > > > jhernan...@wikimedia.org> wrote:
> > > >
> > > >> *Context*
> > > >>
> > > >> We'd like to have a build script/process for an extension so that I
> > > >> can perform certain commands to install dependencies and perform
> > > >> optimizations on the extension sources. For example, on front-end
> > > >> sources.
> > > >>
> > > >> Some examples could be:
> > > >>
> > > >>    - Installing libraries from bower or npm and bundling them into
> > > >>      the resources folder
> > > >>    - Applying post-processing steps to CSS with something like
> > > >>      PostCSS
> > > >>    - Optimizing images
> > > >>
> > > >> We are aware of other projects that have build processes for
> > > >> building deployables, but not extensions.
> > > >> Such projects have different ways of dealing with this. A common
> > > >> way is having a repository called <project>/deploy where you pull
> > > >> from <project> and run the build scripts, and that is the
> > > >> repository that gets deployed.
> > > >>
> > > >> *Current system*
> > > >>
> > > >> The current way we usually do this (if we do) is to run those build
> > > >> scripts/jobs on the developers' machines and commit them into the
> > > >> git repository on master.
> > > >>
> > > >> With this system, if you don't enforce anything in CI, then build
> > > >> processes may be skipped (human error).

Re: [Wikitech-l] What does a build process look like for a MediaWiki extension repository?

2017-06-15 Thread Joaquin Oltra Hernandez
Thanks for the comprehensive responses.

*I can certainly still see the possible benefit of having a full fledged
build step for core, skins, and extensions. It is something that should
be thought about a bit before diving right into an implementation though.
One thing to consider is whether what would be best is a packaging step
that leads to a tarball or similar artifact that can be dropped into a
runtime environment for MediaWiki or if instead it would be better to
have a unified post-deploy build step that operates across MediaWiki core
and the entire collection of optional extensions and skins deployed to
create a particular wiki.*

Totally agree, it is something that needs careful consideration. Even if
the choice is to have a per-extension packaging step that produces a
deployable, it would be great to have shared conventions across repos to
run it (something like a *scripts/build{.sh,.bat}* that internally performs
the specific build steps of the project.

If that exists, then we can build into core the build step that coordinates
those sub-build steps where needed.
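
As a hedged sketch of that convention (the file name, layout, and
commands here are illustrative, not an agreed standard):

    #!/usr/bin/env bash
    # scripts/build.sh -- hypothetical per-repo build entry point.
    # Every repo keeps the same interface; the steps inside differ.
    set -euo pipefail

    npm install      # fetch front-end build dependencies
    npm run build    # assumes a "build" script in package.json

The coordinating step in core could then be as simple as looping over
extensions/*/scripts/build.sh and running whichever ones exist.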

*One of the awesome features of working on a PHP codebase is the quick
cycle of making a change and seeing it live in your test environment.
Today that is mostly a matter of saving an edit and hitting refresh in
a browser. It would be sad to lose that, so the build system that is
devised should also provide a path that allows a git clone to be a
viable wiki.*

That is indeed nice, but that is already the case for many extensions and
repos that have build steps for front-end code (see grunt and makefiles
in extensions); it is just that we run them ad hoc on developers'
machines and use git's master as the deploy tarball.

This means that the deploy tarball has built assets that depend on who
built and committed something, and on whatever tools they had on their
local system (node, npm, grunt and node_modules libraries), instead of
on a reproducible environment like our CI machines.

For whatever reason, the reality is that the front-end world has moved to
Node-based tooling and build steps, so all the great tools that are well
maintained are run in a build step (unless you run Node.js on your server
and plug it in there). That's why many projects use grunt and tools from
npm for linting, optimizing images, and other tasks.
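
For illustration, that kind of npm-driven tooling often ends up looking
something like this in a package.json (script names, packages, and
versions are invented for the sketch, not taken from any extension):

    {
        "scripts": {
            "lint": "eslint resources/",
            "images": "imagemin src/images/* --out-dir=resources/images",
            "build": "npm run lint && npm run images"
        },
        "devDependencies": {
            "eslint": "^4.0.0",
            "imagemin-cli": "^3.0.0"
        }
    }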

I think that coming up with a standard build process would allow us to do
away with the ad hoc way of building things into the repository on master,
and allow us to painlessly introduce some very interesting improvements to
front-end tooling.


On Thu, Jun 8, 2017 at 6:07 PM David Barratt  wrote:

> Symfony is going to start recommending the use of `make` starting with
> version 4, so it might be something worth exploring:
> http://fabien.potencier.org/symfony4-best-practices.html#makefile
>
> (I have no opinion on the matter)
>
> On Wed, Jun 7, 2017 at 5:48 PM, Bryan Davis  wrote:
>
> > On Wed, Jun 7, 2017 at 2:29 PM, Brion Vibber  wrote:
> > > On Wed, Jun 7, 2017 at 10:18 AM, Joaquin Oltra Hernandez <
> > > jhernan...@wikimedia.org> wrote:
> > >
> > >> *Context*
> > >>
> > >> We'd like to have a build script/process for an extension so that I
> > >> can perform certain commands to install dependencies and perform
> > >> optimizations on the extension sources. For example, on front-end
> > >> sources.
> > >>
> > >> Some examples could be:
> > >>
> > >>    - Installing libraries from bower or npm and bundling them into
> > >>      the resources folder
> > >>    - Applying post-processing steps to CSS with something like
> > >>      PostCSS
> > >>    - Optimizing images
> > >>
> > >> We are aware of other projects that have build processes for
> > >> building deployables, but not extensions.
> > >> Such projects have different ways of dealing with this. A common
> > >> way is having a repository called <project>/deploy where you pull
> > >> from <project> and run the build scripts, and that is the
> > >> repository that gets deployed.
> > >>
> > >> *Current system*
> > >>
> > >> The current way we usually do this (if we do) is to run those build
> > >> scripts/jobs on the developers' machines and commit them into the
> > >> git repository on master.
> > >>
> > >> With this system, if you don't enforce anything in CI, then build
> > >> processes may be skipped (human error).
> > >>
> > >> If you enforce it (by running the process and comparing with what
> > >> has been committed in CI) then patches merged to master that touch
> > >> the same files will produce merge conflicts with existing open
> > >> patches, forcing a rebase+rebuild on open patches every time one is
> > >> merged on master.
> > >>
> > >> *Questions*
> > >>
> > >> Can we have a shared configuration/convention/system for having a
> > >> build step on mediawiki extensions?
> > >>
> > >>    - So that a build process is run
> > >>       - on CI jobs that require production assets like the selenium
> > >>         jobs
> > >>       - on the deployment job that deploys the extension to the
> > >>         beta cluster and to production

Re: [Wikitech-l] What does a build process look like for a MediaWiki extension repository?

2017-06-08 Thread David Barratt
Symfony is going to start recommending the use of `make` starting with
version 4, so it might be something worth exploring:
http://fabien.potencier.org/symfony4-best-practices.html#makefile

(I have no opinion on the matter)
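
For reference, the Symfony recommendation amounts to a thin Makefile
over the individual tools. A hypothetical equivalent for an extension
might look like this (targets and commands are invented for the sketch;
recipe lines must be indented with tabs):

    .PHONY: install build test

    install:
    	composer install
    	npm install

    build: install
    	npm run build    # assumes a "build" script in package.json

    test: install
    	npm test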

On Wed, Jun 7, 2017 at 5:48 PM, Bryan Davis  wrote:

> On Wed, Jun 7, 2017 at 2:29 PM, Brion Vibber  wrote:
> > On Wed, Jun 7, 2017 at 10:18 AM, Joaquin Oltra Hernandez <
> > jhernan...@wikimedia.org> wrote:
> >
> >> *Context*
> >>
> >> We'd like to have a build script/process for an extension so that I can
> >> perform certain commands to install dependencies and perform
> >> optimizations on the extension sources. For example, on front-end
> >> sources.
> >>
> >> Some examples could be:
> >>
> >>    - Installing libraries from bower or npm and bundling them into the
> >>      resources folder
> >>    - Applying post-processing steps to CSS with something like PostCSS
> >>    - Optimizing images
> >>
> >> We are aware of other projects that have build processes for building
> >> deployables, but not extensions.
> >> Such projects have different ways of dealing with this. A common way is
> >> having a repository called <project>/deploy where you pull from
> >> <project> and run the build scripts, and that is the repository that
> >> gets deployed.
> >>
> >> *Current system*
> >>
> >> The current way we usually do this (if we do) is to run those build
> >> scripts/jobs on the developers' machines and commit them into the git
> >> repository on master.
> >>
> >> With this system, if you don't enforce anything in CI, then build
> >> processes may be skipped (human error).
> >>
> >> If you enforce it (by running the process and comparing with what has
> >> been committed in CI) then patches merged to master that touch the same
> >> files will produce merge conflicts with existing open patches, forcing
> >> a rebase+rebuild on open patches every time one is merged on master.
> >>
> >> *Questions*
> >>
> >> Can we have a shared configuration/convention/system for having a build
> >> step on mediawiki extensions?
> >>
> >>    - So that a build process is run
> >>       - on CI jobs that require production assets like the selenium jobs
> >>       - on the deployment job that deploys the extension to the beta
> >>         cluster and to production
> >>
> >> What would it look like? Are any extensions doing a pre-deployment
> >> build step?
> >
> >
> > For JS dependencies, image optimizations etc the state of the art still
> > seems to be to have a local one-off script and commit the build artifacts
> > into the repo. (For instance TimedMediaHandler fetches some JS libs via
> > npm and copies/patches them into the resources/ dir.)
> >
> > For PHP deps we've got composer dependency installation for extensions,
> > so it seems like there's an opportunity to do other build steps in this
> > stage...
> >
> > Not sure offhand if that can be snuck into composer directly or if we'd
> > need to replace the "run composer" step with "run this script, which runs
> > composer and also does other build steps".
>
> When I first joined the Foundation and started working with MediaWiki
> on a daily basis I wondered about the lack of a build process. At past
> jobs I had built PHP application environments that had a "run from
> version control" mode for local development, but always included a
> build step for packaging and deployment that did the sort of things
> that Joaquin is talking about. When I was in the Java world Ant and
> then later Maven2 were the tools of choice for this work. Later in a
> PHP shop I selected Phing as the build tool and even committed some
> enhancements upstream to make it work nicer with the type of projects
> I was managing.
>
> I helped get Composer use into MediaWiki core and that added a
> post-deploy build step for MediaWiki, but one that is pretty limited in
> what it can do easily. Composer is mostly a tool for installing PHP
> library dependencies. Most of the attempts I have seen to make it do
> things beyond that are clunky uses of the tool. I can certainly still
> see the possible benefit of having a full fledged build step for core,
> skins, and extensions. It is something that should be thought about a
> bit before diving right into an implementation though. One thing to
> consider is whether what would be best is a packaging step that leads to a
> tarball or similar artifact that can be dropped into a runtime
> environment for MediaWiki or if instead it would be better to have a
> unified post-deploy build step that operates across MediaWiki core and
> the entire collection of optional extensions and skins deployed to
> create a particular wiki.
>
> The Foundation's production deployment use case will always be an
> anomaly. It should be considered, but really in my opinion only to
> ensure that nothing absolutely requires external network access in the
> final build. For Composer this turned out to be as easy as maintaining
> a submodule with all the vendored libraries included.

Re: [Wikitech-l] What does a build process look like for a MediaWiki extension repository?

2017-06-07 Thread Bryan Davis
On Wed, Jun 7, 2017 at 2:29 PM, Brion Vibber  wrote:
> On Wed, Jun 7, 2017 at 10:18 AM, Joaquin Oltra Hernandez <
> jhernan...@wikimedia.org> wrote:
>
>> *Context*
>>
>> We'd like to have a build script/process for an extension so that I can
>> perform certain commands to install dependencies and perform optimizations
>> on the extension sources. For example, on front-end sources.
>>
>> Some examples could be:
>>
>>    - Installing libraries from bower or npm and bundling them into the
>>      resources folder
>>    - Applying post-processing steps to CSS with something like PostCSS
>>    - Optimizing images
>>
>> We are aware of other projects that have build processes for building
>> deployables, but not extensions.
>> Such projects have different ways of dealing with this. A common way is
>> having a repository called <project>/deploy where you pull from
>> <project> and run the build scripts, and that is the repository that
>> gets deployed.
>>
>> *Current system*
>>
>> The current way we usually do this (if we do) is to run those build
>> scripts/jobs on the developers' machines and commit them into the git
>> repository on master.
>>
>> With this system, if you don't enforce anything in CI, then build processes
>> may be skipped (human error).
>>
>> If you enforce it (by running the process and comparing with what has been
>> committed in CI) then patches merged to master that touch the same files
>> will produce merge conflicts with existing open patches, forcing a
>> rebase+rebuild on open patches every time one is merged on master.
>>
>> *Questions*
>>
>> Can we have a shared configuration/convention/system for having a build
>> step on mediawiki extensions?
>>
>>    - So that a build process is run
>>       - on CI jobs that require production assets like the selenium jobs
>>       - on the deployment job that deploys the extension to the beta
>>         cluster and to production
>>
>> What would it look like? Are any extensions doing a pre-deployment build
>> step?
>
>
> For JS dependencies, image optimizations etc the state of the art still
> seems to be to have a local one-off script and commit the build artifacts
> into the repo. (For instance TimedMediaHandler fetches some JS libs via npm
> and copies/patches them into the resources/ dir.)
>
> For PHP deps we've got composer dependency installation for extensions, so
> it seems like there's an opportunity to do other build steps in this
> stage...
>
> Not sure offhand if that can be snuck into composer directly or if we'd
> need to replace the "run composer" step with "run this script, which runs
> composer and also does other build steps".

When I first joined the Foundation and started working with MediaWiki
on a daily basis I wondered about the lack of a build process. At past
jobs I had built PHP application environments that had a "run from
version control" mode for local development, but always included a
build step for packaging and deployment that did the sort of things
that Joaquin is talking about. When I was in the Java world Ant and
then later Maven2 were the tools of choice for this work. Later in a
PHP shop I selected Phing as the build tool and even committed some
enhancements upstream to make it work nicer with the type of projects
I was managing.

I helped get Composer use into MediaWiki core and that added a
post-deploy build step for MediaWiki, but one that is pretty limited in
what it can do easily. Composer is mostly a tool for installing PHP
library dependencies. Most of the attempts I have seen to make it do
things beyond that are clunky uses of the tool. I can certainly still
see the possible benefit of having a full fledged build step for core,
skins, and extensions. It is something that should be thought about a
bit before diving right into an implementation though. One thing to
consider is whether what would be best is a packaging step that leads to a
tarball or similar artifact that can be dropped into a runtime
environment for MediaWiki or if instead it would be better to have a
unified post-deploy build step that operates across MediaWiki core and
the entire collection of optional extensions and skins deployed to
create a particular wiki.

The Foundation's production deployment use case will always be an
anomaly. It should be considered, but really in my opinion only to
ensure that nothing absolutely requires external network access in the
final build. For Composer this turned out to be as easy as maintaining
a submodule with all the vendored libraries included.
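
A sketch of that arrangement (the path and URL follow the
mediawiki/vendor repo named earlier in the thread; the exact production
layout is an assumption):

    # .gitmodules in the deployed MediaWiki tree: composer-managed
    # libraries pinned as a submodule instead of fetched at deploy time
    [submodule "vendor"]
        path = vendor
        url = https://gerrit.wikimedia.org/r/mediawiki/vendor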

The two main use cases to consider for build tooling are (in this
order) 3rd party deployers of MediaWiki and local developers. 3rd
party users are the most important because this is the largest number
of people who will be impacted by tooling changes. In an ideal world
all or most of the changes could be hidden by changes to
ExtensionDistributor or similar tooling that makes it easy to create a
download and run tarball.

One of the awesome features of working on a PHP codebase is the quick
cycle of making a change and seeing it live in your test environment.
Today that is mostly a matter of saving an edit and hitting refresh in
a browser. It would be sad to lose that, so the build system that is
devised should also provide a path that allows a git clone to be a
viable wiki.

Re: [Wikitech-l] What does a build process look like for a MediaWiki extension repository?

2017-06-07 Thread David Barratt
Much like npm, you can hook into the existing build steps, as well as
create your own custom scripts:
https://getcomposer.org/doc/articles/scripts.md
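
Sketching both styles from that page in one hypothetical composer.json
(the "build" script name and the npm commands are invented; the @-style
script references and the event hooks are documented composer features):

    {
        "scripts": {
            "build": [
                "npm install",
                "npm run build"
            ],
            "post-install-cmd": "@build",
            "post-update-cmd": "@build"
        }
    }

The custom script can also be run on demand with `composer run-script
build`.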

On Wed, Jun 7, 2017 at 4:29 PM, Brion Vibber  wrote:

> For JS dependencies, image optimizations etc the state of the art still
> seems to be to have a local one-off script and commit the build artifacts
> into the repo. (For instance TimedMediaHandler fetches some JS libs via npm
> and copies/patches them into the resources/ dir.)
>
> For PHP deps we've got composer dependency installation for extensions, so
> it seems like there's an opportunity to do other build steps in this
> stage...
>
> Not sure offhand if that can be snuck into composer directly or if we'd
> need to replace the "run composer" step with "run this script, which runs
> composer and also does other build steps".
>
> -- brion
>
>
> On Wed, Jun 7, 2017 at 10:18 AM, Joaquin Oltra Hernandez <
> jhernan...@wikimedia.org> wrote:
>
> > *Context*
> >
> > We'd like to have a build script/process for an extension so that I can
> > perform certain commands to install dependencies and perform
> > optimizations on the extension sources. For example, on front-end
> > sources.
> >
> > Some examples could be:
> >
> >    - Installing libraries from bower or npm and bundling them into the
> >      resources folder
> >    - Applying post-processing steps to CSS with something like PostCSS
> >    - Optimizing images
> >
> > We are aware of other projects that have build processes for building
> > deployables, but not extensions.
> > Such projects have different ways of dealing with this. A common way is
> > having a repository called <project>/deploy where you pull from
> > <project> and run the build scripts, and that is the repository that
> > gets deployed.
> >
> > *Current system*
> >
> > The current way we usually do this (if we do) is to run those build
> > scripts/jobs on the developers' machines and commit them into the git
> > repository on master.
> >
> > With this system, if you don't enforce anything in CI, then build
> > processes may be skipped (human error).
> >
> > If you enforce it (by running the process and comparing with what has
> > been committed in CI) then patches merged to master that touch the same
> > files will produce merge conflicts with existing open patches, forcing a
> > rebase+rebuild on open patches every time one is merged on master.
> >
> > *Questions*
> >
> > Can we have a shared configuration/convention/system for having a build
> > step on mediawiki extensions?
> >
> >    - So that a build process is run
> >       - on CI jobs that require production assets like the selenium jobs
> >       - on the deployment job that deploys the extension to the beta
> >         cluster and to production
> >
> > What would it look like? Are any extensions doing a pre-deployment build
> > step?
> >
> > Thanks.

Re: [Wikitech-l] What does a build process look like for a MediaWiki extension repository?

2017-06-07 Thread Brion Vibber
For JS dependencies, image optimizations etc the state of the art still
seems to be to have a local one-off script and commit the build artifacts
into the repo. (For instance TimedMediaHandler fetches some JS libs via npm
and copies/patches them into the resources/ dir.)

For PHP deps we've got composer dependency installation for extensions, so
it seems like there's an opportunity to do other build steps in this
stage...

Not sure offhand if that can be snuck into composer directly or if we'd
need to replace the "run composer" step with "run this script, which runs
composer and also does other build steps".
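
A hypothetical version of that wrapper, with the name and steps invented
for the sketch:

    #!/usr/bin/env bash
    # build.sh -- stand-in for the bare "run composer" step: runs
    # composer, then whatever other build steps the extension needs.
    set -euo pipefail

    composer install
    npm install && npm run build   # assumes a "build" script in package.json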

-- brion


On Wed, Jun 7, 2017 at 10:18 AM, Joaquin Oltra Hernandez <
jhernan...@wikimedia.org> wrote:

> *Context*
>
> We'd like to have a build script/process for an extension so that I can
> perform certain commands to install dependencies and perform optimizations
> on the extension sources. For example, on front-end sources.
>
> Some examples could be:
>
>    - Installing libraries from bower or npm and bundling them into the
>      resources folder
>    - Applying post-processing steps to CSS with something like PostCSS
>    - Optimizing images
>
> We are aware of other projects that have build processes for building
> deployables, but not extensions.
> Such projects have different ways of dealing with this. A common way is
> having a repository called <project>/deploy where you pull from
> <project> and run the build scripts, and that is the repository that
> gets deployed.
>
> *Current system*
>
> The current way we usually do this (if we do) is to run those build
> scripts/jobs on the developers' machines and commit them into the git
> repository on master.
>
> With this system, if you don't enforce anything in CI, then build processes
> may be skipped (human error).
>
> If you enforce it (by running the process and comparing with what has been
> committed in CI) then patches merged to master that touch the same files
> will produce merge conflicts with existing open patches, forcing a
> rebase+rebuild on open patches every time one is merged on master.
>
> *Questions*
>
> Can we have a shared configuration/convention/system for having a build
> step on mediawiki extensions?
>
>    - So that a build process is run
>       - on CI jobs that require production assets like the selenium jobs
>       - on the deployment job that deploys the extension to the beta
>         cluster and to production
>
> What would it look like? Are any extensions doing a pre-deployment build
> step?
>
> Thanks.