On Sun, Feb 3, 2013 at 12:29 PM, Karl Rupp <rupp at mcs.anl.gov> wrote:
> New developers are usually willing to adapt to the workflow of the project
> anyways, particularly if it follows established 'best practices'. 'Compile
> cleanly at high warning levels' is one of them, so we are not proposing
> something radically new.

With the widespread adoption of DVCS and tools like github and bitbucket, the concept of contributing to a project without direct push access is now completely mainstream. We already (slightly) scaled back push access in favor of mailing list review, and the quality of patches went way up. As core project members, organizing our own work to be more reviewable, and so that pushes are more complete, makes the project easier for people to contribute to without direct push access.

Note that in many "dictatorships", all patches, even "obviously correct" ones written by the dictator, go through the same branch and subsequent merge. It doesn't really take more time, but it makes the project more transparent and makes it easier to keep track of what is happening.

A very good system from a review and opportunity-to-comment perspective is for urgent fixup patches to be committed directly on 'dev' (or 'release'), but to have all new features or refactors committed in a branch, with a pull request when they are ready, to be merged by the reviewer. This way, suggestions identified in review can be fixed up front so that they are coherent in the history. It also means that any bug caught at that stage never affects users.

The kernel-style workflow merges those topic branches to a throw-away integration branch called 'pu' (proposed updates) so that all the interesting patches can be tested together, and so that people developing other stuff can check whether their work conflicts with anything that anyone else is working on. When something passes review, it "graduates" to 'next' (more testers), and eventually to 'master' (which is supposed to always be "as stable as a release" and is what most people base their patches on).
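To make the graduation path concrete, here is a minimal sketch of the topic-branch flow described above. The 'pu'/'next'/'master' names come from the kernel workflow; the temporary repository, the topic-branch name, and the commit messages are purely illustrative:

```shell
set -e
# Illustrative scratch repository so the commands can run end to end.
tmp=$(mktemp -d); cd "$tmp"
git init -q
git config user.email dev@example.com
git config user.name Dev
git commit -q --allow-empty -m "initial"
git branch -M master

# Develop a feature on its own topic branch, based on stable 'master'.
git checkout -q -b my-feature
git commit -q --allow-empty -m "my-feature: sketch a change"

# Integration: merge every in-flight topic into the throw-away 'pu'
# branch so they can all be tested together.
git checkout -q -B pu master
git merge -q --no-ff -m "Merge branch 'my-feature' into pu" my-feature

# Once the topic passes review, it "graduates" to 'next', and
# eventually to 'master'; --no-ff keeps the topic a coherent unit.
git checkout -q -B next master
git merge -q --no-ff -m "Merge branch 'my-feature' into next" my-feature
git checkout -q master
git merge -q --no-ff -m "Merge branch 'my-feature'" my-feature
```

Because 'pu' is rebuilt from 'master' whenever topics come and go, nothing that lands there is permanent history; only the merge to 'master' is.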
http://www.kernel.org/pub/software/scm/git/docs/gitworkflows.html (more detail)

The main downside of this workflow is not the time required of skilled participants (your "productivity"), but the level of education/familiarity with the tools needed to make it work. The advantages are that 'master' is very stable all the time, everything has a chance for review before becoming permanent history, and it's very easy to tell what is being actively developed and to identify the provenance of each feature.

Note that contributors don't need to know that much to contribute. They just base their patches on 'master', submit a pull request when they think it's ready, fix things up as requested by the reviewer, and they'll see the result in 'master'. It's much less intimidating to submit a pull request than to push to a shared repository, and it means that if something needs repairing, more than one person shares some part of the blame.

I'm not saying that PETSc should adopt the full workflow described above, but I think we should at least recognize the benefits it provides and cherry-pick those aspects that can improve our quality, reviewability, etc. Organizing patches into coherent feature development with semantically meaningful merges is one step in that direction.
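For a mailing-list review flow like the one mentioned above, the contributor side is correspondingly small: base the work on 'master' and export the commits as a patch series. A minimal sketch (the scratch repository, branch name, file, and commit message are illustrative; the send-email address is shown only as the kind of command that would follow):

```shell
set -e
# Illustrative scratch repository so the commands can run end to end.
tmp=$(mktemp -d); cd "$tmp"
git init -q
git config user.email contrib@example.com
git config user.name Contributor
git commit -q --allow-empty -m "initial"
git branch -M master

# Contributors base their patches on 'master'.
git checkout -q -b fix-typo master
echo "fixed" > doc.txt
git add doc.txt
git commit -q -m "docs: fix typo"

# Export the series for review, one file per commit.
git format-patch -o out master..fix-typo
# Then, for list review (not run here):
#   git send-email --to=petsc-dev@mcs.anl.gov out/*.patch
```

The same branch works unchanged as the source of a pull request on a hosting service instead of a patch series; either way the reviewer, not the contributor, performs the final merge.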
