Last weekend, I started hacking on a Google App Engine app for helping with 
pull request review (screenshot: http://i.imgur.com/wwpZKYZ.png).  Some of my 
basic goals (not all implemented yet):

- Users sign in using GitHub and can browse a list of pull requests, including 
links to associated JIRAs, Jenkins statuses, a quick preview of the last 
comment, etc.

- Pull requests are auto-classified based on which components they modify (by 
looking at the diff); a rough sketch of the classification follows this list.

- From the app’s own internal database of PRs, we can build dashboards to find 
“abandoned” PRs, graph average time to first review, etc.

- Since we authenticate users with GitHub, we can enable administrative 
functions via this dashboard (e.g. "assign this PR to me", "vote to close in 
the weekly auto-close commit", etc.).
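The classification itself can be pretty simple; as a rough sketch of what I 
have in mind (the path-to-component mapping here is just illustrative, not a 
final list), it mostly boils down to prefix-matching the changed file paths 
from the diff:

    # Sketch: classify a PR by which components its changed files touch.
    # The prefixes below are illustrative placeholders, not a real mapping.
    COMPONENT_PREFIXES = {
        "core/": "Core",
        "sql/": "SQL",
        "mllib/": "MLlib",
        "streaming/": "Streaming",
        "python/": "PySpark",
        "docs/": "Documentation",
    }

    def classify(changed_files):
        """Return the set of components touched by a list of changed paths."""
        components = set()
        for path in changed_files:
            for prefix, component in COMPONENT_PREFIXES.items():
                if path.startswith(prefix):
                    components.add(component)
        return components or {"Other"}

    # e.g. classify(["sql/core/src/Foo.scala", "docs/sql-guide.md"])
    #   -> {"SQL", "Documentation"}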

Right now, I’ve implemented GitHub OAuth support and code to update the issues 
database using the GitHub API.  Because we have access to the full API, it’s 
pretty easy to do fancy things like parsing the reason for a Jenkins failure.  
You could even imagine mashup tools that pull up JIRAs and pull requests 
side-by-side in iframes.
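To give a flavor of the API side, here’s a rough sketch (standard GitHub REST 
API via requests; the bot-login check and the lack of pagination are 
simplifications, not the real logic) of fetching open PRs and pulling out the 
latest Jenkins comment:

    import requests

    GITHUB_API = "https://api.github.com"

    def fetch_open_prs(repo, token):
        """List open pull requests for an "owner/name" repo (no pagination)."""
        resp = requests.get(
            "%s/repos/%s/pulls" % (GITHUB_API, repo),
            params={"state": "open", "per_page": 100},
            headers={"Authorization": "token %s" % token},
        )
        resp.raise_for_status()
        return resp.json()

    def last_jenkins_comment(repo, pr_number, token):
        """Return the body of the latest comment left by the Jenkins bot."""
        resp = requests.get(
            "%s/repos/%s/issues/%s/comments" % (GITHUB_API, repo, pr_number),
            headers={"Authorization": "token %s" % token},
        )
        resp.raise_for_status()
        # Assumes the Jenkins bot posts under a known GitHub login.
        bot_comments = [c for c in resp.json() if c["user"]["login"] == "SparkQA"]
        return bot_comments[-1]["body"] if bot_comments else None

Parsing the failure reason is then mostly a matter of pattern-matching on that 
comment body.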

After I hack on this a bit more, I plan to release a public preview version; if 
we find this tool useful, I’ll clean it up and open-source the app so folks can 
contribute to it.

- Josh

On August 26, 2014 at 8:16:46 AM, Nicholas Chammas (nicholas.cham...@gmail.com) 
wrote:

On Tue, Aug 26, 2014 at 2:02 AM, Patrick Wendell <pwend...@gmail.com> wrote:  

> I'd prefer if we took the approach of politely explaining why in the  
> current form the patch isn't acceptable and closing it (potentially w/ tips  
> on how to improve it or narrow the scope).  


Amen to this. Aiming for such a culture would set Spark apart from other  
projects in a great way.  

> I've proposed several different solutions to ASF infra to streamline the  
> process, but thus far they haven't been open to any of my ideas:  


I've added myself as a watcher on those 2 INFRA issues. Sucks that the only  
solution on offer right now requires basically polluting the commit history.  

Short of moving Spark's repo to a non-ASF-managed GitHub account, do you  
think another bot could help us manage the number of stale PRs?  

I'm thinking a solution like the following might be very helpful:  

- Extend Spark QA / Jenkins to run on a weekly schedule and check for stale 
  PRs. Let's say a stale PR is an open one that hasn't been updated in N 
  months. 
- Spark QA maintains a list of known committers on its side. 
- During its weekly check of stale PRs, Spark QA takes the following action: 
  - If the last person to comment on a PR was a committer, post to the PR 
    asking for an update from the contributor. 
  - If the last person to comment on a PR was a contributor, add the PR to a 
    list. Email this list of *hanging PRs* out to the dev list on a weekly 
    basis and ask committers to update them. 
  - If the last person to comment on a PR was Spark QA asking the contributor 
    to update it, add the PR to another list. Email this list of *abandoned 
    PRs* to the dev list for the record (or for closing, if that becomes 
    possible in the future). 

This doesn't solve the problem of not being able to close PRs, but it does  
help make sure no PR is left hanging for long.  
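In code, the weekly triage could be about this simple (just a sketch: the 
committer list and bot login are placeholders, and I'm glossing over how the 
PR metadata actually gets fetched from GitHub):  

    from datetime import datetime, timedelta

    STALE_AFTER = timedelta(days=90)        # "N months" of inactivity
    COMMITTERS = {"pwendell", "JoshRosen"}  # placeholder committer logins
    BOT_LOGIN = "SparkQA"                   # placeholder for the bot's login

    def triage(pull_requests, now=None):
        """Split stale open PRs into ping-contributor, hanging, and abandoned.

        Each PR is expected to be a dict with 'updated_at' (datetime),
        'last_commenter' (GitHub login), and 'last_comment_was_ping' (bool).
        """
        now = now or datetime.utcnow()
        to_ping, hanging, abandoned = [], [], []
        for pr in pull_requests:
            if now - pr["updated_at"] < STALE_AFTER:
                continue  # not stale yet
            if pr["last_commenter"] == BOT_LOGIN and pr["last_comment_was_ping"]:
                abandoned.append(pr)   # we already asked; no response
            elif pr["last_commenter"] in COMMITTERS:
                to_ping.append(pr)     # ask the contributor for an update
            else:
                hanging.append(pr)     # waiting on a committer; email dev list
        return to_ping, hanging, abandoned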

What do you think? I'd be interested in implementing this solution if we  
like it.  

Nick  
