Sean Owen wrote:
> Stale JIRAs are a symptom, not a problem per se. I also want to see
> the backlog cleared, but automatically closing doesn't help, if the
> problem is too many JIRAs and not enough committer-hours to look at
> them. Some noise gets closed, but some easy or important fixes may
> disappear as well.

Agreed. All of the problems mentioned in this thread are symptoms. There's
no shortage of talent and enthusiasm within the Spark community. The people
and the product are wonderful; the process, not so much. Spark has been
wildly successful, so some growing pains are to be expected.

Given 100+ contributors, Spark is a big project. As with big data, big
projects can run into scaling issues. There's no magic to running a
successful big project, but it does require greater planning and
discipline. JIRA is great for issue tracking, but it's not a replacement
for a project plan. Quarterly releases are a great idea: everyone knows the
schedule. What we need is a concise plan for each release with a clear
scope statement. Without knowing what is in scope and out of scope for a
release, we end up with a laundry list of things to do but no clear goal.
Laundry lists don't scale well.

I don't mind helping with planning and documenting releases. A documented
plan is especially helpful for new contributors who don't know where to
start. I have done this successfully on many projects using JIRA and
Confluence, so I know it can be done. To address the immediate concerns of
open PRs and excessive, overlapping JIRA issues, we probably need to create
a meta issue and assign resources to fix it. I don't mind helping with that
as well.



--
Madhu
https://www.linkedin.com/in/msiddalingaiah