Hey Andrew,

Ah, I just meant that when a patch stalls like this it's usually an
oversight... in general we try to be inclusive about merging
patches :) Definitely appreciate you calling this one out - raising
it on the list is exactly what people should do in cases like this.

- Patrick

On Tue, Feb 25, 2014 at 8:00 PM, Andrew Ash <and...@andrewash.com> wrote:
> I've always felt that the Spark team has been extremely responsive to PRs, and
> I've been very impressed with your output over the past year.  As Matei
> said, probably the best thing to do here is to be more diligent about
> closing PRs that are old/abandoned so that every open PR is active.  Whenever I
> comment I try to make it clear who has the next action to get the PR merged.
>
> I definitely don't want you to think that I'm critiquing the process!  The
> reason I brought this up in the first place was because I thought we were
> about to lose a contributor because something fell through the cracks,
> which would be unfortunate.
>
>
> On Tue, Feb 25, 2014 at 6:32 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>
>> Hey Andrew,
>>
>> Indeed, sometimes patches sit around for a while - either because it's
>> unclear to the reviewers whether the feature is worth having, or just by
>> accident.
>>
>> To put things in perspective, Spark merges about 80% of the proposed
>> patches (we're at around 600 PRs since moving to the new repo, with 100 not
>> merged) - so in general we try hard to be very supportive of community
>> patches, much more so than other projects in this space.
>>
>> - Patrick
>>
>> On Mon, Feb 24, 2014 at 1:39 PM, Matei Zaharia <matei.zaha...@gmail.com>
>> wrote:
>> > Thanks for bringing this up. One issue that makes this harder is that
>> > old inactive PRs on GitHub are not really getting closed, so active ones
>> > can get lost among them. For now please just post on the dev list if
>> > your PR is being ignored. We'll implement some kind of cleanup (at least
>> > manually) to close the old ones.
>> >
>> > Matei
>> >
>> > On Feb 24, 2014, at 1:30 PM, Andrew Ash <and...@andrewash.com> wrote:
>> >
>> >> Yep, that's the one, thanks! That's quite a few more people than I thought.
>> >>
>> >> Sent from my mobile phone
>> >> On Feb 24, 2014 1:20 PM, "Nan Zhu" <zhunanmcg...@gmail.com> wrote:
>> >>
>> >>> Do you mean this
>> >>> https://cwiki.apache.org/confluence/display/SPARK/Committers?
>> >>>
>> >>> --
>> >>> Nan Zhu
>> >>>
>> >>>
>> >>> On Monday, February 24, 2014 at 4:18 PM, Andrew Ash wrote:
>> >>>
>> >>>> Would love to have a discussion since I know the core contributors are
>> >>>> facing a barrage of PRs and things are falling through the cracks.
>> >>>>
>> >>>> Is there a list of who can commit to core Spark somewhere? Maybe that
>> >>>> list should be expanded or there should be a rotation of PR duty of
>> >>>> some sort.
>> >>>>
>> >>>> One of the perils of having a vibrant, organic community is that you
>> >>>> get way more contributions than you expected!
>> >>>>
>> >>>>
>> >>>> On Mon, Feb 24, 2014 at 1:16 PM, Nan Zhu <zhunanmcg...@gmail.com> wrote:
>> >>>>
>> >>>>> yet another email about a forgotten PR
>> >>>>>
>> >>>>> I think Sean would like to start some discussion on the current
>> >>>>> situation where committers are facing a flood of PRs recently (as he
>> >>>>> said in the discussion thread about how to prevent bloat of the RDD
>> >>>>> API)?
>> >>>>>
>> >>>>> Best,
>> >>>>>
>> >>>>> --
>> >>>>> Nan Zhu
>> >>>>>
>> >>>>>
>> >>>>> On Monday, February 24, 2014 at 4:07 PM, Andrew Ash wrote:
>> >>>>>
>> >>>>>> Hi Spark devs,
>> >>>>>>
>> >>>>>> Kyle identified a deficiency in Spark where generating iterators are
>> >>>>>> unrolled into memory and then flushed to disk rather than sent
>> >>>>>> straight to disk when possible.
>> >>>>>>
>> >>>>>> He's had a patch sitting ready for code review for quite some time
>> >>>>>> now (100 days) but no response.
>> >>>>>>
>> >>>>>> Is this something that an admin would be able to review? I for one
>> >>>>>> would find this quite valuable.
>> >>>>>>
>> >>>>>> Thanks!
>> >>>>>> Andrew
>> >>>>>>
>> >>>>>>
>> >>>>>> https://spark-project.atlassian.net/browse/SPARK-942
>> >>>>>> https://github.com/apache/incubator-spark/pull/180
>> >>>>>>
>> >>>>>
>> >>>>>
>> >>>>
>> >>>>
>> >>>>
>> >>>
>> >>>
>> >>>
>> >
>>
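
For readers who haven't followed SPARK-942: the deficiency Kyle's patch
addresses (described in Andrew's first message above) is roughly the
difference sketched below. This is only an illustrative sketch in plain
Scala, not Spark's actual code; the names IteratorSpillSketch,
writeUnrolled, and writeStreaming are made up for the example.

    import java.io.{File, FileOutputStream, ObjectOutputStream}
    import scala.collection.mutable.ArrayBuffer

    // Hypothetical example, not Spark internals.
    object IteratorSpillSketch {

      // "Unrolling" first: the whole iterator is materialized in memory
      // before anything reaches disk, so peak memory use grows with the
      // size of the data being written.
      def writeUnrolled[T <: AnyRef](values: Iterator[T], file: File): Unit = {
        val buffer = new ArrayBuffer[T]()
        values.foreach(buffer += _)   // entire iterator held in memory
        val out = new ObjectOutputStream(new FileOutputStream(file))
        try buffer.foreach(v => out.writeObject(v)) finally out.close()
      }

      // Streaming: each element goes straight to the output stream, so
      // memory stays bounded no matter how many elements the iterator
      // produces.
      def writeStreaming[T <: AnyRef](values: Iterator[T], file: File): Unit = {
        val out = new ObjectOutputStream(new FileOutputStream(file))
        try values.foreach(v => out.writeObject(v)) finally out.close()
      }
    }

The JIRA ticket and pull request linked above track the actual fix inside
Spark; the sketch just shows why streaming matters for memory.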
