Do you mean this https://cwiki.apache.org/confluence/display/SPARK/Committers?

-- 
Nan Zhu


On Monday, February 24, 2014 at 4:18 PM, Andrew Ash wrote:

> Would love to have a discussion since I know the core contributors are
> facing a barrage of PRs and things are falling through the cracks.
> 
> Is there a list of who can commit to core Spark somewhere? Maybe that list
> should be expanded or there should be a rotation of PR duty of some sort.
> 
> One of the perils of having a vibrant, organic community is that you get
> way more contributions than you expected!
> 
> 
> On Mon, Feb 24, 2014 at 1:16 PM, Nan Zhu <zhunanmcg...@gmail.com> wrote:
> 
> > yet another email about forgotten PR
> > 
> > I think Sean would like to start some discussion on the current situation,
> > where committers are facing a flood of PRs (as he said in the discussion
> > thread about how to prevent bloat in the RDD API).
> > 
> > Best,
> > 
> > --
> > Nan Zhu
> > 
> > 
> > On Monday, February 24, 2014 at 4:07 PM, Andrew Ash wrote:
> > 
> > > Hi Spark devs,
> > > 
> > > Kyle identified a deficiency in Spark where generating iterators are
> > > unrolled into memory and then flushed to disk rather than sent straight
> > > to disk when possible.
> > > 
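> > > To make the issue concrete, here is a minimal plain-Scala sketch of the
> > > difference, not the actual Spark code -- the names SpillSketch,
> > > unrollThenWrite, and streamToDisk are made up for illustration:
> > > 
> > >   import java.io.{BufferedOutputStream, DataOutputStream, FileOutputStream}
> > > 
> > >   object SpillSketch {
> > >     // Roughly the problematic pattern: materialize the whole iterator
> > >     // in memory first, then flush the buffer to disk.
> > >     def unrollThenWrite(values: Iterator[Int], path: String): Unit = {
> > >       val buffered = values.toArray   // entire iterator held in memory
> > >       val out = new DataOutputStream(
> > >         new BufferedOutputStream(new FileOutputStream(path)))
> > >       try buffered.foreach(out.writeInt(_)) finally out.close()
> > >     }
> > > 
> > >     // What the patch is after, conceptually: write each element as it is
> > >     // produced, so peak memory stays flat however large the iterator is.
> > >     def streamToDisk(values: Iterator[Int], path: String): Unit = {
> > >       val out = new DataOutputStream(
> > >         new BufferedOutputStream(new FileOutputStream(path)))
> > >       try values.foreach(out.writeInt(_)) finally out.close()
> > >     }
> > > 
> > >     def main(args: Array[String]): Unit =
> > >       streamToDisk(Iterator.range(0, 10000000), "block.tmp")
> > >   }
> > > 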
> > > He's had a patch sitting ready for code review for quite some time now
> > > (100 days) but no response.
> > > 
> > > Is this something that an admin would be able to review? I for one would
> > > find this quite valuable.
> > > 
> > > Thanks!
> > > Andrew
> > > 
> > > 
> > > https://spark-project.atlassian.net/browse/SPARK-942
> > > https://github.com/apache/incubator-spark/pull/180
> > > 
> > 
> > 
> 
> 
> 

