BTW, this PR https://github.com/apache/spark/pull/2524 is related to a
blocker-level bug, and it is actually close to being merged (it has been
reviewed for several rounds).

I would appreciate it if anyone could continue the process.

@mateiz 

-- 
Nan Zhu
http://codingcat.me


On Thursday, November 20, 2014 at 10:17 AM, Corey Nolet wrote:

> I was actually about to post this myself. I have a complex join that could
> benefit from something like a GroupComparator vs. having to do multiple
> groupBy operations. This is probably the wrong thread for a full discussion
> on this, but I didn't see a JIRA ticket for this or anything similar. Any
> reasons why this would not make sense given Spark's design?
> 
> On Thu, Nov 20, 2014 at 9:39 AM, Madhu <ma...@madhu.com> wrote:
> 
> > Thanks Patrick.
> > 
> > I've been testing some 1.2 features; they look good so far.
> > I have some example code that I think will be helpful for certain MR-style
> > use cases (secondary sort).
> > Can I still add that to the 1.2 documentation, or is that frozen at this
> > point?
> > 
> > 
> > 
> > -----
> > --
> > Madhu
> > https://www.linkedin.com/in/msiddalingaiah
> > --
> > View this message in context:
> > http://apache-spark-developers-list.1001551.n3.nabble.com/ANNOUNCE-Spark-1-2-0-Release-Preview-Posted-tp9400p9449.html
> > Sent from the Apache Spark Developers List mailing list archive at
> > Nabble.com (http://Nabble.com).
> > 
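For readers unfamiliar with the secondary-sort pattern discussed above: the idea is to partition records by a primary key only, then sort each partition by a composite (primary, secondary) key, so that each key's values arrive already ordered without a separate sort per group. The sketch below illustrates the pattern in plain Python; it is not Spark code, and the function name `secondary_sort` is purely illustrative (in Spark 1.2 the analogous building block is `repartitionAndSortWithinPartitions`).

```python
from itertools import groupby
from operator import itemgetter

def secondary_sort(records, num_partitions=2):
    """Illustrative sketch of MR-style secondary sort.
    records: iterable of ((primary, secondary), value) pairs.
    Returns {primary: [values ordered by secondary key]}."""
    # 1. Partition by the primary key only, so all records for a
    #    given primary key land in the same partition.
    partitions = [[] for _ in range(num_partitions)]
    for (primary, secondary), value in records:
        partitions[hash(primary) % num_partitions].append(
            ((primary, secondary), value))
    # 2. Sort each partition by the full composite key; records for each
    #    primary key become contiguous and ordered by the secondary key.
    for part in partitions:
        part.sort(key=itemgetter(0))
    # 3. Consume each group in order -- no per-group sort needed here.
    result = {}
    for part in partitions:
        for primary, group in groupby(part, key=lambda kv: kv[0][0]):
            result[primary] = [v for _k, v in group]
    return result

data = [(("a", 2), "a2"), (("b", 1), "b1"), (("a", 1), "a1")]
print(secondary_sort(data))  # {'a': ['a1', 'a2'], 'b': ['b1']}
```

A custom GroupComparator in Hadoop MR plays step 3's role: it tells the framework which composite keys belong to the same logical group while the shuffle sort (step 2) orders by the full key.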

