Github user dongjoon-hyun commented on the issue:
https://github.com/apache/spark/pull/15899
Since the issue is closed, this PR will be closed at the next infra clean-up.
Github user HyukjinKwon commented on the issue:
https://github.com/apache/spark/pull/15899
+1 for the decision and closing it.
Github user dongjoon-hyun commented on the issue:
https://github.com/apache/spark/pull/15899
I see. Thank you for the clear decision, @rxin! I'll close the issue as
`Won't Fix`.
And could you close this PR, @reggert?
Github user rxin commented on the issue:
https://github.com/apache/spark/pull/15899
Thanks for the example. I didn't even know that was possible in earlier
versions. I just looked it up: it looks like Scala 2.11 rewrites `for`
comprehensions into `map`, `filter`, and `flatMap`.
That
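For reference, that rewrite looks roughly like this on a plain Scala collection (a minimal sketch with placeholder `Seq`s, not Spark code):

```scala
val xs = Seq(1, 2, 3)
val ys = Seq(10, 20)

// A guarded, nested for comprehension ...
val viaFor = for {
  x <- xs
  if x % 2 == 1
  y <- ys
} yield x + y

// ... is rewritten by the compiler into (roughly) withFilter/flatMap/map.
// In Scala 2.11, if the receiver has no withFilter, the compiler falls back
// to filter and emits a warning.
val desugared = xs.withFilter(_ % 2 == 1).flatMap(x => ys.map(y => x + y))

assert(viaFor == desugared)
```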
Github user dongjoon-hyun commented on the issue:
https://github.com/apache/spark/pull/15899
Hi, @rxin, @srowen, @dbtsai, @felixcheung, @gatorsmile, @cloud-fan.
I know this was not a recommended style, but there really are users affected by
this issue. And, from Spark
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/15899
Can one of the admins verify this patch?
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/15899
Can one of the admins verify this patch?
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/15899
Can one of the admins verify this patch?
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/15899
I'm simply making an argument for a specific use case, though you're right,
it's used for more than just pattern matching.
Github user reggert commented on the issue:
https://github.com/apache/spark/pull/15899
Strictly speaking, this doesn't just affect pair RDDs. It affects any RDD
on which a `for` expression involves a filter operation, which includes
explicit `if` clauses as well as pattern matches.
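Concretely, both forms desugar into a filtering step (a minimal sketch; assumes a `SparkContext` named `sc` and the Scala 2.11-era behaviour discussed in this thread, where a missing `withFilter` falls back to `filter` with a warning):

```scala
val nums  = sc.parallelize(Seq(1, 2, 3, 4))
val pairs = sc.parallelize(Seq(1 -> "a", 2 -> "b"))

// Explicit `if` clause: desugars to a withFilter/filter step followed by map.
val bigEvens = for (n <- nums if n % 2 == 0) yield n * 10

// Pattern match in the generator: the tuple pattern also introduces a
// withFilter/filter step before the map, even though it can never fail here.
val keys = for ((k, _) <- pairs) yield k
```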
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/15899
Hey,
Checking in again on this PR. Can we please support `withFilter` for pair
RDDs? For-expressions are a central piece of syntactic sugar in Scala, and
without them developers are hampered in
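What is being asked for amounts to something like the following enrichment (a hypothetical sketch with made-up names `RddSyntax` and `RddWithFilter`, not the actual change proposed in this PR):

```scala
import org.apache.spark.rdd.RDD

object RddSyntax {
  // Hypothetical enrichment, for illustration only: give RDD a `withFilter`
  // so the for-comprehension rewrite finds one instead of falling back to
  // `filter` with a warning. RDDs are lazy, so delegating to `filter` adds
  // no extra cost here.
  implicit class RddWithFilter[T](private val rdd: RDD[T]) extends AnyVal {
    def withFilter(p: T => Boolean): RDD[T] = rdd.filter(p)
  }
}
```

With `import RddSyntax._` in scope, the desugared `withFilter` call should resolve through the enrichment rather than triggering the `filter` fallback.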
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/15899
@rxin, is it possible for Spark to support extractors in for expressions
with pair RDDs?
Github user reggert commented on the issue:
https://github.com/apache/spark/pull/15899
The `(k, v) <- pairRDD` expression involves a pattern match, which the
compiler converts into a `filter`/`withFilter` call that keeps only the items
matching the pattern.
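Spelled out on a local collection, the rewrite looks roughly like this (a sketch of the spec-style translation; the compiler's actual output also adds a `case _ => false` branch to the `withFilter`):

```scala
val pairs = Seq(1 -> 2.0, 2 -> 3.5)

// What you write:
val keysFor = for ((k, _) <- pairs) yield k

// Roughly what the compiler generates: a withFilter over the pattern,
// followed by a map that destructures the matching elements.
val keysDesugared = pairs
  .withFilter { case (_, _) => true }
  .map { case (k, _) => k }

assert(keysFor == keysDesugared)
```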
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/15899
Hello,
I found this issue after encountering the error `'withFilter' method does
not yet exist on RDD[(Int, Double)], using 'filter' method instead` in my code.
I'm writing a
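Code along these lines produces that message (a sketch; assumes a `SparkContext` named `sc`); destructuring inside an explicit `map` is one way to avoid the desugaring entirely:

```scala
val scores = sc.parallelize(Seq(1 -> 0.5, 2 -> 0.9))  // RDD[(Int, Double)]

// Triggers the warning: the tuple pattern makes the compiler look for
// withFilter on RDD[(Int, Double)] and fall back to filter.
val doubled = for ((id, score) <- scores) yield id -> score * 2

// One way to sidestep it: destructure inside map instead of in the generator.
val doubledExplicit = scores.map { case (id, score) => id -> score * 2 }
```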
Github user reggert commented on the issue:
https://github.com/apache/spark/pull/15899
I don't get why you say that it "doesn't even work in general". Under what
circumstances doesn't it work? I've never run into any problems with it.
The "simple syntactic sugar" allows very
Github user rxin commented on the issue:
https://github.com/apache/spark/pull/15899
I don't get it. The only thing you gain here is a bit of simple syntactic
sugar, and the sugar doesn't even work in general. Isn't it more surprising to
fail in some cases?
Github user reggert commented on the issue:
https://github.com/apache/spark/pull/15899
I disagree strongly. I've used RDDs in for comprehensions for almost 2
years without issue. Being able to "extend for loops" in this way is a
major language feature of Scala that helps in
Github user rxin commented on the issue:
https://github.com/apache/spark/pull/15899
@reggert @srowen
any reaction to my pushback?
Github user rxin commented on the issue:
https://github.com/apache/spark/pull/15899
I would vote to explicitly discourage this kind of use case. By encouraging
this we are creating an illusion that `for` comprehensions can be used, but in
reality there are a lot of gotchas.
Github user reggert commented on the issue:
https://github.com/apache/spark/pull/15899
The only other weird case I've run into is trying to `flatMap` across
multiple RDDs, e.g., `for (x <- rdd1; y <- rdd2) yield x + y`, but it simply
won't compile because `RDD.flatMap` doesn't
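The nested generators desugar to `rdd1.flatMap(x => rdd2.map(y => x + y))`, and `RDD.flatMap` expects a function returning a local `TraversableOnce`, not another `RDD`, hence the compile error. A sketch (assumes a `SparkContext` named `sc`), with `cartesian` as one way to express the same computation:

```scala
val rdd1 = sc.parallelize(Seq(1, 2))
val rdd2 = sc.parallelize(Seq(10, 20))

// Does not compile: desugars to rdd1.flatMap(x => rdd2.map(y => x + y)),
// but flatMap needs x => TraversableOnce[Int], not x => RDD[Int].
// val sums = for (x <- rdd1; y <- rdd2) yield x + y

// The equivalent distributed computation:
val sums = rdd1.cartesian(rdd2).map { case (x, y) => x + y }
```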
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/15899
OK, that makes sense. Yes, of course in reality it's just syntactic sugar.
I suppose I wonder: if this works, are there other sugary things one would
expect to work that don't? And does it add to
Github user reggert commented on the issue:
https://github.com/apache/spark/pull/15899
Using RDDs in `for` comprehensions dates back _at least_ to the examples
given in the old 2010 AMPLab paper "Spark: Cluster Computing with Working
Sets". `for` comprehensions are in no way limited
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/15899
You're right that it's not hard to add, but is this really the intended usage
of an RDD? I don't know how much we want to make it operate like a local
collection. I'm not strongly against it though, but
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/15899
Can one of the admins verify this patch?