Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/2346#issuecomment-55484675
@rxin (Yeah wasn't sure how to handle the continuation indent, feel free to
change it.) I didn't add it to `SparkContext` because I figured the purpose of
the change was t
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/2346
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enab
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/2346#issuecomment-55482552
@srowen any reason you did not add this to the Scala SparkContext?
---
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/2346#discussion_r17510558
--- Diff: core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala ---
@@ -40,7 +41,9 @@ import org.apache.spark.rdd.{EmptyRDD, HadoopRDD, NewHadoop
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/2346#issuecomment-55333468
Yup, LGTM
---
Github user mattf commented on the pull request:
https://github.com/apache/spark/pull/2346#issuecomment-55258618
would like a unit test, but lgtm
---
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/2346#issuecomment-55152475
`AutoCloseable` is a superinterface of `Closeable`, so, this is already
true. Implementing `Closeable` is a stronger condition, strangely.
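The relationship described above can be sketched in a few lines of illustrative Java (a hypothetical `ContextLike` class, not Spark's actual `JavaSparkContext`): since Java 7, `java.io.Closeable` extends `java.lang.AutoCloseable`, so anything implementing `Closeable` is automatically an `AutoCloseable` as well. `Closeable` is the stronger contract because its `close()` must be idempotent and may throw only `IOException`.

```java
import java.io.Closeable;

// Minimal sketch: a type implementing java.io.Closeable is automatically
// a java.lang.AutoCloseable, because Closeable extends AutoCloseable.
public class CloseableDemo {
    // Hypothetical stand-in for a Java context object with a stop() method.
    public static class ContextLike implements Closeable {
        public boolean stopped = false;
        public void stop() { stopped = true; }     // existing lifecycle method
        @Override public void close() { stop(); }  // close() just delegates to stop()
    }

    public static void main(String[] args) {
        ContextLike ctx = new ContextLike();
        // Closeable is-a AutoCloseable on Java 7+, no extra code needed.
        System.out.println(ctx instanceof AutoCloseable);  // prints "true"
        ctx.close();
        System.out.println(ctx.stopped);                   // prints "true"
    }
}
```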
---
Github user mattf commented on the pull request:
https://github.com/apache/spark/pull/2346#issuecomment-55152257
> It doesn't target Java 7:
https://github.com/apache/spark/blob/master/pom.xml#L113
you're right. i hope it becomes java 7+ and we can move to AutoCloseable
---
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/2346#issuecomment-55150236
It doesn't target Java 7:
https://github.com/apache/spark/blob/master/pom.xml#L113
---
Github user mattf commented on the pull request:
https://github.com/apache/spark/pull/2346#issuecomment-55149328
since spark targets java 7 & 8, why not just use the correct AutoCloseable?
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2346#issuecomment-55136925
[QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/20104/consoleFull) for PR 2346 at commit [`612c21d`](https://github.com/a
Github user witgo commented on the pull request:
https://github.com/apache/spark/pull/2346#issuecomment-55132817
The relevant PR: #991
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2346#issuecomment-55126763
[QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/20104/consoleFull) for PR 2346 at commit [`612c21d`](https://github.com/ap
GitHub user srowen opened a pull request:
https://github.com/apache/spark/pull/2346
SPARK-3470 [CORE] [STREAMING] Add Closeable / close() to Java context objects
... that expose a stop() lifecycle method. This doesn't add `AutoCloseable`, which is Java 7+ only. But it should be po
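The idea behind the change can be illustrated with a hedged sketch (a hypothetical `DemoContext`, not Spark's real class): a context object that exposes `stop()` implements `Closeable` with a `close()` that simply delegates, so Java 7+ callers also get try-with-resources for free, since `Closeable` extends `AutoCloseable` there.

```java
import java.io.Closeable;

public class TryWithResourcesDemo {
    // Hypothetical stand-in for a Java context object; not Spark's actual class.
    public static class DemoContext implements Closeable {
        public static int stopCalls = 0;
        public void stop() { stopCalls++; }        // existing lifecycle method
        @Override public void close() { stop(); }  // close() delegates to stop()
    }

    public static void main(String[] args) {
        // On Java 7+, close() (and hence stop()) runs automatically on exit.
        try (DemoContext ctx = new DemoContext()) {
            // ... work with ctx ...
        }
        System.out.println(DemoContext.stopCalls);  // prints "1"
    }
}
```

On Java 6, the same class still works with a plain `try { ... } finally { ctx.close(); }`, which is why declaring `Closeable` rather than `AutoCloseable` keeps the older target supported.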