Github user dbtsai closed the pull request at:
https://github.com/apache/spark/pull/2709
Github user dbtsai commented on the pull request:
https://github.com/apache/spark/pull/2709#issuecomment-59667207
@andrewor14 Sorry for the late reply; I was on vacation in Europe last
week. I can continue working on this after I finish my talk at the IOTA conf tomorrow.
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/2709#issuecomment-59596156
Hey @dbtsai, can you update this now that #2379 has gone in? In particular,
this file is now used by the Spark daemons too (i.e. Worker, Master, HistoryServer).
I don't
GitHub user dbtsai opened a pull request:
https://github.com/apache/spark/pull/2709
Minor change in the comment of spark-defaults.conf.template
spark-defaults.conf is used by spark-shell as well, and this PR adds that
to the comment.
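For context, spark-defaults.conf is a whitespace-separated key/value file read by spark-submit (and therefore by spark-shell, which launches through spark-submit). A minimal sketch of the file is below; the property names are standard Spark settings, but the values are purely illustrative, not recommendations:

```properties
# conf/spark-defaults.conf
# Each line is a property name followed by its value, separated by whitespace.
# Example values only — tune these for your own cluster.
spark.master            spark://master:7077
spark.executor.memory   2g
spark.serializer        org.apache.spark.serializer.KryoSerializer
```

Properties set on the spark-submit command line (e.g. via --conf) take precedence over values in this file, which is why it serves as a set of defaults.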
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2709#issuecomment-58329045
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21457/consoleFull)
for PR 2709 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2709#issuecomment-58336192
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21457/consoleFull)
for PR 2709 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/2709#issuecomment-58336202
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/2709#issuecomment-58419538
Well, `spark-shell` goes through `spark-submit`. Actually, as of a recent PR
(#2379) that will be merged soon, this file will be used by other Spark
daemons too.