GitHub user sryza commented on the pull request:
https://github.com/apache/incubator-spark/pull/538#issuecomment-34683445
Updated the patch to work with yarn-standalone mode as well. Does a doAs
in the application master when running the user class.
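The doAs mentioned here is Hadoop's UserGroupInformation.doAs, which runs the user class under the submitting user's identity rather than the identity of the application master process. As a rough JDK-only sketch of the underlying pattern (Hadoop's UGI delegates to javax.security.auth.Subject), it looks something like the block below; the class name, method bodies, and output string are illustrative, not from the patch:

```java
import javax.security.auth.Subject;
import java.security.PrivilegedAction;

public class DoAsSketch {
    // Stand-in for invoking the user's main class via reflection,
    // as the YARN application master does.
    static String runUserClass() {
        return "user class ran";
    }

    public static void main(String[] args) {
        // Stand-in for the submitting user's identity; Hadoop's
        // UserGroupInformation builds a Subject along these lines.
        Subject submitter = new Subject();

        // Everything inside run() executes with the subject bound to the
        // access-control context -- the core of the doAs pattern.
        String result = Subject.doAs(submitter,
                (PrivilegedAction<String>) DoAsSketch::runUserClass);
        System.out.println(result);
    }
}
```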
GitHub user sryza commented on the pull request:
https://github.com/apache/incubator-spark/pull/553#issuecomment-34830852
It's true that you may need it when running in yarn-client mode. It's also
true that you will not when running spark-shell. Because it depends, I think
GitHub user sryza commented on the pull request:
https://github.com/apache/incubator-spark/pull/553#issuecomment-34906646
@tgravescs I should have tried this - it looks like it actually works fine
when SPARK_YARN_APP_JAR isn't specified. The client must be serving the jar in
GitHub user sryza commented on the pull request:
https://github.com/apache/incubator-spark/pull/615#issuecomment-35432894
Why use MEMORY for the daemon, but MEM for the driver?
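For context, the naming inconsistency being questioned looks roughly like the env-var fragment below; the exact variable names and values are inferred from the question, not taken from the patch:

```shell
# Inferred illustration -- daemon memory spelled out, driver memory abbreviated.
SPARK_DAEMON_MEMORY=1g   # "MEMORY" for the standalone daemons
SPARK_DRIVER_MEM=512m    # "MEM" for the driver
```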
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. To do so, please top-post your response.
GitHub user sryza closed the pull request at:
https://github.com/apache/incubator-spark/pull/555
---
GitHub user sryza commented on the pull request:
https://github.com/apache/incubator-spark/pull/615#issuecomment-35572922
Will this nomenclature make sense in the context of yarn-standalone mode,
where spark-class is used, but the driver is run inside an application master
on the
GitHub user sryza commented on the pull request:
https://github.com/apache/incubator-spark/pull/615#issuecomment-35684281
Right. What I mean is that calling the variable SPARK_DRIVER_MEMORY might
be confusing in the context of yarn-standalone because its value would apply to
the
GitHub user sryza opened a pull request:
https://github.com/apache/incubator-spark/pull/640
SPARK-1004: PySpark on YARN
Make pyspark work in yarn-client mode. This builds on Josh's work. I
verified that it works on a 5-node cluster.
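At the time, launching PySpark against YARN in client mode looked roughly like the fragment below. SPARK_YARN_APP_JAR appears earlier in this thread; the SPARK_JAR variable and the jar paths are assumptions for illustration, not taken from the pull request:

```shell
# Assumed launch fragment for yarn-client mode of that era (paths are placeholders).
export SPARK_JAR=assembly/target/spark-assembly.jar            # assumed variable name
export SPARK_YARN_APP_JAR=examples/target/spark-examples.jar   # placeholder path
MASTER=yarn-client ./bin/pyspark
```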