Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/1253
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49971888
@sryza I merged this just now because another patch was going to change
this code and I wanted to avoid you having to rebase again. That said, I found
an issue with
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49707690
LGTM - @mateiz @rxin any final comments here?
---
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1253#discussion_r15214419
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -55,6 +55,7 @@ private[spark] class SparkSubmitArguments(args:
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1253#discussion_r15214475
--- Diff: docs/configuration.md ---
@@ -42,13 +42,14 @@ val sc = new SparkContext(new SparkConf())
Then, you can supply configuration values at
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1253#discussion_r15214491
--- Diff: docs/configuration.md ---
@@ -42,13 +42,14 @@ val sc = new SparkContext(new SparkConf())
Then, you can supply configuration values at
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/1253#discussion_r15215030
--- Diff: docs/configuration.md ---
@@ -42,13 +42,14 @@ val sc = new SparkContext(new SparkConf())
Then, you can supply configuration values at
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/1253#discussion_r15215054
--- Diff: docs/configuration.md ---
@@ -42,13 +42,14 @@ val sc = new SparkContext(new SparkConf())
Then, you can supply configuration values at
Github user mateiz commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49709826
Can you support `-c` in addition to `--conf`?
Also, the spark-submit doc
(http://spark.apache.org/docs/latest/submitting-applications.html) should be
updated to
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49766236
QA tests have started for PR 1253. This patch merges cleanly.
View progress:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16971/consoleFull
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49780231
QA results for PR 1253:
- This patch FAILED unit tests.
- This patch merges cleanly
- This patch adds no public classes

For more information see test
Github user sryza commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49780719
The failure appears to be unrelated (MiMa compatibility issue in MLlib).
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49799619
Jenkins, retest this please.
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49799923
QA tests have started for PR 1253. This patch merges cleanly.
View progress:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16988/consoleFull
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49811230
QA results for PR 1253:
- This patch PASSES unit tests.
- This patch merges cleanly
- This patch adds no public classes

For more information see test
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/1253#discussion_r15199183
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -290,6 +291,14 @@ private[spark] class SparkSubmitArguments(args:
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/1253#discussion_r15199316
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -363,6 +372,7 @@ private[spark] class SparkSubmitArguments(args:
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/1253#discussion_r15199538
--- Diff: docs/configuration.md ---
@@ -42,13 +42,14 @@ val sc = new SparkContext(new SparkConf())
Then, you can supply configuration values at
Github user sryza commented on a diff in the pull request:
https://github.com/apache/spark/pull/1253#discussion_r15202044
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -290,6 +291,14 @@ private[spark] class SparkSubmitArguments(args:
Github user sryza commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49679081
Updated patch addresses Patrick's feedback
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49679255
QA tests have started for PR 1253. This patch merges cleanly.
View progress:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16934/consoleFull
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49685979
QA results for PR 1253:
- This patch PASSES unit tests.
- This patch merges cleanly
- This patch adds no public classes

For more information see test
Github user mateiz commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49575215
I agree, -D is for JVM options, but these are not arbitrary JVM options.
---
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49577965
Good points. I meant triaging all -D options, but yes, those then have very
'local' semantics.
---
Github user sryza commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49578945
Updated patch to use --conf
---
Github user lianhuiwang commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49542251
How about `-Dspark.app.name=blah`? Because in the JVM and Hadoop, the -D flag
is used for conf properties.
---
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49542435
-D feels more natural indeed; I would expect those args to be passed
through to the JVM as-is. Because that's a way to set these env properties too
right? In fact,
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49570062
IMO `-D` does not have the right semantics here because the user isn't
logically setting java properties for the submission tool, they are setting
spark configuration
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-49536672
Hey @sryza - I did a straw poll offline discussing this with a few other
contributors. The consensus was that it might be better to have a `--conf` flag
with an `=`
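The `--conf` flag with an `=` separator suggested here would split each argument at the first `=`. A minimal shell sketch of that splitting (the function name and exact behavior are illustrative assumptions, not the actual SparkSubmitArguments code):

```shell
# Illustrative sketch only: split a `--conf key=value` pair at the
# first '='. Not the actual SparkSubmitArguments implementation.
split_conf() {
  pair="$1"
  key="${pair%%=*}"     # text before the first '='
  value="${pair#*=}"    # text after the first '='
  printf 'key=%s value=%s\n' "$key" "$value"
}

split_conf "spark.shuffle.spill=false"
# -> key=spark.shuffle.spill value=false
```

Splitting only at the first `=` lets property values themselves contain `=` characters.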
GitHub user sryza opened a pull request:
https://github.com/apache/spark/pull/1253
SPARK-2310. Support arbitrary Spark properties on the command line with
spark-submit
The PR allows invocations like
spark-submit --class org.MyClass --spark.shuffle.spill false
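The command above shows the syntax as originally proposed; per the discussion in this thread, the patch was updated to use a `--conf` flag instead. A sketch of the equivalent call in that final form (echoed so it runs without a Spark installation; `org.MyClass` is the PR description's placeholder and `app.jar` is an assumed application jar name):

```shell
# Sketch only: the --conf form the patch settled on. Echoed so this
# example runs without a Spark installation; class and jar names are
# placeholders.
cmd="spark-submit --class org.MyClass --conf spark.shuffle.spill=false app.jar"
echo "$cmd"
```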
Github user sryza commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-47408737
Verified this on a pseudo-distributed cluster
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-47408893
Merged build triggered.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-47408904
Merged build started.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-47411041
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16222/
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/1253#issuecomment-47411040
Merged build finished. All automated tests passed.
---