Github user srowen closed the pull request at:
https://github.com/apache/spark/pull/926
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44941811
Got it. I'm going to close this, and will open a PR for the JIRA you may
have in mind (https://issues.apache.org/jira/browse/SPARK-1906) to just set a
spark.master
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44899788
Hey Sean,
I'm still a little confused about what it is you're doing. What is the
javadoc you refer to? I've looked at a few classes in org.apache.spark.examples
Github user syedhashmi commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44899931
@vanzin : I was running into this issue while trying to run and debug
examples from IntelliJ. It was working fine pre-1.0.
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44900217
Ah, that case makes sense. For that I think Sean's current fix should be
enough. But still, if there's documentation telling users to run examples that
way, it should
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44901237
@vanzin I was just copying-and-pasting the substance of the main() method
from an example to modify it. You are right that most examples tell you to use
`run-example` and
Github user mateiz commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44910051
I don't think we want to change the examples like this. Instead, we should
make the default spark.master be local rather than throwing an exception (I
believe there was a
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44744710
Yeah, I think it's essential not to prevent `-Dspark.master=...` from
working, oops. I think it may be useful to have this work when one
copies-and-pastes too, as I just did.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44746502
Merged build triggered.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44746504
Merged build started.
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44746537
I pushed again, with `setIfMissing`. Is it better in the `SparkConf`
constructor, or am I off base here?
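For reference, `setIfMissing` writes a value only when the key is absent, so anything the user supplied earlier wins. A minimal self-contained stand-in for those semantics (`MiniConf` is a hypothetical illustration, not Spark's implementation):

```scala
// MiniConf: hypothetical stand-in for SparkConf's setIfMissing semantics.
// set() always writes; setIfMissing() writes only when the key is absent,
// so an externally supplied value is never clobbered by a default.
class MiniConf {
  private val settings = scala.collection.mutable.Map.empty[String, String]

  def set(key: String, value: String): MiniConf = {
    settings(key) = value
    this
  }

  def setIfMissing(key: String, value: String): MiniConf = {
    if (!settings.contains(key)) settings(key) = value
    this
  }

  def get(key: String): Option[String] = settings.get(key)
}
```

With this behavior, a `setIfMissing("spark.master", "local[*]")` in an example's `main()` would leave a master chosen by `spark-submit` or `-Dspark.master=...` untouched.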
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44747315
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/15321/
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44747313
Merged build finished.
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44753603
Hey Sean, how are you running the examples? Are you using the `run-example`
script? That script should set the master to `local[*]` if the user hasn't
specified it,
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44753736
I'm just copying-and-pasting to get something similar running externally.
Maybe it's a little surprising that the example code doesn't work that way --
being in a `main()`
GitHub user srowen opened a pull request:
https://github.com/apache/spark/pull/926
SPARK-1974. Most examples fail at startup because spark.master is not set
Most example code has a few lines like:
```
val sparkConf = new SparkConf().setAppName("Foo")
val sc = new SparkContext(sparkConf)
```
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44703759
Merged build started.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44703748
Merged build triggered.
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44704211
Won't this break things if you try to submit the examples with spark-submit
(which I think is the New And Approved Way (tm))?
spark-submit will set spark.master,
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44704420
Hm, good question. You could make it work with `-Dspark.master=...`; that
would overwrite those settings. It seems like the examples are intended to work
without that
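The precedence being discussed can be sketched without Spark itself: an externally supplied `-Dspark.master=...` system property should win, and an example falls back to a local master only when nothing was set (`ResolveMaster` is a hypothetical illustration; `spark.master` is the real key, and `local[*]` mirrors the fallback `run-example` uses):

```scala
// ResolveMaster: hypothetical sketch of the precedence under discussion.
// A -Dspark.master=... system property (or a value injected by
// spark-submit) takes priority; otherwise fall back to a local master
// so the example runs out of the box.
object ResolveMaster {
  def resolve(): String =
    sys.props.getOrElse("spark.master", "local[*]")

  def main(args: Array[String]): Unit =
    println(resolve())
}
```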
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44706840
Merged build finished.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44706841
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/15309/
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/926#issuecomment-44713032
If you really want to keep the tests working outside of spark-submit, I'd
suggest using the SparkContext(SparkConf) constructor instead, and using
SparkConf.setIfMissing()