Github user iven closed the pull request at:
https://github.com/apache/spark/pull/60
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
Github user iven commented on the pull request:
https://github.com/apache/spark/pull/60#issuecomment-75344185
@srowen It's OK. I'm not currently using Spark.
---
Github user iven commented on the pull request:
https://github.com/apache/spark/pull/1969#issuecomment-53830594
@liancheng @andrewor14 Thanks, it works! I'm closing this.
---
Github user iven closed the pull request at:
https://github.com/apache/spark/pull/1969
---
Github user iven commented on a diff in the pull request:
https://github.com/apache/spark/pull/60#discussion_r16817226
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
---
@@ -70,8 +71,16 @@ private[spark] class
Github user iven commented on the pull request:
https://github.com/apache/spark/pull/1969#issuecomment-53665867
@liancheng Yes. Although I'm using `spark-submit`, not `spark-shell`.
---
Github user iven commented on the pull request:
https://github.com/apache/spark/pull/60#issuecomment-53515593
I'll push a new working version against master later this week.
---
Github user iven commented on the pull request:
https://github.com/apache/spark/pull/1860#issuecomment-53228456
@MartinWeindel I think you should check if there's enough memory in the
offer first.
---
Github user iven commented on the pull request:
https://github.com/apache/spark/pull/1969#issuecomment-52388625
@JoshRosen I'm using Spark 1.0.1 with Mesos. If I don't specify SPARK_HOME
in the driver, Mesos executors will be LOST with an error like:
```
sh: /root/spark_master/sbin
```
Github user iven commented on the pull request:
https://github.com/apache/spark/pull/1969#issuecomment-52388869
@andrewor14 OK. I'll update the patch once we confirm this PR is necessary.
---
GitHub user iven opened a pull request:
https://github.com/apache/spark/pull/1969
Use user defined $SPARK_HOME in spark-submit if possible
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/iven/spark spark-home
Alternatively you
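The PR description above is truncated, but its title describes a fallback: use a user-defined $SPARK_HOME when one is set, otherwise derive a default. A minimal sketch of that behavior (not the actual spark-submit code; all paths are illustrative, not taken from the thread):

```shell
#!/bin/sh
# Hedged sketch of the fallback the PR title describes: honor a
# user-set SPARK_HOME, otherwise derive one from this script's own
# location. Not the real spark-submit implementation.
SPARK_HOME="${SPARK_HOME:-$(cd "$(dirname "$0")/.." && pwd)}"
export SPARK_HOME
echo "Using SPARK_HOME=$SPARK_HOME"
```

The `${VAR:-default}` parameter expansion is what makes "if possible" work: an explicitly exported SPARK_HOME wins, and the computed path is only a fallback.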
Github user iven commented on the pull request:
https://github.com/apache/spark/pull/60#issuecomment-40677116
I ran SparkPi several times and got different results:
1. With 200 slices, the executors all succeed, and the framework gives the
correct result.
2. With 400 slices
Github user iven commented on the pull request:
https://github.com/apache/spark/pull/60#issuecomment-40280398
I've finally got this working, and fixed several bugs in the original PR.
It's really hard to get Spark (0.9 and higher) working on Mesos. Here are
some notes
Github user iven commented on the pull request:
https://github.com/apache/spark/pull/60#issuecomment-40282298
I've no idea why the test fails.
---
Github user iven commented on the pull request:
https://github.com/apache/spark/pull/60#issuecomment-40300282
Oh, sorry. The Spark version limit is not necessary. I retested master with
the right Hadoop version, and it works.
---