[GitHub] spark pull request: [SPARK-6350][Mesos] Make mesosExecutorCores co...

2015-03-25 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/5063#issuecomment-86113945 @jongyoul looks good to me, thanks :) --- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well.

[GitHub] spark pull request: [SPARK-6350][Mesos] Make mesosExecutorCores co...

2015-03-24 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/5063#issuecomment-85594000 @jongyoul @tnachen imho it's fine if CPU_PER_TASK remains an Integer; however, if your job is less CPU-intensive, it might be beneficial to optimize it, although I

[GitHub] spark pull request: [SPARK-6350][Mesos] Make mesosExecutorCores co...

2015-03-23 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/5063#issuecomment-85002672 @sryza you can request a fraction of a CPU from Mesos; however, I hadn't realized that we have the wrong type in this patch, we should change it to Double instead of Int
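The Double-vs-Int point above can be sketched outside Spark. This is a hypothetical stand-in for the real config plumbing, not the patch itself (the object name, helper, and Map-based config are assumptions):

```scala
object ExecutorCoresSketch {
  // Hypothetical sketch: parse spark.mesos.mesosExecutorCores as a Double
  // so fractional CPU shares (e.g. "0.5") can be requested from Mesos.
  // An Int-typed setting would forbid anything below a whole core.
  def executorCores(conf: Map[String, String]): Double =
    conf.get("spark.mesos.mesosExecutorCores").map(_.toDouble).getOrElse(1.0)

  def main(args: Array[String]): Unit = {
    // A fractional value parses cleanly...
    assert(executorCores(Map("spark.mesos.mesosExecutorCores" -> "0.5")) == 0.5)
    // ...and the default stays at one full core when the key is absent.
    assert(executorCores(Map.empty) == 1.0)
    println("ok")
  }
}
```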

[GitHub] spark pull request: [SPARK-6350][Mesos] Make mesosExecutorCores co...

2015-03-23 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/5063#issuecomment-85336463 @jongyoul I can try to test it out with CPU allocation set to 0, but I cannot promise it will work; otherwise it just doesn't make much sense imho. I'd discuss with project

[GitHub] spark pull request: [SPARK-6350][Mesos] Make mesosExecutorCores co...

2015-03-22 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/5063#issuecomment-84733162 @sryza I don't think it actually needs more than a single core; the issue is that you cannot give less than 1 CPU.

[GitHub] spark pull request: [SPARK-6350][Mesos] Make mesosExecutorCores co...

2015-03-20 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/5063#issuecomment-84059146 Hi @sryza @jongyoul, to give an illustration of this, let's say I have 10 nodes, 64 cores each, and let's say 10 streaming jobs are running with a 1-minute window (so every
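To flesh out the (truncated) illustration above, under stated assumptions — one executor per job on every node, each pinned at the configured core count — the standing reservation works out as:

```scala
object CoreReservationSketch {
  // Assumed scenario from the comment above: 10 nodes with 64 cores each and
  // 10 long-running streaming jobs, one executor per job on every node.
  // (The per-node/per-job layout is an assumption filled in for illustration.)
  def reservedCores(nodes: Int, jobs: Int, coresPerExecutor: Double): Double =
    nodes * jobs * coresPerExecutor

  def main(args: Array[String]): Unit = {
    // With a whole-core minimum, 100 of the cluster's 640 cores stay
    // reserved even while the streaming executors sit idle between windows.
    assert(reservedCores(10, 10, 1.0) == 100.0)
    // A fractional setting of 0.5 halves the standing reservation.
    assert(reservedCores(10, 10, 0.5) == 50.0)
    println("ok")
  }
}
```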

[GitHub] spark pull request: [SPARK-6350][Mesos] Make mesosExecutorCores co...

2015-03-18 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/5063#issuecomment-83225590 We can also consider more descriptive config options, like spark.mesos.noOfAllocatedCoresPerExecutor ...

[GitHub] spark pull request: [SPARK-6350][Mesos] Make mesosExecutorCores co...

2015-03-17 Thread elyast
Github user elyast commented on a diff in the pull request: https://github.com/apache/spark/pull/5063#discussion_r26575818 --- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala --- @@ -67,6 +67,8 @@ private[spark] class

[GitHub] spark pull request: [SPARK-5376][Mesos] MesosExecutor should have ...

2015-03-16 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/4170#issuecomment-82104142 cool, thanks

[GitHub] spark pull request: [SPARK-5376][Mesos] MesosExecutor should have ...

2015-03-13 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/4170#issuecomment-79046678 Sure, it's totally fine not to share, but at least it should be possible to configure the allocation. Allocating 1 CPU per executor may just be too much; obviously it depends how

[GitHub] spark pull request: [SPARK-5376][Mesos] MesosExecutor should have ...

2015-03-12 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/4170#issuecomment-78799120 One comment, however: if you run multiple Spark applications, even though executor-id == slave-id, multiple executors can be started on the same host. (And every one of them

[GitHub] spark pull request: SPARK-2168 [Spark core] Use relative URIs for ...

2015-02-26 Thread elyast
Github user elyast commented on a diff in the pull request: https://github.com/apache/spark/pull/4778#discussion_r25488122 --- Diff: core/src/test/scala/org/apache/spark/deploy/history/HistoryServerSuite.scala --- @@ -0,0 +1,53 @@ +/* + * Licensed to the Apache Software

[GitHub] spark pull request: SPARK-2168 [Spark core] Use relative URIs for ...

2015-02-26 Thread elyast
Github user elyast commented on a diff in the pull request: https://github.com/apache/spark/pull/4778#discussion_r25488124 --- Diff: core/src/test/scala/org/apache/spark/deploy/history/HistoryServerSuite.scala --- @@ -0,0 +1,53 @@ +/* + * Licensed to the Apache Software

[GitHub] spark pull request: SPARK-2168 [Spark core] Use relative URIs for ...

2015-02-25 Thread elyast
GitHub user elyast opened a pull request: https://github.com/apache/spark/pull/4778 SPARK-2168 [Spark core] Use relative URIs for the app links in the History Server. As agreed in PR #1160, adding a test to verify that the history server generates relative links to applications.

[GitHub] spark pull request: SPARK-2168 [Spark core] Use relative URIs for ...

2015-02-09 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/1160#issuecomment-73587979 Fine with me, I will add tests on master with a new PR

[GitHub] spark pull request: SPARK-2168 [Spark core] Use relative URIs for ...

2015-02-09 Thread elyast
Github user elyast closed the pull request at: https://github.com/apache/spark/pull/1160

[GitHub] spark pull request: SPARK-2168 [Spark core] Use relative URIs for ...

2014-12-24 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/1160#issuecomment-68077764 @andrewor14 I think it's been fixed on the master branch, so if you don't want to release a maintenance release for 1.0.x then I would suggest closing it.

[GitHub] spark pull request: [MLLIB] [spark-2352] Implementation of an 1-hi...

2014-09-06 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/1290#issuecomment-54734035 Hi @avulanov, I ran your NeuralNetworkSuite (from your fork, neuralnetwork branch); however, it fails randomly. Are you sure you have implemented it correctly

[GitHub] spark pull request: SPARK-2168 [Spark core] Use relative URIs for ...

2014-08-25 Thread elyast
Github user elyast commented on a diff in the pull request: https://github.com/apache/spark/pull/1160#discussion_r16685916 --- Diff: core/src/test/scala/org/apache/spark/deploy/history/HistoryServerSuite.scala --- @@ -0,0 +1,52 @@ +/* + * Licensed to the Apache Software

[GitHub] spark pull request: SPARK-2168 [Spark core] History Server rendered...

2014-06-26 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/1160#issuecomment-47294490 I have added a more descriptive title

[GitHub] spark pull request: SPARK-2168 [Spark core] History Server rendered...

2014-06-26 Thread elyast
Github user elyast commented on the pull request: https://github.com/apache/spark/pull/1160#issuecomment-47294545 so should I open another PR for the master branch?

[GitHub] spark pull request: SPARK-2168 Spark core

2014-06-20 Thread elyast
GitHub user elyast opened a pull request: https://github.com/apache/spark/pull/1160 SPARK-2168 Spark core. Removing the full URI, leaving only a relative path in the link to the completed application, plus a unit test.