Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5131#issuecomment-85424641
[Test build #29072 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29072/consoleFull)
for PR 5131 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5131#issuecomment-85424652
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
GitHub user preaudc opened a pull request:
https://github.com/apache/spark/pull/5165
[SPARK-6469] The YARN driver in yarn-client mode will not use the local
directories configured for YARN
Clarify the local directories usage in YARN
You can merge this pull request into a Git
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5152#issuecomment-85434029
[Test build #29077 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29077/consoleFull)
for PR 5152 at commit
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5165#discussion_r27016084
--- Diff: docs/running-on-yarn.md ---
@@ -274,6 +274,6 @@ If you need a reference to the proper location to put
log files in the YARN so t
# Important
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5134#issuecomment-85443157
[Test build #29080 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29080/consoleFull)
for PR 5134 at commit
GitHub user DoingDone9 opened a pull request:
https://github.com/apache/spark/pull/5166
[SPARK-6493][SQL]Support numeric(a,b) in the sqlContext
Support SQL like:
select cast(20.12 as numeric(4,2)) from src limit 1;
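As a hedged illustration of the intended `numeric(precision, scale)` semantics (this is not the PR's Catalyst implementation; the function name is made up), Python's `decimal` module models the same cast:

```python
from decimal import Decimal, ROUND_HALF_UP

def cast_numeric(value, precision, scale):
    """Model CAST(value AS numeric(precision, scale)): round to `scale`
    fractional digits, then check the total digit count against `precision`."""
    quantized = Decimal(str(value)).quantize(
        Decimal(1).scaleb(-scale), rounding=ROUND_HALF_UP)
    if len(quantized.as_tuple().digits) > precision:
        raise ValueError(f"{quantized} does not fit numeric({precision},{scale})")
    return quantized

print(cast_numeric(20.12, 4, 2))  # 20.12
```

Here `cast(20.12 as numeric(4,2))` succeeds because 20.12 has 4 significant digits with 2 after the point; a value like 123.456 would overflow `numeric(4,2)`.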
You can merge this pull request into a Git repository by
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5168#issuecomment-85473106
[Test build #29085 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29085/consoleFull)
for PR 5168 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5168#issuecomment-85472692
[Test build #29085 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29085/consoleFull)
for PR 5168 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5168#issuecomment-85473110
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5115#discussion_r27023852
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -792,6 +793,11 @@ private[spark] object Utils extends Logging {
lazy val
Github user tgravescs commented on the pull request:
https://github.com/apache/spark/pull/3976#issuecomment-85498686
@andrewor14 for a Python application, if the SPARK_HOME of the submission node is different from the NodeManager's, it can not work in my test.
example: submission node's
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/4909#issuecomment-85482991
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4909#issuecomment-85482978
[Test build #29083 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29083/consoleFull)
for PR 4909 at commit
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/5111
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5167#issuecomment-85489042
[Test build #29084 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29084/consoleFull)
for PR 5167 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5167#issuecomment-85489073
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user tgravescs commented on the pull request:
https://github.com/apache/spark/pull/3976#issuecomment-85502310
So it appears setting SPARK_HOME to anything on the AM and executors works around this error:
spark.yarn.appMasterEnv.SPARK_HOME /bogus
spark.executorEnv.SPARK_HOME
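Written out as `spark-defaults.conf` entries, the workaround looks like this (values are placeholders; the executor value is an assumption, since it is truncated in the comment above):

```
# /bogus is the placeholder path from the comment; any non-empty value suffices.
spark.yarn.appMasterEnv.SPARK_HOME   /bogus
# Assumed analogous for executors; the original comment is cut off here.
spark.executorEnv.SPARK_HOME         /bogus
```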
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5168#issuecomment-85478262
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5168#issuecomment-85478255
[Test build #29086 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29086/consoleFull)
for PR 5168 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5168#issuecomment-85478028
[Test build #29086 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29086/consoleFull)
for PR 5168 at commit
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/5111#issuecomment-85485820
OK I merged this with a couple fixes to the text (e.g. spark - Spark)
---
Github user MechCoder commented on the pull request:
https://github.com/apache/spark/pull/4986#issuecomment-85493045
@mengxr Sorry, I misunderstood your comment before. Should look good now.
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5156#issuecomment-85522670
[Test build #29092 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29092/consoleFull)
for PR 5156 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5168#issuecomment-85481492
[Test build #29087 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29087/consoleFull)
for PR 5168 at commit
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5119#discussion_r27023977
--- Diff: pom.xml ---
@@ -1472,6 +1473,45 @@
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5119#discussion_r27024006
--- Diff: pom.xml ---
@@ -1472,6 +1473,45 @@
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4986#issuecomment-85495571
[Test build #29089 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29089/consoleFull)
for PR 4986 at commit
Github user jongyoul commented on the pull request:
https://github.com/apache/spark/pull/5063#issuecomment-85500449
@elyast @tnachen Do you think `CPUS_PER_TASK` should also support fractional values? If not, I may be able to support executorCores as a fractional value without huge changes.
---
Github user tgravescs commented on the pull request:
https://github.com/apache/spark/pull/5095#issuecomment-85508290
In our local network, the YARN nodes do not know the hostname of the client, so I have to set spark.driver.host to the client's IP address, so the driver will use its
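The setting described above, sketched as a `spark-defaults.conf` entry (the address is a placeholder from the documentation range, not a value from the comment):

```
# Hypothetical value: replace with the client machine's IP, reachable from the YARN nodes.
spark.driver.host   192.0.2.10
```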
Github user yanboliang commented on a diff in the pull request:
https://github.com/apache/spark/pull/4997#discussion_r27030181
--- Diff:
mllib/src/main/scala/org/apache/spark/mllib/api/python/PythonMLLibAPI.scala ---
@@ -111,9 +111,11 @@ private[python] class PythonMLLibAPI
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4997#issuecomment-85511976
[Test build #29090 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29090/consoleFull)
for PR 4997 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5137#issuecomment-85518708
[Test build #29091 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29091/consoleFull)
for PR 5137 at commit
Github user SaintBacchus commented on the pull request:
https://github.com/apache/spark/pull/5152#issuecomment-85492976
@srowen Ok, I will move this `RDD` into the transformation of `coalesce` later.
As for performance, in our scenario we reused the cached RDD tens of
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/5143
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5168#issuecomment-85505612
[Test build #29087 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29087/consoleFull)
for PR 5168 at commit
Github user tgravescs commented on the pull request:
https://github.com/apache/spark/pull/5165#issuecomment-85505673
Thanks for providing the pull request. The text here looks good. I'm +1.
I'll let @srowen finish his review.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5168#issuecomment-85505638
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user hhbyyh commented on the pull request:
https://github.com/apache/spark/pull/4419#issuecomment-85505276
Updated, let me know if this is closer to what you have in mind. Thanks.
---
Github user nchammas commented on the pull request:
https://github.com/apache/spark/pull/5109#issuecomment-85521950
OK sounds good. FWIW, I think it's fine if we don't support the master-spot/slaves-non-spot combo as long as we error out early and cleanly. But if you want to make
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/5136#discussion_r27022067
--- Diff:
core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
@@ -91,7 +90,12 @@ private[spark] class DiskBlockManager(blockManager:
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/5136#discussion_r27022725
--- Diff:
core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
@@ -91,7 +90,12 @@ private[spark] class DiskBlockManager(blockManager:
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5135#issuecomment-85493076
[Test build #29088 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29088/consoleFull)
for PR 5135 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5156#issuecomment-85478833
[Test build #29081 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29081/consoleFull)
for PR 5156 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5156#issuecomment-85478857
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5165#issuecomment-85481290
[Test build #29082 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29082/consoleFull)
for PR 5165 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5165#issuecomment-85481308
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/4588#issuecomment-85417800
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5167#issuecomment-85465161
[Test build #29084 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29084/consoleFull)
for PR 5167 at commit
Github user redbaron commented on the pull request:
https://github.com/apache/spark/pull/5096#issuecomment-85445464
I wonder how it deals with vcore management. One executor with 1 allocated core can execute one task; if this task is SparkR then even if it allocates 1 vcore, R
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5156#discussion_r27016697
--- Diff: bin/compute-classpath.sh ---
@@ -25,17 +25,19 @@ FWDIR=$(cd `dirname $0`/..; pwd)
. $FWDIR/bin/load-spark-env.sh
-if [ -n
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5165#issuecomment-85453961
[Test build #29079 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29079/consoleFull)
for PR 5165 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5165#issuecomment-85453966
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
GitHub user yanboliang opened a pull request:
https://github.com/apache/spark/pull/5167
LogisticRegressionWithLBFGS.run(input, initialWeights) should initialize
numFeatures
LogisticRegressionWithLBFGS.run(input, initialWeights) should initialize
numFeatures
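The intent of the fix can be sketched in Python (hypothetical names and shapes; the real change lives in GeneralizedLinearAlgorithm.scala):

```python
def run(data, initial_weights, num_features=-1):
    """Sketch: when the caller supplies initial weights, derive numFeatures
    from their length instead of leaving it uninitialized (-1)."""
    if num_features < 0:
        num_features = len(initial_weights)
    # Downstream validation can now rely on num_features being set.
    assert all(len(x) == num_features for x in data), "feature dimension mismatch"
    return num_features

print(run([[1.0, 2.0], [3.0, 4.0]], [0.0, 0.0]))  # 2
```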
You can merge this
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5167#discussion_r27020882
--- Diff:
mllib/src/main/scala/org/apache/spark/mllib/regression/GeneralizedLinearAlgorithm.scala
---
@@ -211,6 +211,10 @@ abstract class
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5134#issuecomment-85466952
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/5164#issuecomment-85427681
Most of this pull request is wrong, including introducing a logic error.
The bits that might be of marginal value are probably too trivial to bother
with, like using
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/5136#discussion_r27014378
--- Diff:
core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
@@ -91,7 +90,12 @@ private[spark] class DiskBlockManager(blockManager:
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5163#issuecomment-85430073
[Test build #29075 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29075/consoleFull)
for PR 5163 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5163#issuecomment-85430123
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user preaudc commented on the pull request:
https://github.com/apache/spark/pull/5165#issuecomment-85435545
I don't know if it is normal that all commits show up in the PR; I'm not yet familiar with this tool... Anyway, only the last one is relevant for this PR, and that's
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5156#issuecomment-85445863
[Test build #29081 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29081/consoleFull)
for PR 5156 at commit
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/5136#discussion_r27019370
--- Diff:
core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
@@ -91,7 +90,12 @@ private[spark] class DiskBlockManager(blockManager:
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5166#issuecomment-85462436
Can one of the admins verify this patch?
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4588#issuecomment-85417764
[Test build #29073 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29073/consoleFull)
for PR 4588 at commit
Github user zsxwing commented on the pull request:
https://github.com/apache/spark/pull/4588#issuecomment-85422331
Jenkins, retest this please.
---
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5164#discussion_r27013631
--- Diff: core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala ---
@@ -17,36 +17,28 @@
package org.apache.spark.rdd
+import
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5164#discussion_r27013679
--- Diff: core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala ---
@@ -99,27 +93,27 @@ private[spark] class HadoopPartition(rddId: Int, idx:
Int,
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5136#discussion_r27014574
--- Diff:
core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
@@ -91,7 +90,12 @@ private[spark] class DiskBlockManager(blockManager:
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5165#issuecomment-85436539
[Test build #29079 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29079/consoleFull)
for PR 5165 at commit
Github user nyaapa commented on a diff in the pull request:
https://github.com/apache/spark/pull/5115#discussion_r27019243
--- Diff:
core/src/main/scala/org/apache/spark/deploy/LocalSparkCluster.scala ---
@@ -53,7 +53,7 @@ class LocalSparkCluster(
/* Start the Master */
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4588#issuecomment-85423410
[Test build #29078 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29078/consoleFull)
for PR 4588 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5161#issuecomment-85423312
[Test build #29070 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29070/consoleFull)
for PR 5161 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5161#issuecomment-85423321
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/5156#issuecomment-85445388
ok to test
---
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/5150#issuecomment-85451838
(Add `[SQL]` to the title please to sort it properly. It's borderline
whether this should have a JIRA, but it's pretty much the same to describe as
to fix.)
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4588#issuecomment-85451891
[Test build #29078 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29078/consoleFull)
for PR 4588 at commit
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5136#discussion_r27020752
--- Diff:
core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
@@ -91,7 +90,12 @@ private[spark] class DiskBlockManager(blockManager:
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5152#issuecomment-85434041
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
Github user zhpengg closed the pull request at:
https://github.com/apache/spark/pull/4909
---
Github user zhpengg commented on the pull request:
https://github.com/apache/spark/pull/4909#issuecomment-85455264
OK, I'll close this PR.
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4909#issuecomment-85455274
[Test build #29083 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29083/consoleFull)
for PR 4909 at commit
GitHub user lianhuiwang opened a pull request:
https://github.com/apache/spark/pull/5168
[SPARK-5763][Core]add Sort-Merge Join to resolve skewed data
Add sort-merge join to resolve skewed data.
I provide three interfaces to implement the join operator using SortMergeJoinRDD; there
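Independent of the PR's actual SortMergeJoinRDD code (which is not shown here), the general sort-merge join technique it names can be sketched in Python:

```python
def sort_merge_join(left, right):
    """Inner-join two lists of (key, value) pairs: sort both sides by key,
    then merge with two cursors, emitting the cross product per key group.
    Memory use is bounded by the largest single key group, which is why
    the technique helps with skewed data compared to a hash join."""
    left, right = sorted(left), sorted(right)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        lk, rk = left[i][0], right[j][0]
        if lk < rk:
            i += 1
        elif lk > rk:
            j += 1
        else:
            # Find the extent of the matching key group on each side.
            i2, j2 = i, j
            while i2 < len(left) and left[i2][0] == lk:
                i2 += 1
            while j2 < len(right) and right[j2][0] == lk:
                j2 += 1
            for li in range(i, i2):
                for rj in range(j, j2):
                    out.append((lk, (left[li][1], right[rj][1])))
            i, j = i2, j2
    return out

print(sort_merge_join([(1, 'a'), (2, 'b'), (2, 'c')], [(2, 'x'), (3, 'y')]))
# [(2, ('b', 'x')), (2, ('c', 'x'))]
```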
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5162#issuecomment-85423411
[Test build #29069 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29069/consoleFull)
for PR 5162 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/5162#issuecomment-85423424
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5136#discussion_r27013388
--- Diff:
core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
@@ -91,7 +90,12 @@ private[spark] class DiskBlockManager(blockManager:
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/5145
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/5136#discussion_r27018077
--- Diff:
core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
@@ -91,7 +90,12 @@ private[spark] class DiskBlockManager(blockManager:
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/4588#issuecomment-85451900
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5134#issuecomment-85466931
[Test build #29080 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29080/consoleFull)
for PR 5134 at commit
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5164#discussion_r27013777
--- Diff: core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala ---
@@ -231,11 +231,11 @@ class HadoopRDD[K, V](
var reader: RecordReader[K, V] =
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5164#discussion_r27013742
--- Diff: core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala ---
@@ -185,8 +181,10 @@ class HadoopRDD[K, V](
// done in each local process.
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5164#discussion_r27013801
--- Diff:
core/src/main/scala/org/apache/spark/rdd/ZippedPartitionsRDD.scala ---
@@ -60,7 +60,7 @@ private[spark] abstract class ZippedPartitionsBaseRDD[V:
Github user steveloughran commented on the pull request:
https://github.com/apache/spark/pull/5119#issuecomment-85448943
Updated patch with the indentation corrected; plugin version entrusted to the Apache parent template
---
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/5152#issuecomment-85450894
This seems like a lot of extra code to add; can we quantify the performance
difference at all? I agree with @sryza that it sounds like one of those things
worth doing
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5136#discussion_r27018550
--- Diff:
core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
@@ -91,7 +90,12 @@ private[spark] class DiskBlockManager(blockManager:
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5165#issuecomment-85453449
[Test build #29082 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29082/consoleFull)
for PR 5165 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/5162#issuecomment-85711127
[Test build #29107 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29107/consoleFull)
for PR 5162 at commit
Github user mengxr commented on a diff in the pull request:
https://github.com/apache/spark/pull/5173#discussion_r27079067
--- Diff: python/pyspark/mllib/clustering.py ---
@@ -168,8 +168,8 @@ def predictSoft(self, x):
if isinstance(x, RDD):
means,
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4027#issuecomment-85718866
[Test build #29110 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/29110/consoleFull)
for PR 4027 at commit