Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1241#issuecomment-48569956
Actually it's still necessary once the PR gets merged. It basically
compares all the public interfaces at bytecode level against 1.0, and sees if
any public inter
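The comment above describes a binary-compatibility check that compares the public API surface against the 1.0 release. Spark's actual check works at the bytecode level (via MiMa); the sketch below is a simplified signature-level version of the same idea, with all names and data purely illustrative.

```python
# Toy binary-compatibility check: compare the public API surface of a
# "current" build against a "baseline" release and report anything removed.
# The real Spark check inspects bytecode; this sketch compares signature
# strings only. All names here are illustrative, not Spark's tooling.

def find_breaking_changes(baseline, current):
    """baseline/current: dict mapping class name -> set of public signatures."""
    breaks = []
    for cls, sigs in baseline.items():
        if cls not in current:
            breaks.append(f"missing class: {cls}")
            continue
        for sig in sorted(sigs - current[cls]):
            breaks.append(f"missing member: {cls}.{sig}")
    return breaks

baseline = {"RDD": {"map(f)", "filter(f)", "count()"}}
current  = {"RDD": {"map(f)", "count()"}}          # filter(f) was removed
print(find_breaking_changes(baseline, current))    # -> ['missing member: RDD.filter(f)']
```

A real checker would also flag changed return types and visibility changes, which requires the bytecode-level view MiMa provides.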
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1351#discussion_r14751335
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/csv/CsvRDD.scala ---
@@ -0,0 +1,91 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1351#discussion_r14751361
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala ---
@@ -130,6 +131,47 @@ class SQLContext(@transient val sparkContext:
SparkContext
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1352#issuecomment-48570213
Merging in master. Thanks!
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1350#issuecomment-48570288
Jenkins, test this please.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1346#issuecomment-48570378
@yhuai I haven't looked at the changes yet, but can you make sure the end
API is usable in Java?
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1351#discussion_r14752632
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/csv/CsvRDD.scala ---
@@ -0,0 +1,91 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1351#discussion_r14752678
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala ---
@@ -130,6 +131,47 @@ class SQLContext(@transient val sparkContext:
SparkContext
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1351#discussion_r14753118
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/csv/CsvRDD.scala ---
@@ -0,0 +1,91 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1351#discussion_r14753240
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala ---
@@ -130,6 +131,47 @@ class SQLContext(@transient val sparkContext:
SparkContext
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1351#discussion_r14753286
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala ---
@@ -130,6 +131,47 @@ class SQLContext(@transient val sparkContext:
SparkContext
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1351#discussion_r14753321
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala ---
@@ -130,6 +131,47 @@ class SQLContext(@transient val sparkContext:
SparkContext
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1351#discussion_r14753430
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/csv/CsvRDD.scala ---
@@ -0,0 +1,91 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1351#discussion_r14753602
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/csv/CsvTokenizer.scala ---
@@ -0,0 +1,118 @@
+/*
+ * Licensed to the Apache Software Foundation
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1350#issuecomment-48576867
Thanks - merging this in master.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1238#issuecomment-48577069
If this is no longer WIP, can you change the title to make it more
descriptive? At the very least I wouldn't include the word "Prototype".
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1346#discussion_r14754430
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/types/dataTypes.scala
---
@@ -93,47 +92,56 @@ abstract class DataType
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1346#discussion_r14754724
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SchemaRDDLike.scala
---
@@ -123,9 +123,12 @@ private[sql] trait SchemaRDDLike {
def saveAsTable
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1353#discussion_r14755685
--- Diff:
examples/src/main/scala/org/apache/spark/examples/SparkPageRank.scala ---
@@ -31,8 +31,12 @@ import org.apache.spark.{SparkConf, SparkContext
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1353#issuecomment-48640422
Jenkins, test this please.
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1744#discussion_r15780210
--- Diff: dev/lint-python ---
@@ -0,0 +1,63 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1744#discussion_r15780205
--- Diff: dev/lint-python ---
@@ -0,0 +1,63 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
GitHub user rxin opened a pull request:
https://github.com/apache/spark/pull/1772
[SPARK-2323] Exception in accumulator update should not crash DAGScheduler
& SparkContext
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1772#issuecomment-51132429
This doesn't actually weaken the semantics because previously it would just
crash. Given the 1.1 timeline, it is best to use the current semantics, which
is reall
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51132704
Haha, I don't know if it responds to such complex phrases.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51132720
Jenkins, retest this please.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1772#issuecomment-51134330
Jenkins, retest this please.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1772#issuecomment-51141863
Jenkins, retest this please.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1254#issuecomment-51142915
I'm not sure what's happening. Maybe Jenkins is lazy today. We can retry
tomorrow, and if it doesn't work, create a new PR.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1772#issuecomment-51146388
Merging this in master. Thanks.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1772#issuecomment-51146442
Actually I merged it in master, branch-1.0. and branch-1.1.
GitHub user rxin opened a pull request:
https://github.com/apache/spark/pull/1780
[SPARK-2856] Decrease initial buffer size for Kryo to 64KB.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/rxin/spark kryo-init-size
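A small initial Kryo buffer is cheap because Kryo grows its serialization buffer on demand (roughly doubling) up to a configured maximum, so starting at 64 KB only costs a few re-allocations for large records while saving memory in the common case of many small ones. The function below mimics that growth policy; it is an illustrative sketch, not Kryo's actual implementation.

```python
# Sketch of a doubling buffer-growth policy like Kryo's: start small,
# double until the record fits, fail past the configured maximum.
# Illustrative only; not Kryo's real resizing code.

def grown_buffer_size(initial_kb, record_kb, max_kb):
    size = initial_kb
    while size < record_kb:
        size *= 2
        if size > max_kb:
            raise ValueError("record exceeds max buffer size")
    return size

print(grown_buffer_size(64, 50, 64 * 1024))    # small record: stays at 64 (KB)
print(grown_buffer_size(64, 1000, 64 * 1024))  # ~1 MB record: 64 -> 128 -> ... -> 1024
```

The upshot: workloads with small records pay nothing extra, and large-record workloads pay only a handful of doublings.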
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1648#discussion_r15796222
--- Diff:
sql/hive/compatibility/src/test/scala/org/apache/spark/sql/hive/execution/HiveCompatibilitySuite.scala
---
@@ -38,6 +39,7 @@ class
GitHub user rxin opened a pull request:
https://github.com/apache/spark/pull/1781
[SPARK-2503] Lower shuffle output buffer (spark.shuffle.file.buffer.kb) to
32KB.
This can substantially reduce memory usage during shuffle.
You can merge this pull request into a Git repository
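To see why lowering the shuffle output buffer reduces memory, note that each running map task can hold one buffered output stream per reduce partition, so buffer memory scales roughly as cores × reduce partitions × buffer size. The numbers below are illustrative back-of-the-envelope figures, not measurements from the PR (the 100 KB starting point is an assumed prior setting).

```python
# Back-of-the-envelope memory cost of shuffle output buffers:
# one buffered stream per reduce partition per concurrently running task.
# Illustrative numbers only.

def shuffle_buffer_mb(cores, reduce_partitions, buffer_kb):
    return cores * reduce_partitions * buffer_kb / 1024

print(shuffle_buffer_mb(8, 1000, 100))  # 100 KB buffers: 781.25 MB
print(shuffle_buffer_mb(8, 1000, 32))   # 32 KB buffers:  250.0 MB
```

With many reducers the savings are hundreds of megabytes per executor, which matches the "substantially reduce memory usage" claim in the PR description.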
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1780#discussion_r15797592
--- Diff:
core/src/main/scala/org/apache/spark/serializer/KryoSerializer.scala ---
@@ -47,7 +47,9 @@ class KryoSerializer(conf: SparkConf)
with Logging
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51164849
Maybe it was started before that fix got merged. Let's run this again.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51164855
Jenkins, retest this please.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1780#issuecomment-51164898
Thanks. Merging in master. @andrewor14 if you feel strongly about it, I can
push a commit to add a one-line comment.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1782#issuecomment-51165308
Jenkins, test this please.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1648#issuecomment-51171274
Jenkins, retest this please.
GitHub user rxin opened a pull request:
https://github.com/apache/spark/pull/1784
Set Spark SQL Hive compatibility test shuffle partitions to 2.
This should improve test runtime because the majority of the test runtime is
scheduling and task overhead.
You can merge this pull request
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1792#discussion_r15840899
--- Diff: core/src/main/scala/org/apache/spark/rdd/JdbcRDD.scala ---
@@ -70,20 +70,30 @@ class JdbcRDD[T: ClassTag](
override def compute(thePart
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1792#discussion_r15843358
--- Diff: core/src/main/scala/org/apache/spark/rdd/JdbcRDD.scala ---
@@ -106,7 +106,7 @@ class JdbcRDD[T: ClassTag](
case e: Exception => logWarn
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51263909
He's probably en route somewhere because he woke up early this morning to
catch a flight...
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51263913
Jenkins, retest this please.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51263924
Jenkins, add to whitelist.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1758#issuecomment-51270610
Jenkins, retest this please.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51273157
I think Jenkins is sick ... we can keep trying ...
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51273180
Jenkins, retest this please.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1781#issuecomment-51274284
Thanks. Merging in master & branch-1.1.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1792#issuecomment-51277474
Jenkins, test this please.
GitHub user rxin opened a pull request:
https://github.com/apache/spark/pull/1794
Tighten the visibility of various SQLConf methods and renamed setter/getters
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/rxin/spark sql-conf
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1792#issuecomment-51282959
Thanks. Merging this in master & branch-1.0 and branch-1.1.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1787#issuecomment-51283209
Jenkins, test this please.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1786#issuecomment-51285191
Jenkins, test this please.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1794#issuecomment-51289764
@concretevitamin
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51367875
This looks good to me. What do you think, @JoshRosen / @davies ?
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51373160
The line length is already 100 in this PR.
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1799#discussion_r15897877
--- Diff: core/src/main/scala/org/apache/spark/SparkEnv.scala ---
@@ -246,8 +250,13 @@ object SparkEnv extends Logging
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1799#discussion_r15898020
--- Diff:
core/src/main/scala/org/apache/spark/shuffle/sort/SortShuffleWriter.scala ---
@@ -54,87 +55,36 @@ private[spark] class SortShuffleWriter[K, V, C
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1799#discussion_r15898462
--- Diff:
core/src/main/scala/org/apache/spark/util/collection/ExternalSorter.scala ---
@@ -640,9 +713,122 @@ private[spark] class ExternalSorter[K, V, C
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1799#discussion_r15898826
--- Diff:
core/src/main/scala/org/apache/spark/util/collection/ExternalSorter.scala ---
@@ -640,9 +713,122 @@ private[spark] class ExternalSorter[K, V, C
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1799#discussion_r15898996
--- Diff:
core/src/main/scala/org/apache/spark/util/collection/ExternalSorter.scala ---
@@ -120,6 +128,18 @@ private[spark] class ExternalSorter[K, V, C
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51388197
Ok I'm merging this in master. Thanks, @nchammas.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1744#issuecomment-51388412
And branch-1.1 too.
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1758#issuecomment-51388657
It actually failed deterministically on Josh's laptop. It could be a bug
somewhere in the configuration or a race condition in Spark.
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1805#discussion_r15899974
--- Diff: pom.xml ---
@@ -143,11 +143,11 @@
- maven-repo
+ central
--- End diff --
any reason we call
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1689#discussion_r15900503
--- Diff: core/src/main/scala/org/apache/spark/Partitioner.scala ---
@@ -113,8 +113,13 @@ class RangePartitioner[K : Ordering : ClassTag, V](
private var
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1689#discussion_r15919352
--- Diff: core/src/main/scala/org/apache/spark/Partitioner.scala ---
@@ -222,7 +228,8 @@ class RangePartitioner[K : Ordering : ClassTag, V
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1689#discussion_r15919599
--- Diff: core/src/main/scala/org/apache/spark/Partitioner.scala ---
@@ -113,8 +113,12 @@ class RangePartitioner[K : Ordering : ClassTag, V](
private var
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1812#discussion_r15919681
--- Diff: core/src/main/scala/org/apache/spark/rdd/RDD.scala ---
@@ -1004,7 +1004,7 @@ abstract class RDD[T: ClassTag](
},
(h1
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1824#issuecomment-51434585
What is this pull request for? Do you mind closing it?
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/1825#issuecomment-51434809
Jenkins, add to whitelist.
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919794
--- Diff: core/src/main/scala/org/apache/spark/HttpServer.scala ---
@@ -50,7 +50,7 @@ private[spark] class HttpServer(resourceBase: File,
securityManager
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919800
--- Diff: core/src/main/scala/org/apache/spark/SecurityManager.scala ---
@@ -146,7 +146,7 @@ private[spark] class SecurityManager(sparkConf:
SparkConf) extends
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919803
--- Diff: core/src/main/scala/org/apache/spark/SecurityManager.scala ---
@@ -170,8 +170,8 @@ private[spark] class SecurityManager(sparkConf:
SparkConf) extends
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919807
--- Diff: core/src/main/scala/org/apache/spark/SecurityManager.scala ---
@@ -179,8 +179,8 @@ private[spark] class SecurityManager(sparkConf:
SparkConf) extends
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919824
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -810,7 +810,7 @@ class SparkContext(config: SparkConf) extends Logging
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919835
--- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
@@ -167,7 +167,7 @@ private[spark] class Executor(
SparkEnv.set(env
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919838
--- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
@@ -167,7 +167,7 @@ private[spark] class Executor(
SparkEnv.set(env
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919852
--- Diff: core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala ---
@@ -188,7 +188,7 @@ class HadoopRDD[K, V](
val iter = new NextIterator[(K, V
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919867
--- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala
---
@@ -859,7 +859,7 @@ class DAGScheduler(
return
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919860
--- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala
---
@@ -535,12 +535,12 @@ class DAGScheduler(
* Cancel a job that is running
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919859
--- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala
---
@@ -535,12 +535,12 @@ class DAGScheduler(
* Cancel a job that is running
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919873
--- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala
---
@@ -1086,7 +1086,7 @@ class DAGScheduler(
handleJobCancellation
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919874
--- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala
---
@@ -1117,7 +1117,7 @@ class DAGScheduler(
failJobAndIndependentStages
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919880
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala ---
@@ -431,7 +431,7 @@ private[spark] class TaskSetManager(
// a
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919888
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala ---
@@ -503,7 +503,7 @@ private[spark] class TaskSetManager(
isZombie
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919916
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -908,7 +908,7 @@ private[spark] class BlockManager(
* Remove all blocks
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919915
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -898,7 +898,7 @@ private[spark] class BlockManager(
*/
def
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919909
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -518,12 +518,12 @@ private[spark] class BlockManager(
def get(blockId
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919913
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -856,7 +856,7 @@ private[spark] class BlockManager(
// Drop
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919907
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -518,12 +518,12 @@ private[spark] class BlockManager(
def get(blockId
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919922
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMaster.scala ---
@@ -38,14 +38,14 @@ class BlockManagerMaster(var driverActor: ActorRef
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919918
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -920,7 +920,7 @@ private[spark] class BlockManager(
* Remove a block
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919910
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -837,7 +837,7 @@ private[spark] class BlockManager(
blockId: BlockId
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919930
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMaster.scala ---
@@ -38,14 +38,14 @@ class BlockManagerMaster(var driverActor: ActorRef
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919933
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMaster.scala ---
@@ -57,7 +57,7 @@ class BlockManagerMaster(var driverActor: ActorRef, conf
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919929
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMaster.scala ---
@@ -38,14 +38,14 @@ class BlockManagerMaster(var driverActor: ActorRef
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1776#discussion_r15919942
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMasterActor.scala ---
@@ -69,7 +69,7 @@ class BlockManagerMasterActor(val isLocal: Boolean