Repository: spark

Updated Branches:
  refs/heads/master ad0fde10b -> 7894de276
Revert "[SPARK-4183] Enable NettyBlockTransferService by default" This reverts commit 59e626c701227634336110e1bc23afd94c535ede. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7894de27 Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/7894de27 Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/7894de27 Branch: refs/heads/master Commit: 7894de276b8d0b0e4efc654d0b254fc2a6f6077c Parents: ad0fde1 Author: Patrick Wendell <[email protected]> Authored: Sat Nov 1 15:18:58 2014 -0700 Committer: Patrick Wendell <[email protected]> Committed: Sat Nov 1 15:18:58 2014 -0700 ---------------------------------------------------------------------- core/src/main/scala/org/apache/spark/SparkEnv.scala | 2 +- docs/configuration.md | 10 ---------- 2 files changed, 1 insertion(+), 11 deletions(-) ---------------------------------------------------------------------- http://git-wip-us.apache.org/repos/asf/spark/blob/7894de27/core/src/main/scala/org/apache/spark/SparkEnv.scala ---------------------------------------------------------------------- diff --git a/core/src/main/scala/org/apache/spark/SparkEnv.scala b/core/src/main/scala/org/apache/spark/SparkEnv.scala index e2f13ac..7fb2b91 100644 --- a/core/src/main/scala/org/apache/spark/SparkEnv.scala +++ b/core/src/main/scala/org/apache/spark/SparkEnv.scala @@ -274,7 +274,7 @@ object SparkEnv extends Logging { val shuffleMemoryManager = new ShuffleMemoryManager(conf) val blockTransferService = - conf.get("spark.shuffle.blockTransferService", "netty").toLowerCase match { + conf.get("spark.shuffle.blockTransferService", "nio").toLowerCase match { case "netty" => new NettyBlockTransferService(conf) case "nio" => http://git-wip-us.apache.org/repos/asf/spark/blob/7894de27/docs/configuration.md ---------------------------------------------------------------------- diff --git a/docs/configuration.md b/docs/configuration.md index 78c4bf3..3007706 100644 --- a/docs/configuration.md +++ b/docs/configuration.md @@ -359,16 +359,6 @@ Apart from these, the following properties are also available, and may be useful map-side aggregation and there are at most this many reduce partitions. </td> </tr> -<tr> - <td><code>spark.shuffle.blockTransferService</code></td> - <td>netty</td> - <td> - Implementation to use for transferring shuffle and cached blocks between executors. There - are two implementations available: <code>netty</code> and <code>nio</code>. Netty-based - block transfer is intended to be simpler but equally efficient and is the default option - starting in 1.2. - </td> -</tr> </table> #### Spark UI --------------------------------------------------------------------- To unsubscribe, e-mail: [email protected] For additional commands, e-mail: [email protected]
