Repository: spark
Updated Branches:
  refs/heads/master 0aea22895 -> 809c785bc


[SPARK-2652] [PySpark] do not use KryoSerializer as the default serializer

KryoSerializer cannot serialize custom classes unless they are registered 
explicitly, so using it as the default serializer in PySpark would introduce 
regressions in MLlib.
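
A minimal sketch (not part of this patch) of how a PySpark application can
still opt into Kryo explicitly after this change; the registrator class name
below is a hypothetical placeholder for whatever JVM-side registrator the
application provides:

    from pyspark import SparkConf, SparkContext

    # Opt into Kryo explicitly and register custom JVM classes through a
    # registrator; without such registration Kryo cannot serialize them,
    # which is why Kryo is no longer the PySpark default.
    conf = (SparkConf()
            .setAppName("kryo-opt-in")
            .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
            .set("spark.kryo.registrator", "com.example.MyKryoRegistrator"))
    sc = SparkContext(conf=conf)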

cc mengxr

Author: Davies Liu <[email protected]>

Closes #2916 from davies/revert and squashes the following commits:

43eb6d3 [Davies Liu] do not use KryoSerializer as default serializer


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/809c785b
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/809c785b
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/809c785b

Branch: refs/heads/master
Commit: 809c785bcc33e684a68ea14240a466def864199a
Parents: 0aea228
Author: Davies Liu <[email protected]>
Authored: Thu Oct 23 23:58:00 2014 -0700
Committer: Xiangrui Meng <[email protected]>
Committed: Thu Oct 23 23:58:00 2014 -0700

----------------------------------------------------------------------
 python/pyspark/context.py | 1 -
 1 file changed, 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/809c785b/python/pyspark/context.py
----------------------------------------------------------------------
diff --git a/python/pyspark/context.py b/python/pyspark/context.py
index 8d27ccb..5f8dced 100644
--- a/python/pyspark/context.py
+++ b/python/pyspark/context.py
@@ -43,7 +43,6 @@ __all__ = ['SparkContext']
 # These are special default configs for PySpark, they will overwrite
 # the default ones for Spark if they are not configured by user.
 DEFAULT_CONFIGS = {
-    "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
     "spark.serializer.objectStreamReset": 100,
     "spark.rdd.compress": True,
 }
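
For reference, after this commit the PySpark-specific overrides in
python/pyspark/context.py reduce to the two entries below, so Spark falls
back to its regular default serializer (JavaSerializer) unless the user sets
spark.serializer explicitly:

    DEFAULT_CONFIGS = {
        "spark.serializer.objectStreamReset": 100,
        "spark.rdd.compress": True,
    }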

