cloud-fan commented on a change in pull request #29242:
URL: https://github.com/apache/spark/pull/29242#discussion_r484193839



##########
File path: python/pyspark/sql/dataframe.py
##########
@@ -678,13 +678,14 @@ def cache(self):
         return self
 
     @since(1.3)
-    def persist(self, storageLevel=StorageLevel.MEMORY_AND_DISK):
+    def persist(self, storageLevel=StorageLevel.MEMORY_AND_DISK_DESER):

Review comment:
       Now I see the confusion. In Scala, `MEMORY_AND_DISK` means 
`deserialized=true`, while in Python, `MEMORY_AND_DISK` means 
`deserialized=false`.
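
The flag mismatch described above can be sketched as follows. This is a standalone illustration, not the actual `pyspark` or Scala source: `StorageLevel` is modeled here as a simple named tuple of the four boolean flags, with values matching the two APIs' constants.

```python
from collections import namedtuple

# Illustrative model of a storage level: (useDisk, useMemory, useOffHeap, deserialized).
StorageLevel = namedtuple(
    "StorageLevel", ["use_disk", "use_memory", "use_off_heap", "deserialized"]
)

# Scala's StorageLevel.MEMORY_AND_DISK stores blocks deserialized ...
SCALA_MEMORY_AND_DISK = StorageLevel(True, True, False, True)

# ... while Python's StorageLevel.MEMORY_AND_DISK stores them serialized.
PYTHON_MEMORY_AND_DISK = StorageLevel(True, True, False, False)

# The new MEMORY_AND_DISK_DESER constant makes the deserialized variant
# explicit on the Python side, matching Scala's default behavior.
PYTHON_MEMORY_AND_DISK_DESER = StorageLevel(True, True, False, True)

print(SCALA_MEMORY_AND_DISK.deserialized)   # True
print(PYTHON_MEMORY_AND_DISK.deserialized)  # False
print(PYTHON_MEMORY_AND_DISK_DESER == SCALA_MEMORY_AND_DISK)  # True
```

So the same constant name picks a different `deserialized` flag in the two languages, which is why the PR introduces the `_DESER` suffix rather than silently changing what `MEMORY_AND_DISK` means in Python.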

##########
File path: python/pyspark/sql/dataframe.py
##########
@@ -678,13 +678,14 @@ def cache(self):
         return self
 
     @since(1.3)
-    def persist(self, storageLevel=StorageLevel.MEMORY_AND_DISK):
+    def persist(self, storageLevel=StorageLevel.MEMORY_AND_DISK_DESER):
         """Sets the storage level to persist the contents of the 
:class:`DataFrame` across
         operations after the first time it is computed. This can only be used 
to assign
         a new storage level if the :class:`DataFrame` does not have a storage 
level set yet.
-        If no storage level is specified defaults to (`MEMORY_AND_DISK`).
+        If no storage level is specified defaults to (`MEMORY_AND_DISK_DESER`)
 
-        .. note:: The default storage level has changed to `MEMORY_AND_DISK` 
to match Scala in 2.0.
+        .. note:: The default storage level has changed to 
`MEMORY_AND_DISK_DESER` to match Scala
+            in 2.0.

Review comment:
       `in 3.0`?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


