GitHub user holdenk opened a pull request:

    https://github.com/apache/spark/pull/2280

    SPARK-3406: Add a default storage level to the Python RDD persist API

    

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/holdenk/spark SPARK-3406-Python-RDD-persist-api-does-not-have-default-storage-level

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/2280.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2280
    
----
commit e95a6c53ffed3c85a3a6cbc509d3643f9c6afa05
Author: Holden Karau <[email protected]>
Date:   2014-09-05T00:28:13Z

    The Python persist function did not have a default storageLevel, unlike the
    Scala API. Noticed this issue because of a bug report against the book, where
    we had documented it as if it behaved the same as the Scala API.

commit e6582271d84d43a4744a276d0fb651716509487c
Author: Holden Karau <[email protected]>
Date:   2014-09-05T00:29:30Z

    Fix the test I added

----
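The gist of the change is adding a default value to `persist`'s `storageLevel` parameter so it can be called with no arguments, like the Scala `rdd.persist()`. A minimal sketch of the idea, using stand-in `StorageLevel` and `RDD` classes rather than the actual PySpark source (the exact default level chosen in the PR may differ):

```python
class StorageLevel:
    """Stand-in for pyspark.StorageLevel; only names are mimicked here."""
    MEMORY_ONLY = "MEMORY_ONLY"
    MEMORY_AND_DISK = "MEMORY_AND_DISK"


class RDD:
    """Illustrative stub, not the real pyspark.rdd.RDD."""

    def __init__(self):
        self.storage_level = None

    def persist(self, storageLevel=StorageLevel.MEMORY_ONLY):
        # Before the fix, storageLevel had no default, so calling
        # rdd.persist() with no arguments raised a TypeError even
        # though the equivalent Scala call worked.
        self.storage_level = storageLevel
        return self


rdd = RDD()
rdd.persist()                              # now valid with no argument
print(rdd.storage_level)                   # MEMORY_ONLY
rdd.persist(StorageLevel.MEMORY_AND_DISK)  # explicit level still works
print(rdd.storage_level)                   # MEMORY_AND_DISK
```

Returning `self` from `persist` matches the chaining style of the real API, e.g. `sc.parallelize(data).persist().count()`.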


